Yellow Fang
Legendary Member
Location: Reading
I read in a CompTIA training book that the manufacturers of DVDs decided to redefine the gigabyte decimally, i.e. as 1,000,000,000 bytes instead of the 2^30 (1,073,741,824) bytes used by the rest of the IT industry. They also tried to rename the 2^30 figure a "gibibyte". Where do they get off? A byte is a computing term, after all (although it would make more sense to me to define memory in multiples of bits instead of bytes). Next they'll be saying a byte is ten bits and insisting everyone else calls their eight-bit bytes "octets" or something.
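
For anyone curious how much the two definitions actually differ, here's a quick sketch in Python (the variable names and the 4.7 GB example are mine, just for illustration):

```python
# The two competing definitions of a "gigabyte".
decimal_gb = 10**9   # 1,000,000,000 bytes (DVD manufacturers, SI prefixes)
binary_gb = 2**30    # 1,073,741,824 bytes (traditional IT usage, IEC "gibibyte")

difference = binary_gb - decimal_gb
print(f"Decimal GB: {decimal_gb:,} bytes")
print(f"Binary GB:  {binary_gb:,} bytes")
print(f"Difference: {difference:,} bytes ({difference / binary_gb:.1%} of a binary GB)")

# A DVD advertised as "4.7 GB" (decimal) holds this many binary GB:
print(f"4.7 decimal GB = {4.7 * decimal_gb / binary_gb:.2f} binary GB")
```

So a "gigabyte" loses about 6.9% in the translation, which is why a 4.7 GB DVD shows up as roughly 4.38 GB in an operating system that counts in powers of two.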