On Tue, May 22, 2007 at 01:02:15PM -0600, Stephen John Smoogen wrote:
> 20 years ago, Megabit was 2^20 bits (Mb) and Megabyte was 2^20 bytes (MB).
What? All network performance numbers were always given in Mbps (megabits per second, 10^6).
> The SI (ISO?) redid the units later to deal with the fact that Mega has a scientific definition of 10^6. This also allows the Hard-drive conspiracy to undersell you the number of bits on a disk. Nowadays, Mb is supposed to mean 10^6 bits, and a Mibit means 2^20 bits.
The hard-drive manufacturers didn't pioneer the *ibyte. I agree that for things such as RAM and HDDs, which are built from power-of-2 units (bytes or words, and sectors), a megabyte of 2^20 would be better. But I can't blame them for sticking with the official standards.
Also, as your link points out, FDD capacities mixed the two conventions: 720 KB means 720*1024 bytes, while 1.44 MB means 1.44*1000*1024 bytes.
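To make the mixed conventions concrete, here is a minimal Python sketch; the only inputs are the figures above, and the variable names are my own:

    # SI (decimal) and IEC (binary) prefixes
    MB  = 10**6    # megabyte
    KiB = 2**10    # kibibyte
    MiB = 2**20    # mebibyte

    dd_floppy = 720 * KiB          # "720 KB"  = 737,280 bytes (really 720 KiB)
    hd_floppy = 1.44 * 1000 * KiB  # "1.44 MB" = 1,474,560 bytes (decimal * binary!)

    print(dd_floppy)           # 737280
    print(int(hd_floppy))      # 1474560
    print(hd_floppy / MB)      # 1.47456  decimal megabytes
    print(hd_floppy / MiB)     # 1.40625  mebibytes

So the "1.44 MB" floppy is neither 1.44 decimal megabytes nor 1.44 mebibytes.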
> Thus you end up with a gigabit card which is 10^6 bits but the OS measures in 2^20 bits.
The standard for networking has always been 10^6 bits per second (or packets per second). Some protocols don't even align on 8-bit boundaries.
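As a rough back-of-the-envelope illustration of that mismatch (my own numbers, ignoring framing and protocol overhead): a gigabit link is 10^9 bits per second, so a tool that reports in binary megabytes per second shows roughly 119 MiB/s rather than 125 MB/s:

    line_rate_bps = 10**9              # 1 Gbit/s: link rates are always decimal
    bytes_per_s   = line_rate_bps / 8  # 125,000,000 bytes/s

    print(bytes_per_s / 10**6)   # 125.0   MB/s  (decimal)
    print(bytes_per_s / 2**20)   # ~119.21 MiB/s (what a binary-reporting tool shows)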