
Re: Debian virus/spy-ware detection and detection technique.

On 7/26/10 4:38 PM, Boyd Stephen Smith Jr. wrote:
Not according to the relevant standards.

1Mb  = 1 000 000 bits
1MB  = 1 000 000 bytes
1Mib = 2 ^ 20 bits
1MiB = 2 ^ 20 bytes


You need to explain to people what these values actually mean. Technically both of you are right, but apparently neither of you understands how or where you are right (not saying you do or don't; I say "apparently" because it seems this way). For memory calculations, yes, the following table applies, i.e. the base-two meaning:

1 megabyte (MB)  = 8,388,608 bits
1 megabit (Mb) = 1,048,576 bits
1 mebibyte (MiB) = 8,388,608 bits
1 mebibit (Mib) = 1,048,576 bits

1 megabyte (MB) = 1,048,576 bytes
1 megabit (Mb) = 131,072 bytes
1 mebibyte (MiB) = 1,048,576 bytes
1 mebibit (Mib) = 131,072 bytes
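The tables above are just arithmetic on the two competing prefix meanings. A minimal sketch (the constant names are mine, for illustration):

```python
# Two meanings of the "M" prefix:
MEGA = 10**6   # SI "mega": exactly one million
MEBI = 2**20   # IEC "mebi": 1,048,576

BITS_PER_BYTE = 8

# Base-two ("memory") meaning, matching the tables above:
print(MEBI)                    # 1 MiB in bytes -> 1048576
print(MEBI * BITS_PER_BYTE)    # 1 MiB in bits  -> 8388608
print(MEBI // BITS_PER_BYTE)   # 1 Mib in bytes -> 131072

# Strict standards (base-ten) meaning:
print(MEGA)                    # 1 MB in bytes  -> 1000000
```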

The reason some major companies (the ones that don't like to play the politically-correct/politically-incorrect, you're-an-idiot, learn-to-computer-foo game) switched to measuring storage in MiB is that people assume the base-two meaning is the correct measurement for storage, when according to the standards mega means a million (10^6). As a matter of fact, most computer scientists still readily, if unreliably, use the base-two meaning (2^20) for measurement :/

The IEC added some terms (KiB, MiB, GiB, TiB, PiB, EiB) to ease the confusion (which nobody outside of the computer industry noticed); these use base-2 calculations, not base-10. So when you think hard drive, think 10^6; when you think computer memory, think 2^20. And when you think blame, blame Americans: this is one thing you truly can blame on us.
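This mismatch is exactly why a new drive never seems to hold as much as the box says. A quick worked example (the figure of 500 GB is mine, just for illustration):

```python
# A drive marketed as "500 GB" uses the decimal meaning (10^9 bytes),
# while most operating systems report size in binary GiB (2^30 bytes).
marketed_bytes = 500 * 10**9
reported_gib = marketed_bytes / 2**30

print(f"{reported_gib:.1f} GiB")   # -> 465.7 GiB, not 500
```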
