What is a Megabyte?

One of the many confusing issues in the computer world is the fact that a megabyte (MB) is defined in two different ways.

In the metric system, the prefix kilo means 1,000, the prefix mega means 1,000,000, and the prefix giga means 1,000,000,000.

So it would follow that a megabyte is 1,000,000 bytes, which can also be expressed as 10 to the 6th power bytes.

All electronic signals in a computer system are in the form of 1's and 0's, normally represented by the presence or absence of +5 volts DC. So all programs, all data, and all calculations are in the form of binary numbers. Since computers use the binary number system, they work with bits (binary digits), bytes (8 bits), and powers of 2. This system makes it convenient to refer to 2 to the 10th power (1,024 bytes) as a kilobyte, 2 to the 20th power (1,048,576 bytes) as a megabyte, and 2 to the 30th power (1,073,741,824 bytes) as a gigabyte.
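The binary units above can be computed directly as powers of 2. Here is a minimal sketch (the constant names are illustrative, not standard identifiers):

```python
# Binary-based units, as defined in the text above.
KILOBYTE = 2 ** 10  # 1,024 bytes
MEGABYTE = 2 ** 20  # 1,048,576 bytes
GIGABYTE = 2 ** 30  # 1,073,741,824 bytes

print(KILOBYTE, MEGABYTE, GIGABYTE)
```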

So a megabyte can be 1,000,000 bytes or 1,048,576 bytes. Both definitions are in current use, and both are correctly called a megabyte. In the computer world, memory is always discussed in terms of binary megabytes. It has become common to measure hard drive capacity in terms of decimal megabytes as well as in binary megabytes.

The terms "decimal megabyte" and "binary megabyte" can be used to differentiate between the two, although these aren't really standard terms. Another term sometimes used for decimal megabytes is "millions of bytes." An even more obscure term is "miobytes."


To convert decimal MB to binary MB:

   decimal MB x 1,000,000
   ---------------------- = binary MB
          1,048,576

To convert binary MB to decimal MB:

binary MB x 1.048576 = decimal MB 
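The two formulas above can be sketched as small conversion functions (the function names are illustrative, not standard):

```python
# Conversions between the two megabyte definitions, following the
# formulas in the text: 1 decimal MB = 1,000,000 bytes,
# 1 binary MB = 1,048,576 bytes.

def decimal_to_binary_mb(decimal_mb):
    return decimal_mb * 1_000_000 / 1_048_576

def binary_to_decimal_mb(binary_mb):
    return binary_mb * 1.048576

# Example: a "540 MB" (decimal) hard drive holds about 515 binary MB.
print(round(decimal_to_binary_mb(540), 1))  # 515.0
```

This is why a hard drive advertised in decimal megabytes appears smaller when an operating system reports its size in binary megabytes.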

More information on this subject can be found in our file METRIC.TXT, WWW URL: http://www.firmware.com/support/bios/metric.htm