
Bits

Definition: A bit (binary digit) is the most basic unit of information in computing and digital communications. It can represent only two values: 0 or 1 (off or on).

History/origin: The term "bit" was first used by mathematician John W. Tukey in 1947, and later popularized by Claude Shannon, the "father of information theory," in 1948.

Current use: Bits are primarily used to measure data transfer rates (like internet speeds in Mbps), whereas bytes are used to measure file sizes.

Megabytes

Definition: A megabyte (symbol: MB) is a unit of digital information equal to 1,024 kilobytes, or 1,048,576 bytes. This is the binary convention (1 MB = 2^20 bytes), which the IEC formally calls a mebibyte (MiB); in the decimal SI convention, 1 MB = 1,000,000 bytes. The conversions on this page use the binary convention.

History/origin: The term megabyte became mainstream in the 1980s and 1990s. The classic 3.5-inch floppy disk, which revolutionized portable storage, could hold 1.44 MB of data.

Current use: Megabytes are the standard unit for measuring the size of MP3 audio files, high-resolution smartphone photos, and mobile app downloads.

Bits to Megabytes Conversion Table

Bits [bit]     Megabytes [MB]
0.01 bit       0 MB
0.1 bit        0.00000001 MB
1 bit          0.00000012 MB
2 bit          0.00000024 MB
3 bit          0.00000036 MB
5 bit          0.0000006 MB
10 bit         0.00000119 MB
20 bit         0.00000238 MB
50 bit         0.00000596 MB
100 bit        0.00001192 MB
1000 bit       0.00011921 MB

How to Convert Bits to Megabytes

1 bit ≈ 0.00000012 MB
1 MB = 8,388,608 bits

Example: convert 15 bits to MB:
15 bits = 15 ÷ 8,388,608 MB ≈ 0.00000179 MB
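The conversion above can be sketched in a few lines of Python. This is a minimal illustration, assuming the binary convention used on this page (1 MB = 1,048,576 bytes, and 8 bits per byte); the function and constant names are our own:

```python
# 1 MB (binary convention) = 1,048,576 bytes, and 1 byte = 8 bits
BITS_PER_MB = 8 * 1024 * 1024  # 8,388,608 bits

def bits_to_megabytes(bits: float) -> float:
    """Convert a bit count to megabytes (binary convention)."""
    return bits / BITS_PER_MB

def megabytes_to_bits(mb: float) -> float:
    """Convert megabytes (binary convention) to a bit count."""
    return mb * BITS_PER_MB

# Reproduce the worked example: 15 bits ≈ 0.00000179 MB
print(f"{bits_to_megabytes(15):.8f}")  # prints 0.00000179
```

Dividing by the exact factor (8,388,608) rather than multiplying by the rounded 0.00000012 avoids accumulating rounding error for larger inputs.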

Did You Know?

  • The word "bit" is a blend of "binary digit." It is the smallest unit of data in a computer and has a single binary value, either 0 or 1.