Bits
Definition: A bit (binary digit) is the most basic unit of information in computing and digital communications. It can represent only two values: 0 or 1 (off or on).
History/origin: The term "bit" was first used by mathematician John W. Tukey in 1947, and later popularized by Claude Shannon, the "father of information theory," in 1948.
Current use: Bits are primarily used to measure data transfer rates (like internet speeds in Mbps), whereas bytes are used to measure file sizes.
Gigabytes
Definition: A gigabyte (symbol: GB) is a unit of digital data equal to 1,024 megabytes (1,073,741,824 bytes) under the binary convention this converter uses, or roughly one billion bytes; the SI decimal definition sets it at exactly 1,000,000,000 bytes (see the quick check below).
History/origin: Gigabytes entered consumer consciousness in the late 1990s as personal computer hard drives grew larger. The first 1 GB hard drive was introduced by IBM in 1980 and weighed 550 pounds!
Current use: GB is the most common unit used today to measure smartphone storage capacity, computer RAM, and high-definition movie file sizes.
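As a quick illustration of why the binary gigabyte is "roughly one billion bytes," here is a minimal Python sketch comparing the binary convention used by this converter with the SI decimal definition (the variable names are purely illustrative):

```python
# Binary convention (used by this converter): 1 GB = 1,024 MB = 2**30 bytes
binary_gigabyte_bytes = 1024 ** 3       # 1,073,741,824 bytes

# SI decimal definition: 1 GB = 10**9 bytes
decimal_gigabyte_bytes = 10 ** 9        # 1,000,000,000 bytes

print(binary_gigabyte_bytes)            # 1073741824
print(decimal_gigabyte_bytes)           # 1000000000
print(binary_gigabyte_bytes / decimal_gigabyte_bytes)  # ~1.074, hence "roughly one billion"
```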
Bits to Gigabytes Conversion Table
| Bits [bit] | Gigabytes [GB] |
|---|---|
| 0.01 bit | 1.1642E-12 GB |
| 0.1 bit | 1.1642E-11 GB |
| 1 bit | 1.1642E-10 GB |
| 2 bit | 2.3283E-10 GB |
| 3 bit | 3.4925E-10 GB |
| 5 bit | 5.8208E-10 GB |
| 10 bit | 1.1642E-9 GB |
| 20 bit | 2.3283E-9 GB |
| 50 bit | 5.8208E-9 GB |
| 100 bit | 1.1642E-8 GB |
| 1000 bit | 1.1642E-7 GB |
How to Convert Bits to Gigabytes
1 bit = 1.1641532183E-10 GB
1 GB = 8589934592 bit
Example: convert 15 bit to GB:
15 bit = 15 × 1.1641532183E-10 GB = 1.7462298274E-9 GB
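For readers who want to automate the conversion, here is a minimal Python sketch of the formula above, using the binary convention (1 GB = 8,589,934,592 bits); the function names are illustrative, not from any particular library:

```python
BITS_PER_GIGABYTE = 8 * 1024 ** 3  # 8,589,934,592 bits per gigabyte (binary convention)

def bits_to_gigabytes(bits: float) -> float:
    """Convert a bit count to gigabytes (binary convention)."""
    return bits / BITS_PER_GIGABYTE

def gigabytes_to_bits(gigabytes: float) -> float:
    """Convert gigabytes (binary convention) back to bits."""
    return gigabytes * BITS_PER_GIGABYTE

# Worked example from above: 15 bits
print(f"{bits_to_gigabytes(15):.10E} GB")  # 1.7462298274E-09 GB
print(gigabytes_to_bits(1))                # 8589934592.0 bits
```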
Did You Know?
- The word "bit" is a blend of "binary digit." It is the smallest unit of data in a computer and has a single binary value, either 0 or 1.
- One gigabyte (GB) can hold approximately 250 downloaded songs (assuming a typical 4 MB MP3), or about 2-3 hours of standard-definition video.