Bits
Definition: A bit (binary digit) is the most basic unit of information in computing and digital communications. It can represent only two values: 0 or 1 (off or on).
History/origin: The term "bit" was coined by mathematician John W. Tukey in 1947 and popularized by Claude Shannon, the "father of information theory," in his landmark 1948 paper.
Current use: Bits are primarily used to measure data transfer rates (like internet speeds in Mbps), whereas bytes are used to measure file sizes.
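Because transfer rates are quoted in bits but files are measured in bytes, converting between the two is a common source of confusion. Here is a minimal Python sketch of that conversion (the function name and the 100 Mbps figure are illustrative, not from any standard library):

```python
# Network speeds are quoted in bits per second; file sizes in bytes.
# Dividing a rate in megabits/s by 8 gives megabytes/s.
def mbps_to_mbytes_per_s(mbps: float) -> float:
    return mbps / 8  # 8 bits per byte

print(mbps_to_mbytes_per_s(100))  # a 100 Mbps link peaks at 12.5 MB/s
```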
Terabytes
Definition: A terabyte (symbol: TB) is a massive unit of digital data equal to 1,024 gigabytes, or roughly 1.1 trillion bytes. (This page uses the binary convention, 1 TB = 2^40 bytes; in strict SI usage 1 TB = 10^12 bytes, and the 2^40-byte unit is called a tebibyte, TiB.)
History/origin: The first 1 TB hard drive for consumers was released by Hitachi in 2007. Before that, terabytes were exclusively discussed in the context of enterprise servers and supercomputers.
Current use: Terabytes are now commonly used to measure the capacity of external hard drives, modern gaming consoles, network-attached storage (NAS), and large enterprise databases.
Bits to Terabytes Conversion Table
| Bits [bit] | Terabytes [TB] |
|---|---|
| 0.01 bit | 1.1368683772162E-15 TB |
| 0.1 bit | 1.1368683772162E-14 TB |
| 1 bit | 1.1368683772162E-13 TB |
| 2 bit | 2.2737367544323E-13 TB |
| 3 bit | 3.4106051316485E-13 TB |
| 5 bit | 5.6843418860808E-13 TB |
| 10 bit | 1.1368683772162E-12 TB |
| 20 bit | 2.2737367544323E-12 TB |
| 50 bit | 5.6843418860808E-12 TB |
| 100 bit | 1.1368683772162E-11 TB |
| 1000 bit | 1.1368683772162E-10 TB |
How to Convert Bits to Terabytes
1 bit = 1.1368683772162E-13 TB
1 TB = 8796093022208 bit
Example: convert 15 bit to TB:
15 bit = 15 × 1.1368683772162E-13 TB = 1.7053025658242E-12 TB
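For anyone scripting this conversion, here is a minimal Python sketch using the binary definition above (1 TB = 2^43 bits); the function names are illustrative, not from any standard library:

```python
BITS_PER_TERABYTE = 8 * 1024**4  # 2**43 = 8,796,093,022,208 bits per binary TB

def bits_to_terabytes(bits: float) -> float:
    """Convert a bit count to terabytes (binary convention, 1 TB = 2**40 bytes)."""
    return bits / BITS_PER_TERABYTE

def terabytes_to_bits(tb: float) -> float:
    """Convert terabytes (binary convention) back to bits."""
    return tb * BITS_PER_TERABYTE

print(bits_to_terabytes(15))  # 1.7053025658242404e-12, matching the worked example
```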
Did You Know?
- The word "bit" is a blend of "binary digit." It is the smallest unit of data in a computer and has a single binary value, either 0 or 1.
- A Terabyte (TB) is massive! One TB can hold around 250,000 photos taken with a modern smartphone, or about 500 hours of HD video.
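Those two estimates can be sanity-checked with back-of-the-envelope arithmetic; the ~4 MB per photo and ~2 GB per hour of HD video figures below are ballpark assumptions, not fixed standards:

```python
MB, GB, TB = 1024**2, 1024**3, 1024**4  # binary units, in bytes

photos = TB / (4 * MB)       # assuming ~4 MB per smartphone photo
video_hours = TB / (2 * GB)  # assuming ~2 GB per hour of HD video
print(round(photos), round(video_hours))  # 262144 photos, 512 hours
```

Both results land close to the quoted figures.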