Millimeter
Definition: A millimeter (symbol: mm) is a unit of length in the International System of Units (SI). It is defined as 1/1000 of a meter.
History/origin: The milli- prefix is one of many metric prefixes. It denotes one thousandth of the base unit, in this case the meter. The definition of the meter has changed over time; the current definition is based on the distance light travels in a vacuum during a fixed fraction of a second.
Current use: Millimeters are used to measure very small distances and lengths, often in engineering, manufacturing, and hardware specifications.
Meter
Definition: The meter (symbol: m) is the base unit of length in the International System of Units (SI). It is defined by the length of the path traveled by light in vacuum during a time interval of 1/299,792,458 of a second.
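This definition fixes the speed of light at exactly 299,792,458 m/s, so multiplying that speed by the defining time interval recovers exactly one meter. A quick Python check of this (illustrative only), using exact rational arithmetic to avoid floating-point rounding:

```python
from fractions import Fraction

c = 299_792_458                # speed of light in vacuum, m/s (exact by SI definition)
t = Fraction(1, 299_792_458)   # the defining time interval, in seconds

print(c * t)                   # 1 -- light travels exactly one meter in that interval
```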
History/origin: The meter was first defined in France in 1793 as one ten-millionth of the distance from the equator to the North Pole along the meridian through Paris.
Current use: The meter is the primary unit of distance in science, engineering, and daily life in nearly every country; the United States, which still favors customary units for everyday measurement, is the notable exception.
Millimeter to Meter Conversion Table
| Millimeter [mm] | Meter [m] |
|---|---|
| 0.01 mm | 0.00001 m |
| 0.1 mm | 0.0001 m |
| 1 mm | 0.001 m |
| 2 mm | 0.002 m |
| 3 mm | 0.003 m |
| 5 mm | 0.005 m |
| 10 mm | 0.01 m |
| 20 mm | 0.02 m |
| 50 mm | 0.05 m |
| 100 mm | 0.1 m |
| 1000 mm | 1 m |
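Every row above is just the millimeter value multiplied by 0.001. As a minimal sketch (not tied to any particular library), the following Python snippet regenerates the rows, using Decimal so the output stays in clean decimal form:

```python
from decimal import Decimal

# Each meter value is the millimeter value times 0.001;
# Decimal arithmetic avoids binary floating-point rounding noise in the printout.
millimeter_values = ["0.01", "0.1", "1", "2", "3", "5", "10", "20", "50", "100", "1000"]
for mm in millimeter_values:
    meters = (Decimal(mm) * Decimal("0.001")).normalize()
    print(f"{mm} mm = {meters} m")
```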
How to Convert Millimeter to Meter
1 mm = 0.001 m
1 m = 1000 mm
Example: convert 15 mm to m:
15 mm = 15 × 0.001 m = 0.015 m
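To script the conversion, a pair of one-line helpers is enough. This is a minimal sketch; the names mm_to_m and m_to_mm are illustrative, not from any standard library:

```python
def mm_to_m(millimeters: float) -> float:
    """Convert millimeters to meters: 1 mm = 0.001 m."""
    return millimeters * 0.001

def m_to_mm(meters: float) -> float:
    """Convert meters to millimeters: 1 m = 1000 mm."""
    return meters * 1000.0

print(mm_to_m(15))      # 0.015 -- the worked example above
print(m_to_mm(0.015))   # 15.0  -- round trip back to millimeters
```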
Did You Know?
- A millimeter is so small that a single sheet of standard paper is about 0.1 millimeters thick.
- Between 1889 and 1960, the meter was defined by a physical artifact: a platinum-iridium bar kept at the International Bureau of Weights and Measures near Paris.