Milliamps to Amps
1 Milliampere (mA) = 0.001 Ampere (A)
How Many Amps in a Milliamp?
To convert milliamps to amps, divide the number of milliamps by 1,000. The formula is A = mA ÷ 1,000. For example, 500 mA equals 0.5 amps.

This conversion is essential in electronics, electrical engineering, and everyday device troubleshooting. Milliamps (mA) are the standard unit for measuring small currents in electronic circuits: sensor readings, LED drive currents, microcontroller pin outputs, and battery drain specifications are all expressed in milliamps. However, household circuit ratings, fuse sizes, and wire gauge tables use amps (A). Whether you are designing a circuit board, checking if a USB port can supply enough current for a device, or determining the correct fuse for an automotive accessory, converting between milliamps and amps ensures you are comparing current values in compatible units.
How to Convert Milliampere to Ampere
- Start with the current value in milliamps (mA).
- Divide by 1,000 to get amps (A).
- The result is the current expressed in amps.
- Simply move the decimal point three places to the left.
- For example: 2,500 mA → 2.500 A → 2.5 A.
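The steps above can be sketched as a one-line helper (an illustrative example, not part of any particular library):

```python
def milliamps_to_amps(milliamps: float) -> float:
    """Convert a current from milliamps (mA) to amps (A) by dividing by 1,000."""
    return milliamps / 1000.0

print(milliamps_to_amps(500))    # 0.5
print(milliamps_to_amps(2500))   # 2.5
```

Dividing by 1,000 is exactly the "move the decimal point three places to the left" rule expressed in code.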
Real-World Examples
- A typical indicator LED draws about 20 mA (0.02 A).
- A standard USB 2.0 port supplies up to 500 mA (0.5 A).
- A smartphone fast charger can deliver 2,000-3,000 mA (2-3 A).
- A common automotive accessory fuse is rated 10 A (10,000 mA).
Quick Reference
| Milliampere (mA) | Ampere (A) |
|---|---|
| 1 | 0.001 |
| 2 | 0.002 |
| 5 | 0.005 |
| 10 | 0.01 |
| 25 | 0.025 |
| 50 | 0.05 |
| 100 | 0.1 |
| 500 | 0.5 |
| 1,000 | 1 |
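The quick-reference table above can be reproduced with a short loop; this is just a sketch for checking values, using the same divide-by-1,000 rule:

```python
def milliamps_to_amps(milliamps: float) -> float:
    """Convert milliamps (mA) to amps (A)."""
    return milliamps / 1000.0

# Print the same rows as the quick-reference table.
for ma in (1, 2, 5, 10, 25, 50, 100, 500, 1000):
    print(f"{ma:>5} mA = {milliamps_to_amps(ma):g} A")
```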
History of Milliampere and Ampere
The ampere, named after French physicist André-Marie Ampère (1775-1836), is one of the seven SI base units. Ampère is considered the father of electrodynamics for his mathematical framework describing the relationship between electric current and magnetic fields. The unit was standardized internationally in 1948 and became a base unit of the SI when the system was formally established in 1960.

The milliampere (one-thousandth of an ampere) became a practical necessity as electronics miniaturized throughout the 20th century. Early electrical applications such as lighting and motors dealt with currents of amps or even hundreds of amps, but the invention of the vacuum tube, then the transistor (1947), and finally the integrated circuit (1958) drove circuit currents into the milliamp range. Today's microprocessors operate internal circuits at milliamp levels, and entire systems-on-a-chip can run on just a few milliamps in low-power modes.

The SI redefinition of 2019 changed the ampere's formal definition from one based on the force between current-carrying wires to one based on the elementary charge (the magnitude of the charge of a single electron, 1.602176634 × 10⁻¹⁹ coulombs). This made the definition more precise without changing the practical size of the unit.
Common Mistakes to Avoid
- Multiplying instead of dividing. To go from milliamps to amps, divide by 1,000, so the numeric value gets smaller. A common error is multiplying by 1,000 instead, which gives a result 1,000,000 times too large.
- Confusing milliamps (mA) with microamps (µA). A milliamp is 1,000 times larger than a microamp. Mixing them up leads to current calculations that are off by three orders of magnitude, which can damage sensitive components.
- Sloppy capitalization of the symbol. The correct SI symbol is "mA" (lowercase m for milli, uppercase A for ampere). Writing "ma" is technically invalid, and "MA" means megaamperes, a unit one billion times larger than the milliampere.
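One way to sidestep prefix mix-ups is to normalize every reading to amps at the boundary of your code and reject unknown unit symbols. A hypothetical sketch (the table and function names are illustrative, not from any standard library):

```python
# Scale factors for common SI current prefixes, in amps per unit.
SCALE_TO_AMPS = {
    "A": 1.0,      # amperes
    "mA": 1e-3,    # milliamperes
    "uA": 1e-6,    # microamperes (µA)
}

def to_amps(value: float, unit: str) -> float:
    """Convert a current in the given unit to amps; reject unknown symbols."""
    try:
        return value * SCALE_TO_AMPS[unit]
    except KeyError:
        raise ValueError(f"unknown current unit: {unit!r}")

print(to_amps(500, "mA"))  # 0.5
print(to_amps(500, "uA"))  # 0.0005
```

Because the lookup is case-sensitive, a mistyped symbol like `"ma"` or `"MA"` raises an error instead of silently producing a value off by orders of magnitude.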
Frequently Asked Questions
How many amps is 1 milliamp?
1 milliamp equals 0.001 amps. Divide the milliamp value by 1,000: 1 ÷ 1,000 = 0.001 A.

How many milliamps can a standard USB port supply?
A standard USB 2.0 port supplies up to 500 mA (0.5 A), and a USB 3.0 port supplies up to 900 mA (0.9 A). Dedicated charging ports and USB-C can deliver considerably more.

What is a dangerous level of current in milliamps?
Current through the human body as low as about 10 mA can cause painful, sustained muscle contractions, and roughly 100-200 mA across the chest can trigger ventricular fibrillation. This is why ground-fault circuit interrupters in North America trip at about 5 mA.

Why do electronics specs use milliamps instead of amps?
Most electronic circuits draw well under 1 amp, so milliamps keep the numbers readable: writing 20 mA is clearer than 0.02 A.
When estimating battery life for a portable device, divide the battery capacity (in mAh) by the device's current draw (in mA) to get approximate hours of operation. For example, a device drawing 50 mA from a 2,000 mAh battery will last about 2,000 ÷ 50 = 40 hours. In practice, battery life is usually 70-80% of this theoretical figure due to efficiency losses and voltage regulation.
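The battery-life estimate above, including the real-world derating, can be sketched as follows (the 75% efficiency default is an assumed midpoint of the 70-80% range mentioned above):

```python
def battery_life_hours(capacity_mah: float, draw_ma: float,
                       efficiency: float = 0.75) -> float:
    """Estimate runtime in hours from battery capacity (mAh) and current draw (mA).

    efficiency derates the theoretical capacity/draw figure to account
    for regulator and conversion losses (assumed ~75% here).
    """
    return capacity_mah / draw_ma * efficiency

print(battery_life_hours(2000, 50, efficiency=1.0))  # 40.0 (theoretical)
print(battery_life_hours(2000, 50))                  # 30.0 (with derating)
```

Note that both inputs must be in the same current unit (mAh and mA); mixing mAh with a draw in amps would skew the result by a factor of 1,000.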