Amps to Milliamps
1 Ampere (A) = 1,000 Milliampere (mA)
How Many Milliamps in an Amp?
To convert amps to milliamps, multiply the number of amps by 1,000. The formula is mA = A × 1,000. For example, 0.5 amps equals 500 milliamps. This conversion is commonly needed when translating electrical specifications from large-scale systems (household wiring, industrial power) to the smaller-scale world of electronics. An electrician measuring current flow with a multimeter might read 0.35 A, but when specifying components for that circuit, the designer works in 350 mA. Battery specifications, charging circuits, sensor thresholds, and component datasheets all use milliamps as their default unit for current. Converting amps to milliamps bridges the gap between power engineering and electronics engineering, two fields that work at different scales but use the same fundamental physics.
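The formula mA = A × 1,000 is a one-line function in code. A minimal sketch in Python (the function name `amps_to_milliamps` is just an illustrative choice):

```python
def amps_to_milliamps(amps: float) -> float:
    """Convert a current in amperes to milliamperes: mA = A x 1,000."""
    return amps * 1000

print(amps_to_milliamps(0.5))   # 500.0
print(amps_to_milliamps(0.35))  # 350 mA, the multimeter example above
```

The same multiply-by-1,000 step works in any language; only the syntax changes.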
How to Convert Ampere to Milliampere
- Start with the current value in amps (A).
- Multiply by 1,000 to get milliamps (mA).
- The result is the current expressed in milliamps.
- Simply move the decimal point three places to the right.
- For example: 0.75 A = 0.750 A → 750 mA.
Quick Reference
| Ampere (A) | Milliampere (mA) |
|---|---|
| 1 | 1,000 |
| 2 | 2,000 |
| 5 | 5,000 |
| 10 | 10,000 |
| 25 | 25,000 |
| 50 | 50,000 |
| 100 | 100,000 |
| 500 | 500,000 |
| 1,000 | 1,000,000 |
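Because every row is the same multiply-by-1,000 step, the table above can be reproduced with a short loop. A sketch in Python:

```python
# Reproduce the quick-reference rows: each amp value times 1,000 gives mA.
amp_values = [1, 2, 5, 10, 25, 50, 100, 500, 1000]

for amps in amp_values:
    print(f"{amps:>5,} A = {amps * 1000:>9,} mA")
```

Integer inputs keep the results exact, so this loop is also a handy sanity check when building conversion tables of your own.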
History of Ampere and Milliampere
The need to express current in milliamps grew alongside the miniaturization of electronics. In the early days of electricity (late 1800s), currents were large — arc lamps drew 10 A, early motors pulled tens of amps, and household circuits were rated at 15-20 A. The milliamp was rarely needed. The vacuum tube era (1900s-1950s) changed this. Tube circuits operated with plate currents of 5-50 mA and grid currents in the microamp range. Engineers began routinely specifying milliamps in their designs. The transistor revolution amplified this trend — transistor circuits of the 1960s and 70s used milliamp-level currents, and the integrated circuit era pushed typical operating currents even lower. Today, the milliamp is arguably the most commonly used current unit in the electronics industry. Battery capacities are rated in mAh (milliamp-hours), LED forward currents are specified in mA, sensor outputs are measured in mA, and the ubiquitous 4-20 mA current loop remains the industrial standard for analog sensor communication — a protocol that has been in use since the 1950s and shows no signs of disappearing.
Common Mistakes to Avoid
- Dividing instead of multiplying. To go from amps to milliamps, multiply by 1,000 — the number gets larger. A common error is dividing, which gives a result one-millionth of the correct value.
- Confusing current (amps/mA) with charge capacity (amp-hours/mAh). Amps measure the rate of current flow at an instant, while amp-hours measure total charge capacity. A 2,000 mAh battery does not output 2,000 mA — it can output various current levels, and the mAh rating tells you how long it will last.
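The mAh-versus-mA distinction in the list above comes down to one division: capacity (mAh) divided by load current (mA) gives runtime in hours. A minimal sketch, assuming an idealized battery with constant draw (real batteries deliver less than their rated capacity at high loads):

```python
def runtime_hours(capacity_mah: float, load_ma: float) -> float:
    """Ideal runtime: hours a battery of given mAh capacity lasts at a constant mA draw."""
    return capacity_mah / load_ma

# A 2,000 mAh battery supplying a steady 100 mA load:
print(runtime_hours(2000, 100))  # 20.0 hours (ideal)
```

Note that the battery is not "outputting 2,000 mA" here; the mAh rating only tells you how long a given current can be sustained.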
Frequently Asked Questions
How many milliamps is 1 amp?
1 amp equals 1,000 milliamps (1 A = 1,000 mA). Multiply any amp value by 1,000 to get milliamps.
What is the 4-20 mA current loop used in industry?
It is the long-standing industrial standard for analog sensor communication, in use since the 1950s. A sensor's measurement range is mapped linearly onto 4-20 mA: 4 mA represents the low end of the range and 20 mA the high end. Using 4 mA rather than 0 mA as the baseline lets receivers distinguish a valid zero reading from a broken wire.
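The 4-20 mA loop maps a sensor's calibrated range linearly onto currents between 4 and 20 mA. A sketch of the standard scaling math (the function name and parameters are illustrative, not from any particular library):

```python
def loop_to_value(ma: float, range_lo: float, range_hi: float) -> float:
    """Map a 4-20 mA loop reading onto the sensor's calibrated range.

    4 mA corresponds to range_lo, 20 mA to range_hi, with a 16 mA span between.
    """
    return range_lo + (ma - 4.0) / 16.0 * (range_hi - range_lo)

# A pressure sensor calibrated 0-100 psi, reading 12 mA on the loop:
print(loop_to_value(12.0, 0.0, 100.0))  # 50.0 psi (midpoint of the range)
```

A reading below 4 mA signals a fault (such as a broken wire), which is one reason the standard has persisted so long.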
How do I measure milliamps with a multimeter?
Set the dial to a mA range, move the red probe to the mA input jack if your meter has a separate one, and connect the meter in series with the circuit so the current flows through it. Never connect an ammeter in parallel across a power source; that creates a near short circuit.
When working with electronics, always check component maximum current ratings in milliamps before connecting power. Common limits to remember: a standard Arduino digital pin outputs a maximum of 40 mA (20 mA recommended), a Raspberry Pi GPIO pin should not exceed 16 mA, and a typical LED needs 10-20 mA with a current-limiting resistor. Exceeding these limits can permanently damage components.
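As a worked example of the LED case above, the current through an LED with a series resistor follows Ohm's law: I = (supply voltage − LED forward voltage) / resistance, then × 1,000 for milliamps. A sketch with typical values (5 V supply and a 2 V red LED are common assumptions, not universal):

```python
def led_current_ma(supply_v: float, forward_v: float, resistor_ohms: float) -> float:
    """LED current in mA: I = (Vs - Vf) / R by Ohm's law, scaled to milliamps."""
    return (supply_v - forward_v) / resistor_ohms * 1000

# 5 V supply, 2 V red LED, 220-ohm current-limiting resistor:
current = led_current_ma(5.0, 2.0, 220.0)
print(f"{current:.1f} mA")  # about 13.6 mA, safely inside the 10-20 mA range
```

Running the same check before wiring a pin or LED is a quick way to confirm you stay under the maximum ratings listed above.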