🦉 UnitOwl

Amps to Milliamps

1 Ampere (A) = 1,000 Milliampere (mA)

Result
1,000 mA
1 A = 1,000 mA

How Many Milliamps in an Amp?

To convert amps to milliamps, multiply the number of amps by 1,000. The formula is mA = A × 1,000. For example, 0.5 amps equals 500 milliamps. This conversion is commonly needed when translating electrical specifications from large-scale systems (household wiring, industrial power) to the smaller-scale world of electronics. An electrician measuring current flow with a multimeter might read 0.35 A, but when specifying components for that circuit, the designer works in 350 mA. Battery specifications, charging circuits, sensor thresholds, and component datasheets all use milliamps as their default unit for current. Converting amps to milliamps bridges the gap between power engineering and electronics engineering, two fields that work at different scales but use the same fundamental physics.

How to Convert Ampere to Milliampere

  1. Start with the current value in amps (A).
  2. Multiply by 1,000 to get milliamps (mA).
  3. The result is the current expressed in milliamps.
  4. Shortcut: equivalently, move the decimal point three places to the right.
  5. For example: 0.75 A → 750. mA → 750 mA.
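The steps above amount to a single multiplication, which can be sketched as a small helper function (a minimal illustration; the function name is our own):

```python
def amps_to_milliamps(amps: float) -> float:
    """Convert a current in amperes to milliamperes (mA = A * 1,000)."""
    return amps * 1000

print(amps_to_milliamps(0.75))  # 750.0
print(amps_to_milliamps(1))     # 1000
```

The inverse conversion (milliamps to amps) is simply division by 1,000.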

Real-World Examples

Multimeter reading — A circuit draws 0.35 A
0.35 × 1,000 = 350 mA. The component datasheet might specify a maximum of 400 mA, so this circuit is within limits.
Charger specification — A charger outputs 2.4 A
2.4 × 1,000 = 2,400 mA. A phone requiring 2,100 mA for fast charging will work fine with this charger.
Household circuit — A small appliance draws 0.05 A on standby
0.05 × 1,000 = 50 mA. This "phantom load" of 50 mA at 120 V is 6 W, or roughly 50 kWh (a few dollars) per year — small but not zero across many devices.
Battery drain — A fitness tracker uses 0.008 A
0.008 × 1,000 = 8 mA. With a 200 mAh battery, the tracker would last about 200 ÷ 8 = 25 hours of continuous use.
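The arithmetic in the examples above can be checked in a few lines of code (a quick sketch reusing the same figures from the examples):

```python
def amps_to_milliamps(amps):
    """mA = A * 1,000."""
    return amps * 1000

# Multimeter reading: 0.35 A against a 400 mA component limit
reading_ma = amps_to_milliamps(0.35)
print(reading_ma, reading_ma <= 400)  # 350.0 True -- within limits

# Battery drain: 8 mA draw from a 200 mAh battery
runtime_h = 200 / amps_to_milliamps(0.008)
print(runtime_h)  # 25.0 hours of continuous use
```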

Quick Reference

Ampere (A)    Milliampere (mA)
1             1,000
2             2,000
5             5,000
10            10,000
25            25,000
50            50,000
100           100,000
500           500,000
1,000         1,000,000

History of Ampere and Milliampere

The need to express current in milliamps grew alongside the miniaturization of electronics. In the early days of electricity (late 1800s), currents were large — arc lamps drew 10 A, early motors pulled tens of amps, and household circuits were rated at 15-20 A. The milliamp was rarely needed. The vacuum tube era (1900s-1950s) changed this. Tube circuits operated with plate currents of 5-50 mA and grid currents in the microamp range. Engineers began routinely specifying milliamps in their designs. The transistor revolution amplified this trend — transistor circuits of the 1960s and 70s used milliamp-level currents, and the integrated circuit era pushed typical operating currents even lower. Today, the milliamp is arguably the most commonly used current unit in the electronics industry. Battery capacities are rated in mAh (milliamp-hours), LED forward currents are specified in mA, sensor outputs are measured in mA, and the ubiquitous 4-20 mA current loop remains the industrial standard for analog sensor communication — a protocol that has been in use since the 1950s and shows no signs of disappearing.

Common Mistakes to Avoid

  • Dividing instead of multiplying. To go from amps to milliamps, multiply by 1,000 — the number gets larger. A common error is dividing, which gives a result one-millionth of the correct value.
  • Confusing current (amps/mA) with charge capacity (amp-hours/mAh). Amps measure the rate of current flow at an instant, while amp-hours measure total charge capacity. A 2,000 mAh battery does not output 2,000 mA — it can output various current levels, and the mAh rating tells you how long it will last.
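The current-versus-capacity distinction becomes concrete in a runtime estimate: capacity in mAh divided by draw in mA gives hours. This is an idealized sketch — real batteries deliver less than their rated capacity at high discharge rates:

```python
def runtime_hours(capacity_mah: float, draw_ma: float) -> float:
    """Idealized battery runtime: capacity (mAh) / current draw (mA)."""
    return capacity_mah / draw_ma

# A 2,000 mAh battery does not "output 2,000 mA"; it lasts longer
# at lower draws and shorter at higher ones:
print(runtime_hours(2000, 100))  # 20.0 hours at 100 mA
print(runtime_hours(2000, 500))  # 4.0 hours at 500 mA
```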

Frequently Asked Questions

How many milliamps is 1 amp?
One amp equals exactly 1,000 milliamps. The "milli" prefix means one-thousandth, so there are 1,000 milliamps in every amp.
What is the 4-20 mA current loop used in industry?
The 4-20 mA current loop is an industrial standard for analog signal transmission. A sensor outputs a current between 4 mA (representing the minimum measurement) and 20 mA (representing the maximum). The 4 mA baseline — rather than 0 mA — allows the system to distinguish between "minimum reading" and "broken wire" (0 mA). This protocol is highly noise-resistant and works over long cable distances.
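Reading a 4-20 mA loop means mapping the current linearly onto the sensor's measurement range, with the 4 mA baseline as zero. A sketch of that mapping (the 3.8 mA fault threshold is an assumption — actual cutoffs vary by device):

```python
def loop_current_to_value(current_ma: float, lo: float, hi: float) -> float:
    """Map a 4-20 mA loop current onto the measurement range [lo, hi]."""
    if current_ma < 3.8:  # assumed fault threshold, e.g. a broken wire
        raise ValueError("loop fault: current below 4 mA baseline")
    return lo + (current_ma - 4) / 16 * (hi - lo)

# A pressure sensor spanning 0-100 psi reading 12 mA is at mid-scale:
print(loop_current_to_value(12, 0, 100))  # 50.0
```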
How do I measure milliamps with a multimeter?
Set your multimeter to the mA or A range (depending on expected current), connect it in series with the circuit (break the circuit and insert the meter), and read the display. Most multimeters have a separate mA jack with a lower-rated fuse. Never connect the meter in parallel — this creates a short circuit and can damage the meter or circuit.
Quick Tip

When working with electronics, always check component maximum current ratings in milliamps before connecting power. Common limits to remember: a standard Arduino digital pin outputs a maximum of 40 mA (20 mA recommended), a Raspberry Pi GPIO pin should not exceed 16 mA, and a typical LED needs 10-20 mA with a current-limiting resistor. Exceeding these limits can permanently damage components.
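For the LED case in particular, the current-limiting resistor comes straight from Ohm's law: R = (Vsupply − Vf) / I, with the current converted from mA to A. A small sketch (the 2 V forward drop is a typical figure for a red LED, not a universal value):

```python
def led_resistor_ohms(supply_v: float, forward_v: float, current_ma: float) -> float:
    """Resistor sizing via Ohm's law: R = (Vsupply - Vf) / I, I in amps."""
    return (supply_v - forward_v) / (current_ma / 1000)

# A red LED (about 2 V forward drop) at 15 mA from a 5 V supply:
print(led_resistor_ohms(5.0, 2.0, 15))  # 200.0 ohms
```

In practice you would round up to the nearest standard resistor value (e.g. 220 Ω) to stay safely under the limit.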