Watt per volt to Ampere
Quick Reference Table (Watt per volt to Ampere)
| Watt per volt (W/V) | Ampere (A) |
|---|---|
| 0.1 | 0.1 |
| 1 | 1 |
| 5 | 5 |
| 10 | 10 |
| 20 | 20 |
| 100 | 100 |
About Watt per volt (W/V)
The watt per volt (W/V) equals one ampere, derived from the power relationship P = IV rearranged as I = P/V. A device consuming 60 W at 120 V draws 0.5 W/V = 0.5 A. The W/V form is most useful when calculating branch currents from known power ratings and supply voltages — for appliance load calculations, transformer secondary currents, or power budget analysis on a circuit board. Numerically identical to the ampere, it provides an alternative view emphasising the power-per-volt character of current and is common in power electronics and electrical installation design.
A 100 W light bulb on a 230 V supply draws approximately 0.43 W/V. A 60 W laptop adapter at 20 V delivers 3 W/V to the device.
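The I = P/V relationship above can be sketched as a small helper function; the function name is illustrative, not from any particular library:

```python
def current_from_power(power_w: float, voltage_v: float) -> float:
    """Return current in amperes (numerically, watts per volt)."""
    return power_w / voltage_v

# Examples from the text above:
print(current_from_power(60, 120))             # 0.5 A (60 W bulb on 120 V)
print(round(current_from_power(100, 230), 2))  # 0.43 A (100 W bulb on 230 V)
print(current_from_power(60, 20))              # 3.0 A (60 W adapter at 20 V)
```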
About Ampere (A)
The ampere (A) is the SI base unit of electric current, one of the seven fundamental units in the International System. Since the 2019 SI redefinition, one ampere is exactly the flow of 1/(1.602176634×10⁻¹⁹) elementary charges per second, fixing the elementary charge precisely. In practice, a 100 W bulb at 240 V draws about 0.4 A; a domestic kettle draws 8–13 A; household ring circuits are protected at 20–32 A; car starter motors demand brief surges of 100–200 A. The ampere defines related units: one volt across one ohm yields one ampere (Ohm's law), and one ampere for one second transfers one coulomb of charge.
A smartphone fast charger delivers 2–5 A. A household circuit breaker protects wiring rated at 10–32 A.
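The 2019 definition makes the electron count behind one ampere a direct calculation:

```python
E_CHARGE = 1.602176634e-19  # elementary charge in coulombs, exact since 2019

# One ampere is one coulomb per second, so:
electrons_per_second = 1 / E_CHARGE
print(f"1 A = {electrons_per_second:.6e} elementary charges per second")
# ≈ 6.241509e+18
```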
Etymology: Named after André-Marie Ampère (1775–1836), French physicist and mathematician who formulated Ampère's circuital law relating magnetic fields to the electric currents that produce them. The ampere was adopted as a practical electrical unit at the International Electrical Congress in 1881.
Watt per volt – Frequently Asked Questions
Why would an electrician think in watts per volt?
When sizing circuits, electricians know the appliance power (watts from the nameplate) and the supply voltage (120 V or 230 V). Dividing watts by volts gives the current in amps — which is what determines wire gauge and breaker size. "1,800 W ÷ 120 V = 15 A, so I need a 20 A circuit" is daily electrician math.
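That daily math can be sketched as a small sizing function. The 125% factor and the breaker-size list are illustrative assumptions reflecting common US practice, not a citation of any electrical code:

```python
def required_breaker(power_w: float, voltage_v: float,
                     sizes=(15, 20, 30, 40, 50)) -> int:
    """Smallest standard breaker at or above 125% of the load current.

    Sizing continuous loads at 125% mirrors common US practice
    (an assumption in this sketch, not a code citation).
    """
    amps = 1.25 * power_w / voltage_v
    return next(s for s in sizes if s >= amps)

print(required_breaker(1800, 120))  # 15 A load -> 18.75 A sized -> 20 A breaker
```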
Is watts per volt ever written on any product label?
No — product labels list watts, volts, and amps separately. The W/V expression lives in textbooks and engineering calculations. But every time you read "1,500 W, 120 V" on a space heater and mentally divide to get 12.5 A, you are computing watts per volt without calling it that.
Does the watts-per-volt calculation work for AC power?
Only approximately. For AC, real power (watts) = V × I × power factor. So I = W / (V × PF). A motor rated at 1,000 W with a power factor of 0.85 on 230 V actually draws 1,000 / (230 × 0.85) = 5.1 A, not the 4.35 A that simple W/V would suggest. Always account for power factor in AC circuits.
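The power-factor correction above, as a one-line function:

```python
def ac_current(power_w: float, voltage_v: float, power_factor: float = 1.0) -> float:
    """AC line current: I = P / (V * PF). PF = 1.0 reproduces plain W/V."""
    return power_w / (voltage_v * power_factor)

print(round(ac_current(1000, 230, 0.85), 1))  # 5.1 A (motor with PF 0.85)
print(round(ac_current(1000, 230), 2))        # 4.35 A (naive W/V, too low)
```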
How does watts per volt help with USB power delivery calculations?
USB PD negotiates voltage levels (5 V, 9 V, 15 V, 20 V) and maximum power (up to 240 W). Dividing the negotiated power by voltage gives the cable current: 100 W at 20 V = 5 A, requiring a 5 A rated cable. At 5 V the same 100 W would need 20 A — which is why PD uses higher voltages.
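A quick sweep over the standard PD voltage levels shows why higher voltages win:

```python
PD_VOLTAGES = (5, 9, 15, 20)  # standard USB PD fixed voltage levels

def cable_current(power_w: float, voltage_v: float) -> float:
    """Cable current needed to carry a negotiated power at a given voltage."""
    return power_w / voltage_v

for v in PD_VOLTAGES:
    print(f"100 W at {v:2d} V -> {cable_current(100, v):5.1f} A")
# 20 V needs only 5.0 A; 5 V would need an impractical 20.0 A
```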
What is the relationship between watts per volt and Ohm's law?
From P = IV and V = IR, you get I = P/V = V/R = √(P/R) (the last from P = I²R). The W/V form is just one of many equivalent expressions for current. Which one you use depends on what you know: power and voltage gives W/V, voltage and resistance gives V/R (Ohm's law directly).
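A numeric check that the three expressions agree, using an arbitrary illustrative circuit (120 V across 960 Ω):

```python
import math

V, R = 120.0, 960.0
P = V**2 / R            # 15.0 W dissipated

i1 = P / V              # watts per volt
i2 = V / R              # Ohm's law
i3 = math.sqrt(P / R)   # from P = I^2 * R

print(i1, i2, i3)       # all three give 0.125 A
```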
Ampere – Frequently Asked Questions
Why was the ampere redefined in 2019?
The old definition relied on a thought experiment — infinite parallel wires 1 meter apart — that was impossible to realize exactly in a lab. The 2019 redefinition fixed the elementary charge at exactly 1.602176634×10⁻¹⁹ coulombs, linking the ampere to a countable number of electrons per second and enabling more precise quantum-based measurements.
How many amps does a house use at peak?
A typical US home has a 200-amp service panel. Peak usage — oven, dryer, AC, and water heater all running — might hit 80–150 A across all circuits combined. The 200 A main breaker protects the service entrance cable. European homes typically have 32–63 A single-phase service at 230 V, which works out to roughly 7–14.5 kW of capacity.
Why do electricians say "it is the amps that kill you, not the volts"?
Current through the heart causes fibrillation and death — as little as 0.1 A at 50/60 Hz. But voltage drives that current through your body's resistance (~1,000–100,000 ohms depending on conditions). So you need enough voltage to push lethal current through skin resistance. Both matter; the saying is a simplification.
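The voltage-versus-current trade-off can be checked with Ohm's law, using the body-resistance range quoted above (the figures are illustrative, not a safety guideline):

```python
def body_current(voltage_v: float, resistance_ohm: float) -> float:
    """Current through the body by Ohm's law, I = V / R."""
    return voltage_v / resistance_ohm

# 230 V mains across the quoted resistance range:
print(body_current(230, 1_000))    # 0.23 A  — above the ~0.1 A fibrillation level
print(body_current(230, 100_000))  # 0.0023 A — well below it
```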
What happens inside a circuit breaker the instant current exceeds its rating?
A thermal-magnetic breaker has two trip mechanisms. For sustained overloads (e.g., 20 A on a 15 A breaker), a bimetallic strip slowly heats and bends until it releases the latch — taking seconds to minutes depending on the overload. For short circuits (hundreds of amps), an electromagnet yanks the latch open in milliseconds. The contacts separate and an arc forms; arc chutes — stacked steel plates — split the arc into segments, cool it, and extinguish it within one AC cycle (16–20 ms). Modern breakers can interrupt 10,000–65,000 A fault currents.
How does a clamp meter measure amps without touching the wire?
A clamp meter wraps a magnetic core around a current-carrying conductor. AC current creates an alternating magnetic field that induces a proportional voltage in the clamp's pickup coil. Hall-effect clamp meters can also measure DC. No electrical contact needed — you just close the jaws around the insulated wire.