British Thermal Units to Watt Hour
Quick Reference Table (British Thermal Units to Watt Hour)
| British Thermal Units (BTU) | Watt Hour (Wh) |
|---|---|
| 1 | 0.29307107 |
| 100 | 29.307107 |
| 1,000 | 293.07107 |
| 10,000 | 2,930.7107 |
| 100,000 | 29,307.107 |
| 1,000,000 | 293,071.07 |
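Because the factor is fixed, the conversion itself is a one-liner. Below is a minimal Python sketch using the IT definition of the BTU (1 BTU = 1,055.05585262 J) and 1 Wh = 3,600 J; the function names are just illustrative:

```python
# 1 BTU (IT) = 1,055.05585262 J and 1 Wh = 3,600 J, so:
BTU_TO_WH = 1055.05585262 / 3600  # ≈ 0.29307107 Wh per BTU

def btu_to_wh(btu: float) -> float:
    """Convert British thermal units to watt-hours."""
    return btu * BTU_TO_WH

def wh_to_btu(wh: float) -> float:
    """Convert watt-hours back to British thermal units."""
    return wh / BTU_TO_WH

print(btu_to_wh(1))       # 0.293071070172...
print(btu_to_wh(10_000))  # 2930.7107017222...
```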
About British Thermal Units (BTU)
The British thermal unit (BTU) is the amount of heat required to raise the temperature of one pound of water by one degree Fahrenheit, historically measured near water's maximum density (~39°F). One BTU equals approximately 1,055 joules. It remains the dominant unit for heating and cooling equipment in the United States: air conditioners, furnaces, heat pumps, and water heaters are all rated in BTU or BTU/hour. Natural gas prices in the US are quoted in dollars per million BTU (MMBtu).
A standard residential air conditioner is rated at 10,000–24,000 BTU/hour. Burning one kitchen match releases roughly 1 BTU of heat.
Etymology: Developed in the 19th century alongside the rise of steam engineering in Britain and the US, standardized as the energy needed to raise one pound of water by one degree Fahrenheit. The "British" name stuck even as the UK adopted SI units.
About Watt Hour (Wh)
A watt-hour (Wh) is the energy consumed or produced by a one-watt device operating for one hour, equal to 3,600 joules. It is widely used for small battery and energy storage capacities — smartphone batteries, power banks, and small electronic devices. A smartphone battery holds roughly 10–15 Wh; a laptop 50–100 Wh. The watt-hour is the stepping-stone unit between the joule (too small for practical appliance use) and the kilowatt-hour (the billing unit for mains electricity).
A phone charger running for an hour uses about 5–10 Wh. A 100 Wh portable power bank can charge a typical smartphone about seven times.
British Thermal Units – Frequently Asked Questions
Why are air conditioners rated in BTU instead of watts?
US HVAC manufacturers adopted BTU/hour because heating and cooling equipment has historically been rated by the heat it removes or adds, not by its electrical input. A 12,000 BTU/h window unit removes 12,000 BTU of heat per hour from a room; that figure directly tells you the cooling capacity. Watts measure the electrical power consumed, which is considerably lower than the thermal output because the refrigeration cycle moves several watts of heat for each watt of electricity, a ratio captured by the unit's EER. The convention stuck because the entire US supply chain uses it.
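To make the relationship concrete, here is a small Python sketch; the EER of 10 is an assumed, illustrative value rather than a spec for any particular unit:

```python
BTU_PER_H_TO_W = 0.29307107  # 1 BTU/h of heat flow ≈ 0.293 W

def thermal_watts(btu_per_hour: float) -> float:
    """Heat moved per second, expressed in watts."""
    return btu_per_hour * BTU_PER_H_TO_W

def electrical_watts(btu_per_hour: float, eer: float) -> float:
    """Electrical draw, given EER in BTU/h per watt of input."""
    return btu_per_hour / eer

# A 12,000 BTU/h window unit with an assumed EER of 10:
print(round(thermal_watts(12_000)))         # ≈ 3517 W of heat removed
print(round(electrical_watts(12_000, 10)))  # 1200 W of electricity consumed
```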
How many BTU does it take to heat a room?
A rough rule of thumb is 20 BTU per square foot of living space in a temperate climate. A 300 sq ft bedroom needs about 6,000 BTU/h; a 1,500 sq ft open-plan living area needs roughly 30,000 BTU/h. Actual requirements vary with insulation, ceiling height, climate zone, and window area. Poorly insulated older homes may need 30–40 BTU per square foot.
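As a quick calculator for the figures above, a rough sketch; the 20 BTU/h-per-square-foot default is the rule of thumb itself, not an engineering load calculation:

```python
def rough_cooling_load(square_feet: float, btu_per_sq_ft: float = 20) -> float:
    """Rule-of-thumb load: ~20 BTU/h per sq ft of temperate-climate space."""
    return square_feet * btu_per_sq_ft

print(rough_cooling_load(300))                    # 6000 BTU/h bedroom
print(rough_cooling_load(1_500))                  # 30000 BTU/h open-plan area
print(rough_cooling_load(800, btu_per_sq_ft=35))  # 28000 BTU/h, poorly insulated
```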
What is the difference between BTU and BTU/h?
BTU is a unit of energy (heat); BTU/h is a unit of power (rate of heat flow). When an air conditioner is labelled "12,000 BTU," the industry shorthand actually means 12,000 BTU per hour. Technically one BTU equals about 1,055 joules of energy, while 1 BTU/h equals about 0.293 watts. The distinction matters for energy calculations but is routinely blurred in product marketing.
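A short sketch of the energy-versus-power distinction, reusing the 12,000 BTU/h example:

```python
BTU_PER_H_TO_W = 0.29307107  # power: 1 BTU/h ≈ 0.293 W

def heat_moved_btu(btu_per_hour: float, hours: float) -> float:
    """Energy (BTU) = power (BTU/h) × time (h)."""
    return btu_per_hour * hours

# A "12,000 BTU" unit (really 12,000 BTU/h) running for 8 hours:
print(heat_moved_btu(12_000, 8))  # 96000 BTU of heat moved (energy)
print(12_000 * BTU_PER_H_TO_W)    # ≈ 3516.85 W (power, the rate)
```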
How does the BTU relate to natural gas pricing in the US?
US natural gas is priced in dollars per million BTU (MMBtu) at the wholesale level and in dollars per therm (100,000 BTU) on residential bills. One cubic foot of pipeline gas contains roughly 1,020 BTU. A Henry Hub benchmark price of $2.50/MMBtu, for example, works out to about $0.25 per therm wholesale; residential prices are higher after delivery and utility markups.
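The unit arithmetic reduces to simple ratios. A sketch, treating the $2.50/MMBtu figure above as an example price:

```python
BTU_PER_MMBTU = 1_000_000
BTU_PER_THERM = 100_000
BTU_PER_CUBIC_FOOT = 1_020  # typical pipeline-quality gas

def dollars_per_therm(dollars_per_mmbtu: float) -> float:
    return dollars_per_mmbtu * BTU_PER_THERM / BTU_PER_MMBTU

def dollars_per_cubic_foot(dollars_per_mmbtu: float) -> float:
    return dollars_per_mmbtu * BTU_PER_CUBIC_FOOT / BTU_PER_MMBTU

print(dollars_per_therm(2.50))       # 0.25, i.e. $0.25 per therm wholesale
print(dollars_per_cubic_foot(2.50))  # 0.00255, about a quarter-cent per cu ft
```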
Why does the UK no longer use British thermal units despite the name?
The UK metricated energy units in the 1970s–1990s, switching gas billing from therms (100,000 BTU) to kilowatt-hours and scientific work to joules. The "British" in BTU reflects 19th-century British steam engineering origins, not current usage. Today the BTU is almost exclusively an American unit, used for HVAC, gas pricing, and appliance ratings across the US.
Watt Hour – Frequently Asked Questions
Why are portable battery capacities listed in watt-hours instead of milliamp-hours?
Watt-hours account for both current and voltage, giving the true energy stored. A 10,000 mAh power bank at 3.7 V holds 37 Wh, but at a 5 V USB output it can deliver at most about 7,400 mAh (37 Wh ÷ 5 V), and somewhat less in practice after conversion losses. Airlines use the Wh rating (100 Wh carry-on maximum) because it reflects actual energy, and therefore actual fire risk, regardless of battery voltage.
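The underlying arithmetic is just charge times voltage. A minimal sketch (loss-free, so the 5 V figure is a ceiling):

```python
def mah_to_wh(mah: float, volts: float) -> float:
    """Energy (Wh) = charge (Ah) × voltage (V)."""
    return mah / 1000 * volts

def wh_to_mah(wh: float, volts: float) -> float:
    """Charge deliverable at a given output voltage, ignoring losses."""
    return wh / volts * 1000

stored = mah_to_wh(10_000, 3.7)  # 37.0 Wh at the 3.7 V cell voltage
print(stored)
print(wh_to_mah(stored, 5.0))    # 7400.0 mAh ceiling at a 5 V USB output
```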
How many watt-hours does a typical smartphone battery hold?
Most smartphones have batteries rated at 10–18 Wh. An iPhone 15 Pro holds about 12.7 Wh; a Samsung Galaxy S24 Ultra about 18.4 Wh. For context, fully charging an 18 Wh phone from a wall outlet uses only about 0.02 kWh, roughly a third of a cent on a typical US electricity bill.
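A sketch of that cost estimate; the 16¢/kWh rate and 85% charger efficiency are assumed round numbers, not measured values:

```python
def charge_cost_cents(battery_wh: float, cents_per_kwh: float = 16.0,
                      charger_efficiency: float = 0.85) -> float:
    """Cents to fully charge from empty, including charger losses."""
    kwh_from_wall = battery_wh / 1000 / charger_efficiency
    return kwh_from_wall * cents_per_kwh

print(round(charge_cost_cents(18.0), 2))  # ≈ 0.34 cents per full charge
```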
What is the airline limit for lithium batteries in watt-hours?
Most airlines allow lithium-ion batteries up to 100 Wh in carry-on luggage without approval. Batteries between 100 and 160 Wh (e.g., large camera or drone batteries) require airline permission, and batteries above 160 Wh are banned from passenger flights. A standard laptop battery is 50–100 Wh; a large power tool battery can exceed 160 Wh.
Why did the electronics industry settle on watt-hours instead of joules for battery labels?
Watt-hours map directly to how consumers think about devices: a 50 Wh battery powering a 10 W laptop lasts about 5 hours — simple division. Expressing the same battery as 180,000 joules gives no intuitive sense of runtime. Airlines also adopted Wh for lithium battery safety limits (100 Wh carry-on threshold) because it communicates energy density risk in a unit engineers and passengers can both grasp.
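The runtime arithmetic that makes watt-hours convenient, as a tiny sketch:

```python
def runtime_hours(battery_wh: float, load_watts: float) -> float:
    """Runtime (h) = stored energy (Wh) ÷ average draw (W)."""
    return battery_wh / load_watts

print(runtime_hours(50, 10))  # 5.0 hours for a 50 Wh battery at a 10 W load
print(50 * 3600)              # 180000 joules: the same energy, far less intuitive
```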
How many watt-hours does it cost to charge a laptop?
A typical laptop battery holds 50–100 Wh, so a full charge from empty uses 50–100 Wh of energy (plus about 10–15% lost as heat in the charger). At average US electricity rates, that is roughly 1–2 cents per charge. Over a year of daily charging, a laptop costs about $4–$7 in electricity — far less than most people assume.
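And the annual figure, under assumed round numbers (daily charging, 16¢/kWh, about 88% charger efficiency):

```python
def annual_charging_cost(battery_wh: float, charges_per_year: int = 365,
                         dollars_per_kwh: float = 0.16,
                         charger_efficiency: float = 0.88) -> float:
    """Dollars per year for one full charge per day, including losses."""
    kwh_per_year = battery_wh / 1000 / charger_efficiency * charges_per_year
    return kwh_per_year * dollars_per_kwh

print(round(annual_charging_cost(75), 2))   # ≈ 4.98 dollars for a mid-size laptop
print(round(annual_charging_cost(100), 2))  # ≈ 6.64 for a large one
```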