Therm (EC) to Watt Hour
Quick Reference Table (Therm (EC) to Watt Hour)
| Therm (EC) (thm-ec) | Watt Hour (Wh) |
|---|---|
| 0.1 | 2,930.71 |
| 0.5 | 14,653.56 |
| 1 | 29,307.11 |
| 5 | 146,535.56 |
| 10 | 293,071.11 |
| 50 | 1,465,355.56 |
| 100 | 2,930,711.11 |
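The whole table follows from a single constant. A minimal Python sketch, using this page's definition of 105,505,600 J per therm (EC) and 3,600 J per Wh:

```python
# Constants from this page's definitions
THERM_EC_J = 105_505_600  # joules per therm (EC)
WH_J = 3_600              # joules per watt-hour

def therm_ec_to_wh(therms: float) -> float:
    """Convert therms (EC) to watt-hours."""
    return therms * THERM_EC_J / WH_J

def wh_to_therm_ec(wh: float) -> float:
    """Convert watt-hours to therms (EC)."""
    return wh * WH_J / THERM_EC_J
```

For example, `therm_ec_to_wh(1)` reproduces the table's 1-therm row, about 29,307.11 Wh.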
About Therm (EC) (thm-ec)
The therm (EC) is an energy unit defined by the European Community as exactly 105,505,600 joules (approximately 100,000 BTU). It is used for natural gas billing and trading in European energy markets. Gas meters in the UK traditionally measured in cubic feet or therms before metrication moved billing to kWh. One therm (EC) equals 29.3 kWh and is roughly the energy content of about 100 cubic feet of natural gas.
A UK gas bill covering heating and hot water might show 500–800 therms of consumption per year for an average home. One therm heats roughly 300 liters of water from mains-cold to bath-hot, once typical boiler losses are accounted for.
About Watt Hour (Wh)
A watt-hour (Wh) is the energy consumed or produced by a one-watt device operating for one hour, equal to 3,600 joules. It is widely used for small battery and energy storage capacities — smartphone batteries, power banks, and small electronic devices. A smartphone battery holds roughly 10–15 Wh; a laptop 50–100 Wh. The watt-hour is the stepping-stone unit between the joule (too small for practical appliance use) and the kilowatt-hour (the billing unit for mains electricity).
A phone charger running for an hour uses about 5–10 Wh. A 100 Wh portable power bank can charge a typical smartphone about seven times.
Therm (EC) – Frequently Asked Questions
What is the difference between the EC therm and the US therm?
The EC therm is defined as exactly 105,505,600 joules; the US therm is 105,480,400 joules — a difference of 25,200 J (about 0.024%). The discrepancy arose from slightly different historical BTU definitions. For residential gas billing the difference is negligible, but in large-scale energy trading involving millions of therms, the distinction can affect settlement amounts.
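The gap between the two definitions can be checked directly. A quick sketch using the joule values above (the 10-million-therm trade size is an illustrative assumption):

```python
EC_THERM_J = 105_505_600  # EC therm, joules
US_THERM_J = 105_480_400  # US therm, joules

diff_j = EC_THERM_J - US_THERM_J       # 25,200 J per therm
diff_pct = 100 * diff_j / US_THERM_J   # ~0.024%

# On a hypothetical trade of 10 million therms, the definitional
# gap amounts to a non-trivial amount of energy in kWh:
gap_kwh = 10_000_000 * diff_j / 3_600_000
```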
Why did the UK switch from therms to kilowatt-hours for gas billing?
The UK Gas Act 1995 mandated a switch from therms to kWh as part of broader metrication. One therm (EC) equals 29.3071 kWh. The change aligned gas billing with electricity billing, making it easier for consumers to compare energy costs. Older UK customers and industry veterans still refer to therms colloquially, and wholesale gas markets continued using therms for years after the retail switch.
How many therms does a UK household use per year?
A typical UK home uses 500–800 therms (EC) per year for heating and hot water, equivalent to roughly 14,700–23,400 kWh. Well-insulated newer homes may use under 400 therms, while large Victorian houses with poor insulation can exceed 1,200 therms. Ofgem's energy price cap is set in pence per kWh, but converting back to therms gives about £2.50–£3.50 per therm at recent rates.
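The therm-to-kWh arithmetic behind these figures, as a small sketch. The 29.3071 factor is the Gas Act conversion discussed above; the pence-per-kWh rate passed in is whatever a given tariff charges, not a figure from this page:

```python
KWH_PER_THERM_EC = 29.3071  # kWh per therm (EC)

def therms_to_kwh(therms: float) -> float:
    """Convert annual gas use in therms (EC) to kWh."""
    return therms * KWH_PER_THERM_EC

def annual_cost_gbp(therms: float, pence_per_kwh: float) -> float:
    """Annual gas cost in pounds for a given unit rate in p/kWh."""
    return therms * KWH_PER_THERM_EC * pence_per_kwh / 100
```

At a hypothetical 10 p/kWh, one therm works out to about £2.93, consistent with the £2.50–£3.50 range above.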
How does the EC therm relate to cubic meters of natural gas?
One cubic meter of UK pipeline-quality natural gas contains roughly 38.5–39.5 MJ, which is about 0.365–0.374 therms (EC). Gas meters measure volume in cubic meters, and the utility applies a calorific value correction to convert to kWh (or therms). The correction factor varies by region and season because gas composition changes depending on the source field.
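The volume-to-energy correction can be sketched as a one-line conversion; the calorific value parameter stands in for the utility's declared regional figure:

```python
THERM_EC_MJ = 105.5056  # megajoules per therm (EC)

def m3_to_therms(volume_m3: float, calorific_mj_per_m3: float) -> float:
    """Convert metered gas volume (m³) to therms (EC) using the
    region's declared calorific value in MJ per cubic meter."""
    return volume_m3 * calorific_mj_per_m3 / THERM_EC_MJ
```

One cubic meter at the 38.5–39.5 MJ range quoted above gives roughly 0.365–0.374 therms.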
Is the therm still used in European energy markets?
The therm (EC) was once the standard trading unit on the UK's NBP (National Balancing Point) gas market. In 2020, the ICE exchange switched NBP contracts from pence per therm to pence per kWh. Continental European hubs like TTF have always traded in euros per MWh. The therm is fading from professional use but remains in legacy contracts and older billing systems.
Watt Hour – Frequently Asked Questions
Why are portable battery capacities listed in watt-hours instead of milliamp-hours?
Watt-hours account for both current and voltage, giving the true energy stored. A 10,000 mAh power bank at 3.7 V holds 37 Wh; at its 5 V output that same energy corresponds to only about 7,400 mAh, and voltage-conversion losses reduce the delivered capacity further still. Airlines use the Wh rating (max 100 Wh carry-on) because it reflects actual energy, and therefore actual fire risk, regardless of battery voltage.
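The mAh/Wh arithmetic can be sketched directly; the 85% converter efficiency below is an illustrative assumption, not a figure from this page:

```python
def mah_to_wh(mah: float, volts: float) -> float:
    """Energy in Wh from a capacity in mAh at a given cell voltage."""
    return mah * volts / 1000

def wh_to_mah(wh: float, volts: float) -> float:
    """Equivalent capacity in mAh at a given output voltage."""
    return wh / volts * 1000

# The 10,000 mAh / 3.7 V power bank from above:
wh = mah_to_wh(10_000, 3.7)    # ~37 Wh stored
ideal_5v = wh_to_mah(wh, 5.0)  # ~7,400 mAh at 5 V, before losses
delivered = ideal_5v * 0.85    # assumed 85% conversion efficiency
```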
How many watt-hours does a typical smartphone battery hold?
Most smartphones have batteries rated at 10–18 Wh. An iPhone 15 Pro holds about 12.7 Wh; a Samsung Galaxy S24 Ultra about 18.4 Wh. For context, fully charging an 18 Wh phone from a wall outlet draws only about 0.02 kWh, which costs roughly a third of a cent on a typical US electricity bill.
What is the airline limit for lithium batteries in watt-hours?
Most airlines allow lithium-ion batteries up to 100 Wh in carry-on luggage without approval. Batteries between 100 and 160 Wh (e.g., large camera or drone batteries) require airline permission, and batteries above 160 Wh are banned from passenger flights. A standard laptop battery is 50–100 Wh; a large power tool battery can exceed 160 Wh.
Why did the electronics industry settle on watt-hours instead of joules for battery labels?
Watt-hours map directly to how consumers think about devices: a 50 Wh battery powering a 10 W laptop lasts about 5 hours — simple division. Expressing the same battery as 180,000 joules gives no intuitive sense of runtime. Airlines also adopted Wh for lithium battery safety limits (100 Wh carry-on threshold) because it communicates energy density risk in a unit engineers and passengers can both grasp.
How many watt-hours does charging a laptop use, and what does it cost?
A typical laptop battery holds 50–100 Wh, so a full charge from empty uses 50–100 Wh of energy (plus about 10–15% lost as heat in the charger). At average US electricity rates, that is roughly 1–2 cents per charge. Over a year of daily charging, a laptop costs about $4–$7 in electricity — far less than most people assume.
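A sketch of that estimate; the electricity rate and charger efficiency below are assumed values for illustration, not figures from this page:

```python
def annual_charge_cost_usd(battery_wh: float,
                           rate_usd_per_kwh: float = 0.15,
                           charger_efficiency: float = 0.88,
                           charges_per_year: int = 365) -> float:
    """Yearly electricity cost of daily full charges.
    The 0.15 $/kWh rate and 88% charger efficiency are assumptions."""
    kwh_per_charge = battery_wh / 1000 / charger_efficiency
    return kwh_per_charge * rate_usd_per_kwh * charges_per_year
```

Under these assumptions, a 70 Wh battery charged daily costs a bit over $4 per year, and a 100 Wh battery a bit over $6, in line with the $4–$7 range above.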