Terawatt to Watt
Quick Reference Table (Terawatt to Watt)
| Terawatt (TW) | Watt (W) |
|---|---|
| 0.001 | 1,000,000,000 |
| 0.01 | 10,000,000,000 |
| 0.1 | 100,000,000,000 |
| 1 | 1,000,000,000,000 |
| 9 | 9,000,000,000,000 |
| 18 | 18,000,000,000,000 |
| 173,000 | 173,000,000,000,000,000 |
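Every row above follows from the fixed factor 1 TW = 10¹² W. A minimal conversion sketch in Python (function names are illustrative):

```python
W_PER_TW = 1e12  # 1 terawatt = 10**12 watts

def terawatts_to_watts(tw: float) -> float:
    """Convert terawatts to watts (1 TW = 10**12 W)."""
    return tw * W_PER_TW

def watts_to_terawatts(w: float) -> float:
    """Convert watts to terawatts."""
    return w / W_PER_TW

# Reproduce rows of the quick-reference table:
print(terawatts_to_watts(1))        # 1e12 W
print(terawatts_to_watts(18))       # 1.8e13 W
print(watts_to_terawatts(1.73e17))  # ~173,000 TW
```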
About Terawatt (TW)
A terawatt (TW) equals one trillion watts and is used to express global and continental energy consumption and total planetary power flux. Human civilisation's total rate of energy use is approximately 18 TW. The Sun delivers about 173,000 TW of power to the Earth's surface. National electricity grids operate at tens of gigawatts; continental-scale grids and global energy statistics require terawatt-scale framing. Ambitious long-term energy-transition scenarios describe targets in terawatts of clean capacity.
Global electricity generation capacity is approximately 9 TW. Total human energy use (all forms — electricity, heat, transport) is about 18 TW.
About Watt (W)
The watt (W) is the SI unit of power, defined as one joule of energy transferred per second. It is the universal unit for electrical power, covering everything from a 1 W LED indicator light to a 3,000 W electric shower. Power consumption of appliances, power station output, and solar panel ratings are all expressed in watts or their multiples. One watt equals one volt multiplied by one ampere in a DC circuit, linking power directly to the foundational electrical quantities.
A modern LED bulb uses 8–10 W to produce the same light as a 60 W incandescent. A laptop draws 30–65 W; a microwave oven 800–1,200 W.
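The two definitions in this section (1 W = 1 J/s, and P = V × I for a DC circuit) can be checked numerically. A small sketch with illustrative values:

```python
def power_from_energy(joules: float, seconds: float) -> float:
    """P = E / t: one joule per second is one watt."""
    return joules / seconds

def power_dc(volts: float, amps: float) -> float:
    """P = V * I for a DC circuit."""
    return volts * amps

print(power_from_energy(60, 1))  # a 60 W bulb transfers 60 J every second
print(power_dc(230, 13))         # a 230 V circuit at 13 A: 2990 W, shower-scale
```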
Etymology: Named after Scottish engineer James Watt (1736–1819), whose improvements to the steam engine drove the Industrial Revolution. The unit was adopted by the Second Congress of the British Association for the Advancement of Science in 1889.
Terawatt – Frequently Asked Questions
How much of the Sun's power hitting Earth would we need to capture?
The Sun delivers about 173,000 TW to Earth's surface. Human civilisation uses roughly 18 TW total. So we'd only need to capture about 0.01% of incoming solar energy to power everything: an area of solar panels roughly 400 km × 400 km, about the area of Tunisia. The challenge isn't total energy availability; it's cost, storage, transmission, and the fact that sunlight is spread thin and intermittent.
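The 0.01% figure follows directly from the two power numbers; a quick check:

```python
SOLAR_INPUT_TW = 173_000  # solar power reaching Earth's surface
HUMAN_USE_TW = 18         # total human power consumption

fraction = HUMAN_USE_TW / SOLAR_INPUT_TW
print(f"{fraction:.6f}")         # ~0.000104
print(f"{fraction * 100:.3f}%")  # ~0.010% of incoming sunlight
```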
What does 18 terawatts of human power consumption actually mean?
Imagine 180 billion 100 W bulbs burning continuously, or 9 billion people each running a 2 kW heater non-stop. That 18 TW figure includes everything: electricity, transport fuel, industrial heat, cooking, heating. Roughly 30% comes from oil, 27% from coal, 23% from gas, and the rest from nuclear and renewables. The US alone accounts for about 3 TW despite having only 4% of world population.
How many terawatts of solar would end climate change?
Replacing all 18 TW of human energy with clean sources would require roughly 60–75 TW of installed solar capacity (accounting for ~25% average capacity factor). That's about 40 times current installed solar. At 2023 installation rates of ~0.4 TW/year, it would take 150 years — but installation rates are doubling every 2–3 years. If that exponential trend holds, we could theoretically reach 60 TW of solar within 15–20 years.
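The 15–20 year estimate can be sanity-checked by summing an annual installation rate that doubles every ~2.5 years, starting from ~0.4 TW/year. A sketch under those stated assumptions (the function name and defaults are illustrative):

```python
def years_to_capacity(target_tw: float, initial_rate: float = 0.4,
                      doubling_years: float = 2.5) -> int:
    """Years until cumulative installs reach target_tw, with the annual
    installation rate doubling every `doubling_years` years."""
    installed, rate, year = 0.0, initial_rate, 0
    while installed < target_tw:
        installed += rate                  # this year's installs
        rate *= 2 ** (1 / doubling_years)  # rate keeps growing exponentially
        year += 1
    return year

print(years_to_capacity(60))  # falls inside the 15-20 year range claimed above
```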
What is Earth's total internal heat flow in terawatts?
Earth radiates about 47 TW of geothermal heat from its interior, driven by radioactive decay and residual primordial heat. That's 2.5× human energy consumption, but it's spread across the entire surface at extremely low density (~0.09 W/m²). Iceland, sitting atop a mantle plume, exploits geothermal for 90% of its heating. Globally, geothermal electricity capacity is only about 16 GW — a tiny fraction of what's theoretically available.
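The ~0.09 W/m² density comes from spreading 47 TW over Earth's total surface area (about 5.1 × 10¹⁴ m²); a quick check:

```python
EARTH_SURFACE_M2 = 5.1e14  # Earth's total surface area in square metres
GEOTHERMAL_TW = 47         # internal heat flow

flux = GEOTHERMAL_TW * 1e12 / EARTH_SURFACE_M2
print(f"{flux:.3f} W/m^2")  # ~0.092 W/m^2
```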
Has human power consumption always been measured in terawatts?
No — the terawatt scale is a very recent phenomenon. In 1800, global human power consumption was about 0.5 TW (mostly biomass burning). By 1900 it reached 1 TW with coal industrialisation. We crossed 10 TW around 1985. The jump from 1 to 18 TW in just 120 years tracks almost perfectly with global population growth times rising per-capita energy use. Pre-industrial humans used about 0.1 kW each; Americans now average 10 kW per person.
Watt – Frequently Asked Questions
How many watts does a phone charger actually use?
A standard USB charger draws 5–10 W, while fast chargers pull 18–65 W and some proprietary ones hit 120–240 W. The charger itself consumes about 0.1–0.3 W even when nothing is plugged in — so-called "vampire power." Over a year, a plugged-in-but-idle charger wastes roughly 2 kWh, costing pennies but multiplied across billions of chargers worldwide it adds up to gigawatt-hours of waste.
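The "roughly 2 kWh" of annual vampire power is a continuous draw multiplied by the hours in a year; a minimal check:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_kwh(watts: float) -> float:
    """Energy in kWh of a load drawing `watts` continuously for one year."""
    return watts * HOURS_PER_YEAR / 1000

print(annual_kwh(0.2))  # ~1.75 kWh/year for a 0.2 W idle charger
```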
Why is a watt called a watt and not a joule per second?
Both are identical — 1 W = 1 J/s — but the watt was named in 1889 to honor James Watt, who quantified engine power decades before the joule was formalised. Giving power its own name made practical engineering simpler: saying "a 60-watt bulb" is far catchier than "a 60-joules-per-second bulb." The naming also followed a 19th-century tradition of honoring scientists with SI units — volt, ampere, ohm, and watt all came from this era.
What wattage does a human body produce?
A resting adult generates about 80–100 W of thermal power, roughly equivalent to an old incandescent light bulb. During intense exercise this spikes to 300–500 W total metabolic output, though only 20–25% becomes mechanical work — the rest is waste heat. This is why a packed lecture hall gets stuffy fast: 200 students produce about 20 kW of heat, equivalent to running 20 space heaters.
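The lecture-hall figure is just headcount times per-person heat output; a one-line check:

```python
def crowd_heat_kw(people: int, watts_each: float = 100) -> float:
    """Total metabolic heat output of a crowd in kW (~100 W per resting adult)."""
    return people * watts_each / 1000

print(crowd_heat_kw(200))  # 20.0 kW, the packed-lecture-hall figure
```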
How many watts is a lightning bolt?
A single lightning stroke delivers on the order of a trillion watts (~1 TW) of peak power, but the return stroke lasts only tens of microseconds and the whole flash a few milliseconds. The total energy per bolt is surprisingly modest: roughly 1–5 billion joules, equivalent to about 280–1,400 kWh, or from about a week to a month of a typical US household's electricity. You could theoretically power a town for a second, but capturing it is impractical because the pulse is too brief and unpredictable.
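The joule-to-kilowatt-hour conversion behind the household comparison uses 1 kWh = 3.6 MJ; a quick check on the ~1 GJ figure:

```python
J_PER_KWH = 3.6e6  # 1 kWh = 3.6 megajoules

def joules_to_kwh(joules: float) -> float:
    """Convert joules to kilowatt-hours."""
    return joules / J_PER_KWH

print(joules_to_kwh(1e9))  # ~278 kWh for a ~1 GJ lightning bolt
```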
What is the difference between watts and watt-hours?
Watts measure the rate of energy flow (like the speed of water through a pipe), while watt-hours measure total energy consumed over time (like the total volume of water). A 100 W bulb running for 10 hours uses 1,000 Wh (1 kWh). Your electricity bill charges per kWh, not per watt — so a 2,000 W heater running one hour costs the same as a 100 W lamp running 20 hours.
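The heater-versus-lamp comparison is energy = power × time; a minimal check:

```python
def energy_kwh(watts: float, hours: float) -> float:
    """Energy = power x time, converted to kWh (what the bill charges for)."""
    return watts * hours / 1000

heater = energy_kwh(2000, 1)  # 2,000 W heater running one hour
lamp = energy_kwh(100, 20)    # 100 W lamp running twenty hours
print(heater, lamp)           # both 2.0 kWh, so both cost the same
```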