Watt to Megawatt

1 W = 0.000001 MW (10⁻⁶ MW)
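The conversion is a straight division (or multiplication) by one million. A minimal Python sketch, with function names of our own choosing:

```python
def watts_to_megawatts(watts: float) -> float:
    """Convert watts to megawatts: 1 MW = 1,000,000 W."""
    return watts / 1_000_000

def megawatts_to_watts(megawatts: float) -> float:
    """Convert megawatts back to watts."""
    return megawatts * 1_000_000
```

For example, a 2,000 W kettle is 0.002 MW.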



Quick Reference Table (Watt to Megawatt)

Watt (W)    Megawatt (MW)
1           0.000001
10          0.00001
60          0.00006
100         0.0001
800         0.0008
1,200       0.0012
2,000       0.002

About Watt (W)

The watt (W) is the SI unit of power, defined as one joule of energy transferred per second. It is the universal unit for electrical power, covering everything from a 1 W LED indicator light to a 3,000 W electric shower. Power consumption of appliances, power station output, and solar panel ratings are all expressed in watts or its multiples. One watt equals one volt multiplied by one ampere in a DC circuit, linking power directly to the foundational electrical quantities.
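The volt-ampere relationship in the last sentence can be checked with a one-liner (a sketch; the function name is illustrative):

```python
def dc_power_watts(volts: float, amps: float) -> float:
    """DC power: P = V * I, so watts = volts * amps."""
    return volts * amps

# A 5 V USB port supplying 2 A delivers 10 W.
```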

A modern LED bulb uses 8–10 W to produce the same light as a 60 W incandescent. A laptop draws 30–65 W; a microwave oven 800–1,200 W.

Etymology: Named after Scottish engineer James Watt (1736–1819), whose improvements to the steam engine drove the Industrial Revolution. The unit was adopted by the Second Congress of the British Association for the Advancement of Science in 1889.

About Megawatt (MW)

A megawatt (MW) equals one million watts and is the standard unit for power station output, large industrial facilities, and grid-scale renewable energy. A single onshore wind turbine generates 2–5 MW at full capacity. A large gas peaker plant might output 100–500 MW. Data centers consume tens to hundreds of megawatts. Utility-scale solar and battery storage projects are sized in megawatts.

A 2 MW wind turbine at a 40% capacity factor produces about 580 MWh per month (0.8 MW average over roughly 730 hours). A large hospital might draw 10–30 MW of electrical power continuously.
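Monthly output follows from average power times hours in a month (~730 h). A sketch with illustrative names:

```python
def monthly_energy_mwh(nameplate_mw: float, capacity_factor: float,
                       hours: float = 730) -> float:
    """Energy over a month = nameplate * capacity factor * hours (MWh)."""
    return nameplate_mw * capacity_factor * hours
```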


Watt – Frequently Asked Questions

How much power does a phone charger use?

A standard USB charger draws 5–10 W, while fast chargers pull 18–65 W and some proprietary ones hit 120–240 W. The charger itself consumes about 0.1–0.3 W even when nothing is plugged in — so-called "vampire power." Over a year, a plugged-in-but-idle charger wastes roughly 2 kWh, costing pennies per charger, but multiplied across billions of chargers worldwide it adds up to gigawatt-hours of waste.
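The standby-waste arithmetic can be reproduced directly (a sketch; names are our own):

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def standby_waste_kwh(standby_watts: float) -> float:
    """kWh wasted per year by a device left plugged in continuously."""
    return standby_watts * HOURS_PER_YEAR / 1_000
```

At 0.25 W of standby draw, that comes to about 2.2 kWh per year.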

Why is the watt a separate unit from the joule per second?

Both are identical — 1 W = 1 J/s — but the watt was named in 1889 to honor James Watt, who quantified engine power decades before the joule was formalised. Giving power its own name made practical engineering simpler: saying "a 60-watt bulb" is far catchier than "a 60-joules-per-second bulb." The naming also followed a 19th-century tradition of naming units after scientists — the volt, ampere, ohm, and watt all date from this era.

How much power does the human body produce?

A resting adult generates about 80–100 W of thermal power, roughly equivalent to an old incandescent light bulb. During intense exercise this spikes to 300–500 W total metabolic output, though only 20–25% becomes mechanical work — the rest is waste heat. This is why a packed lecture hall gets stuffy fast: 200 students produce about 20 kW of heat, equivalent to running 20 space heaters.
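The lecture-hall figure is just a multiplication (a sketch; the 100 W per person default is the resting figure above):

```python
def crowd_heat_kw(people: int, watts_each: float = 100.0) -> float:
    """Thermal output of a room full of people, in kW."""
    return people * watts_each / 1_000
```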

How much power is in a lightning bolt?

A lightning flash delivers enormous instantaneous power — commonly estimated in the gigawatt-to-terawatt range during the return stroke — but each stroke lasts only tens of microseconds to a few milliseconds. The total energy per flash is surprisingly modest: roughly 1–5 billion joules, or about 280–1,400 kWh — days to weeks of a typical US household's usage. You could theoretically power a town for a second, but capturing it is impractical because the pulse is too brief and unpredictable.

What's the difference between a watt and a watt-hour?

Watts measure the rate of energy flow (like the speed of water through a pipe), while watt-hours measure total energy consumed over time (like the total volume of water). A 100 W bulb running for 10 hours uses 1,000 Wh (1 kWh). Your electricity bill charges per kWh, not per watt — so a 2,000 W heater running one hour costs the same as a 100 W lamp running 20 hours.
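The heater-versus-lamp comparison is easy to verify (a sketch; names are illustrative):

```python
def energy_kwh(watts: float, hours: float) -> float:
    """Energy billed = power (W) * time (h) / 1,000 -> kWh."""
    return watts * hours / 1_000

heater = energy_kwh(2_000, 1)   # 2 kWh
lamp = energy_kwh(100, 20)      # also 2 kWh: same cost on the bill
```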

Megawatt – Frequently Asked Questions

How many homes can one megawatt power?

In the US, roughly 750–1,000 homes (average consumption ~1.2 kW per home). In Europe, where usage is lower, 1 MW can serve 1,500–2,000 homes. But these are averages — on a hot summer afternoon when everyone cranks AC, that number can drop to 300–400 homes. Grid planners must size for peak demand, not averages, which is why installed capacity far exceeds average load.
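The homes-per-megawatt estimate is a unit conversion plus a division (a sketch; names are our own):

```python
def homes_served(megawatts: float, avg_home_kw: float) -> int:
    """Average homes served: convert MW to kW, divide by per-home draw."""
    return int(megawatts * 1_000 / avg_home_kw)
```

With the ~1.2 kW US average, 1 MW works out to roughly 833 homes, consistent with the 750–1,000 range above.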

How much power does a data center use?

A small data center uses 1–5 MW; a large hyperscale facility (Google, AWS, Microsoft) draws 50–200 MW — some exceeding 300 MW. The entire US data center industry consumed about 17 GW in 2023, roughly 4% of national electricity. AI training clusters are pushing demand higher: a single large GPU cluster can draw 50–100 MW, and planned AI-focused campuses target 1 GW or more.

How much power does a wind turbine produce?

Onshore turbines typically rate 2–6 MW; the latest offshore giants reach 14–16 MW per turbine. Vestas' V236-15.0 MW turbine has a rotor diameter of 236 meters — wider than two football fields. A single sweep of its blades can generate enough electricity to run a UK household for two days. Capacity factors run 25–45% onshore and 40–55% offshore, so actual average output is roughly half the nameplate rating.
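"Roughly half the nameplate rating" is just nameplate times capacity factor (a sketch; the function name is illustrative):

```python
def average_output_mw(nameplate_mw: float, capacity_factor: float) -> float:
    """Long-run average output = nameplate rating * capacity factor."""
    return nameplate_mw * capacity_factor
```

A 15 MW offshore turbine at a 50% capacity factor averages 7.5 MW over the year.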

How much power does a nuclear reactor produce?

Most operating reactors produce 500–1,400 MW of electrical power. France's largest plant, Gravelines, has six reactors totalling 5,460 MW. Small Modular Reactors (SMRs) being developed target 50–300 MW each. Nuclear plants run at 85–95% capacity factor — far higher than wind (~35%) or solar (~25%) — meaning a 1,000 MW reactor actually delivers about 900 MW on average.

What's the difference between MW and MWh for a grid battery?

MW tells you the maximum instantaneous power the battery can deliver (how fast it can discharge), while MWh tells you total stored energy (how long it can sustain that output). A 100 MW / 400 MWh battery can deliver 100 MW for 4 hours, or 50 MW for 8 hours. Grid operators care about both: MW for handling sudden demand spikes, MWh for sustained backup during extended outages or evening solar fade.
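The duration figures above fall out of dividing stored energy by discharge power (a sketch; names are our own):

```python
def discharge_hours(energy_mwh: float, power_mw: float) -> float:
    """Hours a battery can sustain a given discharge power."""
    return energy_mwh / power_mw

# A 100 MW / 400 MWh battery: 4 h at full power, 8 h at half power.
```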

© 2026 TopConverters.com. All rights reserved.