Gigabit to Megabit
Quick Reference Table (Gigabit to Megabit)
| Gigabit (Gb) | Megabit (Mb) |
|---|---|
| 0.1 | 100 |
| 0.5 | 500 |
| 1 | 1,000 |
| 2.5 | 2,500 |
| 10 | 10,000 |
| 25 | 25,000 |
| 100 | 100,000 |
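The table above is a straight SI multiplication by 1,000. A minimal sketch of the conversion (the function name is illustrative, not from any library):

```python
def gigabits_to_megabits(gb: float) -> float:
    """Convert gigabits to megabits using SI decimal units (1 Gb = 1,000 Mb)."""
    return gb * 1_000

# Reproduce the quick-reference rows
for gb in (0.1, 0.5, 1, 2.5, 10, 25, 100):
    print(f"{gb} Gb = {gigabits_to_megabits(gb):,.0f} Mb")
```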
About Gigabit (Gb)
A gigabit (Gb or Gbit) equals 1,000,000,000 bits (10⁹ bits) in the SI system. It is the standard unit for high-speed networking: home broadband is marketed in gigabits (1 Gbps, 2.5 Gbps), data center switches operate at 10–400 Gbps, and optical fiber backbone links run at terabit speeds. Network interface cards (NICs) in modern computers and servers are typically rated at 1 Gbps or 10 Gbps. A 1 Gbps link can transfer roughly 125 MB per second — sufficient to copy a 1 GB file in about 8 seconds under ideal conditions.
A 1 Gbps home broadband plan delivers up to 125 MB/s download speed. Most modern Ethernet ports on laptops support 1 Gbps.
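The 125 MB/s and 8-second figures above follow from dividing bits by 8. A quick check, assuming an ideal link with no protocol overhead:

```python
LINK_MBPS = 1_000   # 1 Gbps link expressed in megabits per second
FILE_GB = 1         # 1 GB file (decimal gigabytes)

throughput_mb_per_s = LINK_MBPS / 8                       # bits -> bytes: 125 MB/s
transfer_seconds = FILE_GB * 1_000 / throughput_mb_per_s  # 1 GB = 1,000 MB
print(f"{throughput_mb_per_s:.0f} MB/s; ~{transfer_seconds:.0f} s per {FILE_GB} GB file")
```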
About Megabit (Mb)
A megabit (Mb or Mbit) equals 1,000,000 bits (1,000 kilobits) in the SI system. It is the standard unit for expressing broadband internet speeds and Wi-Fi throughput. Most internet service providers advertise download and upload speeds in megabits per second (Mbps). A 100 Mbps connection can theoretically download 100 megabits — about 12.5 megabytes — per second. Video streaming quality is also expressed in megabits: standard HD requires roughly 5 Mbps; 4K streaming requires 15–25 Mbps.
A 50 Mbps broadband plan delivers roughly 6.25 MB/s of download speed. Netflix recommends about 5 Mbps for HD and 25 Mbps for 4K streaming.
Gigabit – Frequently Asked Questions
Is 1 Gbps internet fast enough for a household?
1 Gbps (gigabit) broadband delivers up to 125 MB/s, which is more than sufficient for most households. It supports dozens of simultaneous 4K streams, fast game downloads, and video conferencing with headroom to spare. The limiting factor is usually the Wi-Fi router (Wi-Fi 5 maxes out around 400–600 Mbps in practice) or the speed of the remote server you're downloading from.
What is a 10-gigabit network used for?
10 Gbps networking is standard in data centers, server interconnects, and high-performance workstations doing large file transfers (video editing, database backups). It is increasingly available in prosumer home networking equipment. At 10 Gbps, a 1 TB file transfer takes about 13 minutes under ideal conditions.
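The 13-minute figure works out as follows, again assuming an ideal, fully saturated link:

```python
file_tb = 1        # file size in terabytes (decimal)
link_gbps = 10     # link speed in gigabits per second

gigabits = file_tb * 8_000      # 1 TB = 8,000 gigabits
seconds = gigabits / link_gbps  # 800 s
print(f"~{seconds / 60:.1f} minutes")  # ~13.3 minutes
```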
How many gigabits are in a terabit?
One terabit equals 1,000 gigabits (SI). Terabit-per-second (Tbps) speeds are used in long-haul fiber optic cables and internet backbone infrastructure. A single transatlantic fiber cable typically carries hundreds of terabits per second across many multiplexed channels.
How do Wi-Fi generations (Wi-Fi 5/6/6E/7) compare in gigabit throughput?
Wi-Fi 5 (802.11ac) delivers up to 3.5 Gbps theoretical, but typically 400–600 Mbps real-world on a single device. Wi-Fi 6 (802.11ax) reaches 9.6 Gbps theoretical and 600–900 Mbps practical per device, with better multi-device handling via OFDMA. Wi-Fi 6E extends the same technology into the uncongested 6 GHz band, improving real-world speeds to 1–2 Gbps. Wi-Fi 7 (802.11be) pushes the theoretical maximum to 46 Gbps using 320 MHz channels and 4096-QAM, with real-world single-device speeds expected around 2–5 Gbps — the first Wi-Fi standard to reliably exceed gigabit in practice.
Why do data centers use 100 Gbps and above?
Modern data centers handle enormous simultaneous traffic between thousands of servers — cloud computing, video streaming, and AI training all require massive internal bandwidth. 100 Gbps links between switches are now standard; 400 Gbps is increasingly deployed for spine connections. A single 400 Gbps link can move 50 GB of data per second, keeping pace with NVMe storage arrays and GPU memory transfer rates.
Megabit – Frequently Asked Questions
How do I convert Mbps to MB/s?
Divide Mbps by 8 to get megabytes per second (MB/s). A 100 Mbps connection = 12.5 MB/s. A 1 Gbps connection = 125 MB/s. This conversion is essential when comparing advertised internet speeds (always in Mbps) to actual file download speeds (shown in MB/s by browsers and download managers).
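The divide-by-8 rule as a one-liner (the helper name is illustrative):

```python
def mbps_to_mb_per_s(mbps: float) -> float:
    """Megabits per second -> megabytes per second (8 bits per byte)."""
    return mbps / 8

print(mbps_to_mb_per_s(100))    # 100 Mbps plan -> 12.5 MB/s
print(mbps_to_mb_per_s(1_000))  # 1 Gbps plan  -> 125.0 MB/s
```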
What internet speed do I need for 4K streaming?
Netflix recommends 25 Mbps for 4K Ultra HD. Disney+ and Apple TV+ recommend 25 Mbps; YouTube recommends 20 Mbps for 4K. These are per-stream figures — a household streaming two 4K sources simultaneously needs roughly 50 Mbps of reliable throughput, plus headroom for other devices.
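A simple household bandwidth budget built from the per-stream figures above; the headroom allowance for other devices is an assumption, not a published recommendation:

```python
STREAM_4K_MBPS = 25   # per-stream 4K recommendation cited above
streams = 2           # two simultaneous 4K streams
headroom_mbps = 10    # assumed allowance for browsing, calls, updates

required_mbps = streams * STREAM_4K_MBPS + headroom_mbps
print(f"Plan for at least {required_mbps} Mbps of reliable throughput")
```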
Why is my download speed slower than my advertised Mbps?
ISP speed ratings are theoretical maximums under ideal conditions. Real-world factors include network congestion, router quality, Wi-Fi interference, the server's upload speed, and protocol overhead. Additionally, browsers and download managers report speeds in MB/s (bytes), which is one-eighth of the Mbps figure — a 100 Mbps plan showing 11 MB/s in a browser is performing normally.
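A sanity check for the scenario above — convert the plan speed to MB/s, then compare against what the browser reports:

```python
plan_mbps = 100
measured_mb_per_s = 11  # speed shown by the browser's download manager

theoretical_mb_per_s = plan_mbps / 8  # 12.5 MB/s ceiling
efficiency = measured_mb_per_s / theoretical_mb_per_s
print(f"{efficiency:.0%} of the theoretical maximum")  # 88% — a healthy link
```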
How many megabits in a gigabit?
One gigabit equals 1,000 megabits (SI decimal system). Gigabit broadband (1 Gbps) = 1,000 Mbps = 125 MB/s theoretical download speed. In the binary IEC system, one gibibit = 1,024 mebibits — but for internet speeds the SI decimal values are always used.
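The SI-versus-IEC distinction in two constants:

```python
SI_MEGABITS_PER_GIGABIT = 1_000    # decimal: used for all network speeds
IEC_MEBIBITS_PER_GIBIBIT = 1_024   # binary: gibibit (Gib) to mebibit (Mib)

print(f"1 Gb  = {SI_MEGABITS_PER_GIGABIT} Mb (SI)")
print(f"1 Gib = {IEC_MEBIBITS_PER_GIBIBIT} Mib (IEC)")
```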
How do fiber, cable, and DSL compare in real-world megabit throughput?
Fiber-to-the-home (FTTH) delivers symmetric speeds of 100–10,000 Mbps with consistent performance regardless of distance from the exchange. Cable (DOCSIS 3.1) offers 100–1,200 Mbps download but typically 10–50 Mbps upload, and throughput degrades during neighborhood peak hours due to shared bandwidth. DSL (VDSL2) maxes out at 50–100 Mbps download and drops sharply beyond 500 meters from the DSLAM cabinet. In practice, most cable users see 60–80% of advertised speeds; DSL users at distance may see under 50%. Fiber is the only technology that reliably delivers its rated megabit throughput.