Megabit per second to Gibibyte per second
Quick Reference Table (Megabit per second to Gibibyte per second)
| Megabit per second (Mbps) | Gibibyte per second (GiB/s) |
|---|---|
| 1 | 0.00011641532182693481 |
| 10 | 0.00116415321826934814 |
| 25 | 0.00291038304567337036 |
| 50 | 0.00582076609134674072 |
| 100 | 0.01164153218269348145 |
| 300 | 0.03492459654808044434 |
| 1,000 | 0.11641532182693481445 |
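The table values all follow from a single constant factor: one megabit is 1,000,000 bits, a byte is 8 bits, and a gibibyte is 2^30 bytes. A minimal Python sketch:

```python
def mbps_to_gibps(mbps: float) -> float:
    # megabits -> bits (x 1e6), bits -> bytes (/ 8), bytes -> GiB (/ 2^30)
    return mbps * 1_000_000 / 8 / 2**30

print(mbps_to_gibps(1))    # ≈ 0.000116415
print(mbps_to_gibps(100))  # ≈ 0.011641532
```

The same function reproduces every row of the table above; only the multiplier changes.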
About Megabit per second (Mbps)
A megabit per second (Mbps) equals 1,000,000 bits per second and is the dominant unit for describing home and business broadband speeds worldwide. ISPs universally advertise in Mbps — "100 Mbps fiber" or "1 Gbps" plans. Because bytes are 8 bits, a 100 Mbps connection delivers a maximum of 12.5 MB/s in a download manager. Streaming services specify minimum Mbps requirements: HD video typically needs 5–10 Mbps; 4K streaming 25 Mbps or more.
A typical home broadband connection in a developed country runs at 50–300 Mbps. Netflix recommends 25 Mbps for 4K Ultra HD streaming.
About Gibibyte per second (GiB/s)
A gibibyte per second (GiB/s) equals 1,073,741,824 bytes per second and is used in high-performance storage and memory bandwidth measurements when binary precision is required. GPU memory bandwidth figures in technical documentation sometimes appear in GiB/s, though vendors typically publish decimal GB/s: NVIDIA lists the RTX 4090's GDDR6X bandwidth as 1,008 GB/s, about 939 GiB/s. NVMe SSD sequential read speeds are often reported as both GB/s (decimal) and GiB/s (binary) in reviews and datasheets.
The NVIDIA RTX 4090 GPU has 1,008 GB/s of memory bandwidth, roughly 939 GiB/s in binary units. DDR5-6400 dual-channel memory provides a theoretical peak of about 95 GiB/s (102.4 GB/s).
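The headline GPU number can be reproduced from the bus parameters. A sketch using the RTX 4090's published figures (21 Gbps per pin, 384-bit memory bus), then converting the decimal GB/s result to binary GiB/s:

```python
# Peak memory bandwidth = per-pin data rate x bus width / 8 bits per byte
data_rate_gbps = 21       # GDDR6X effective rate per pin (RTX 4090)
bus_width_bits = 384

gb_per_s = data_rate_gbps * bus_width_bits / 8   # decimal GB/s
gib_per_s = gb_per_s * 1e9 / 2**30               # binary GiB/s

print(gb_per_s, round(gib_per_s, 1))  # 1008.0 938.8
```

The ~7% gap between the two figures is exactly the ratio 10^9 / 2^30.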
Megabit per second – Frequently Asked Questions
Why does my 100 Mbps internet only download at 12 MB/s?
Because ISPs advertise in megabits (Mb) while download managers show megabytes (MB). There are 8 bits in a byte, so 100 Mbps ÷ 8 = 12.5 MB/s. Your connection is working perfectly — it is just a unit mismatch that has confused people for decades.
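In code, the fix is a single division. A hypothetical helper (both Mb and MB use the decimal mega prefix, so only the bit/byte factor matters):

```python
def mbps_to_MBps(mbps: float) -> float:
    # 8 bits per byte; decimal prefixes cancel out
    return mbps / 8

print(mbps_to_MBps(100))  # 12.5
```

Protocol overhead usually shaves another few percent off, which is why download managers show ~12 MB/s rather than exactly 12.5.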
How many Mbps do I need for streaming 4K video?
Netflix recommends 25 Mbps for 4K, YouTube suggests 20 Mbps, and Apple TV+ needs about 25 Mbps. In practice, 50 Mbps gives comfortable headroom for one 4K stream plus normal browsing. A household streaming on multiple devices simultaneously should aim for 100+ Mbps.
Why is my Wi-Fi speed lower than my wired Ethernet speed?
Wi-Fi shares bandwidth among all connected devices, loses throughput to interference from walls and other electronics, and uses half-duplex communication (it cannot send and receive simultaneously). A 300 Mbps Wi-Fi router might deliver 100–150 Mbps to a single device in practice, while Ethernet gives you the full rated speed.
What is the difference between download and upload Mbps?
Download Mbps measures data coming to you (streaming, browsing), while upload Mbps measures data you send (video calls, cloud backups). Most home connections are asymmetric — 100 Mbps down but only 10–20 Mbps up. Fiber-to-the-home plans increasingly offer symmetric speeds.
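To see what asymmetry means in practice, the sketch below (hypothetical `transfer_seconds` helper, decimal gigabytes assumed) compares a 5 GB cloud backup in both directions on a 100/10 Mbps plan:

```python
def transfer_seconds(size_gb: float, rate_mbps: float) -> float:
    """Seconds to move size_gb decimal gigabytes at rate_mbps megabits/s."""
    return size_gb * 8_000 / rate_mbps  # GB -> megabits: x 8,000

# Same 5 GB file, asymmetric 100/10 Mbps connection:
print(transfer_seconds(5, 100) / 60)  # download: ~6.7 minutes
print(transfer_seconds(5, 10) / 60)   # upload:   ~66.7 minutes
```

The tenfold rate gap translates directly into a tenfold wait, which is why upload speed matters so much for backups and video calls.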
How many Mbps does online gaming actually need?
Surprisingly little — most online games use only 1–3 Mbps of bandwidth. What gamers actually need is low latency (ping), not high throughput. A 10 Mbps connection with 15ms ping will outperform a 500 Mbps connection with 100ms ping for gaming every time.
Gibibyte per second – Frequently Asked Questions
Why do GPU specs sometimes use GiB/s instead of GB/s?
GPU memory buses come in binary-friendly widths (256-bit or 384-bit, built from 32-bit channels), so binary units map naturally onto the hardware. Some technical documents use GiB/s to be precise, while vendor spec sheets quote the larger-sounding decimal GB/s: the RTX 4090's 1,008 GB/s is only about 939 GiB/s in binary units.
How much GiB/s bandwidth does DDR5 RAM provide?
DDR5-6000 in dual-channel mode provides a theoretical peak of about 89 GiB/s (96 GB/s). Quad-channel DDR5 on workstation platforms doubles this to ~179 GiB/s. The actual usable bandwidth depends on memory access patterns: random access achieves far less than sequential streaming.
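The theoretical peak follows from transfers per second times channel width. A sketch assuming the standard 64-bit (8-byte) DDR channel:

```python
def ddr_bandwidth_gibps(mt_per_s: int, channels: int, bus_bytes: int = 8) -> float:
    """Theoretical peak DDR bandwidth in GiB/s for a 64-bit (8-byte) channel."""
    bytes_per_s = mt_per_s * 1_000_000 * bus_bytes * channels
    return bytes_per_s / 2**30

print(round(ddr_bandwidth_gibps(6000, 2), 1))  # dual-channel DDR5-6000
print(round(ddr_bandwidth_gibps(6000, 4), 1))  # quad-channel workstation
```

Real-world benchmarks land below these peaks because of refresh cycles, bank conflicts, and non-sequential access.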
What is the difference between memory bandwidth and storage bandwidth?
Memory bandwidth (50–100+ GiB/s for DDR5) measures how fast the CPU can read and write RAM. Storage bandwidth (3–13 GiB/s for current NVMe SSDs) measures persistent data transfer. Memory is roughly an order of magnitude faster in throughput, and its latency advantage is even larger: DRAM responds in nanoseconds while NAND flash takes microseconds. They serve different roles in the data hierarchy.
Can I measure GiB/s bandwidth on my own system?
Yes. For memory bandwidth, run a STREAM benchmark (available for Linux and Windows). For storage, use fio or CrystalDiskMark. GPU memory bandwidth can be tested with vendor tools such as the bandwidthTest sample shipped with the CUDA toolkit. Tools report in either GiB/s or GB/s; check which one before comparing numbers.
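For a very rough ballpark without installing anything, a naive single-threaded copy in Python gives a lower bound that a proper STREAM run will comfortably beat (a sketch, not a benchmark):

```python
import time

N = 256 * 1024 * 1024   # 256 MiB source buffer
src = bytearray(N)

start = time.perf_counter()
dst = bytes(src)         # one full pass: reads N bytes, writes N bytes
elapsed = time.perf_counter() - start

# Count both the read and the write traffic, as STREAM's copy kernel does
gib_per_s = (2 * N) / elapsed / 2**30
print(f"naive copy bandwidth: {gib_per_s:.1f} GiB/s")
```

Interpreter overhead and single-threaded execution mean the real hardware ceiling is considerably higher than whatever this prints.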
At what GiB/s does data transfer become limited by physics?
Electrical signalling on copper traces maxes out around 112 Gbps (about 13 GiB/s) per lane with current technology. Beyond that, optics take over — silicon photonics interconnects can push individual channels to 200+ Gbps. The physical speed of light in fiber is not the limit; it is the modulation and detection electronics.
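Converting those per-lane figures is the same bits-to-GiB arithmetic used throughout this page:

```python
def gbps_lane_to_gibps(gbps: float) -> float:
    # Gb/s -> bytes/s (/ 8, x 1e9), then bytes/s -> GiB/s (/ 2^30)
    return gbps * 1e9 / 8 / 2**30

print(round(gbps_lane_to_gibps(112), 1))  # ~13 GiB/s electrical lane
print(round(gbps_lane_to_gibps(200), 1))  # optical channel
```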