Megabit per second to Byte per second

1 Mbps = 125,000 Bps



Quick Reference Table (Megabit per second to Byte per second)

Megabit per second (Mbps) | Byte per second (Bps)
1     | 125,000
10    | 1,250,000
25    | 3,125,000
50    | 6,250,000
100   | 12,500,000
300   | 37,500,000
1,000 | 125,000,000
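The table follows directly from the definitions (1 Mbps = 1,000,000 bits/s, 1 byte = 8 bits). A minimal sketch that reproduces every row:

```python
def mbps_to_bps(mbps: float) -> float:
    """Convert megabits per second to bytes per second.

    1 Mbps = 1,000,000 bits/s, and 1 byte = 8 bits,
    so 1 Mbps = 1,000,000 / 8 = 125,000 B/s.
    """
    return mbps * 1_000_000 / 8

# Reproduce the quick-reference table above.
for mbps in (1, 10, 25, 50, 100, 300, 1_000):
    print(f"{mbps:>5} Mbps = {mbps_to_bps(mbps):>12,.0f} B/s")
```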

About Megabit per second (Mbps)

A megabit per second (Mbps) equals 1,000,000 bits per second and is the dominant unit for describing home and business broadband speeds worldwide. ISPs universally advertise in Mbps — "100 Mbps fiber" or "1 Gbps" plans. Because bytes are 8 bits, a 100 Mbps connection delivers a maximum of 12.5 MB/s in a download manager. Streaming services specify minimum Mbps requirements: HD video typically needs 5–10 Mbps; 4K streaming 25 Mbps or more.

A typical home broadband connection in a developed country runs at 50–300 Mbps. Netflix recommends 25 Mbps for 4K Ultra HD streaming.

About Byte per second (Bps)

A byte per second (B/s or Bps) is the base byte-based unit of data transfer rate, equal to 8 bits per second. While ISPs advertise in bits per second, download managers, operating systems, and file transfer tools display speeds in bytes per second — a direct measure of how quickly usable file data arrives. The conversion between bits and bytes is constant: divide Mbps by 8 to get MB/s. At 1 B/s, transferring a 1 MB file would take about 11.5 days.

An old dial-up connection at 56 kbps delivered roughly 7,000 B/s (7 kB/s) of actual file data. USB 2.0 maxes out at about 60,000,000 B/s (60 MB/s).
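The figures above are easy to verify with a few lines of arithmetic (the file size, rate, and dial-up speed are the ones quoted in the text):

```python
def transfer_seconds(file_bytes: int, bytes_per_second: float) -> float:
    """Seconds needed to move file_bytes at a steady bytes_per_second."""
    return file_bytes / bytes_per_second

# A 1 MB (1,000,000-byte) file at 1 B/s: one million seconds.
days = transfer_seconds(1_000_000, 1) / 86_400  # 86,400 seconds per day

# 56 kbps dial-up: 56,000 bits/s / 8 bits per byte = 7,000 B/s.
dialup_bps = 56_000 / 8

print(f"1 MB at 1 B/s takes about {days:.1f} days")
print(f"56 kbps dial-up carries {dialup_bps:,.0f} B/s")
```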


Megabit per second – Frequently Asked Questions

Why does my 100 Mbps connection only download at 12.5 MB/s?

Because ISPs advertise in megabits (Mb) while download managers show megabytes (MB). There are 8 bits in a byte, so 100 Mbps ÷ 8 = 12.5 MB/s. Your connection is working perfectly — it is just a unit mismatch that has confused people for decades.

How many Mbps do I need to stream 4K video?

Netflix recommends 25 Mbps for 4K, YouTube suggests 20 Mbps, and Apple TV+ needs about 25 Mbps. In practice, 50 Mbps gives comfortable headroom for one 4K stream plus normal browsing. A household streaming on multiple devices simultaneously should aim for 100+ Mbps.

Why is Wi-Fi slower than a wired Ethernet connection?

Wi-Fi shares bandwidth among all connected devices, loses throughput to interference from walls and other electronics, and uses half-duplex communication (it cannot send and receive simultaneously). A 300 Mbps Wi-Fi router might deliver 100–150 Mbps to a single device in practice, while Ethernet gives you the full rated speed.

What is the difference between download and upload speed?

Download Mbps measures data coming to you (streaming, browsing), while upload Mbps measures data you send (video calls, cloud backups). Most home connections are asymmetric — 100 Mbps down but only 10–20 Mbps up. Fiber-to-the-home plans increasingly offer symmetric speeds.

How much bandwidth does online gaming need?

Surprisingly little — most online games use only 1–3 Mbps of bandwidth. What gamers actually need is low latency (ping), not high throughput. A 10 Mbps connection with 15 ms ping will outperform a 500 Mbps connection with 100 ms ping for gaming every time.

Byte per second – Frequently Asked Questions

Why are files measured in bytes but network speeds in bits?

Files are stored in bytes because CPUs address memory in byte-sized (8-bit) chunks — the smallest unit a program can read or write. Networks measure in bits because physical signals on a wire or fiber are serial: one bit at a time, clocked at a specific frequency. A 1 GHz signal produces 1 Gbps, not 1 GBps. The two worlds evolved independently and neither adopted the other's convention, leaving users to divide by 8 forever.

Is a byte always 8 bits?

In modern computing, yes — a byte is universally 8 bits. Historically, some architectures used 6-, 7-, or 9-bit bytes, which is why the unambiguous term "octet" exists in networking standards. But for all practical bandwidth conversions today, 1 byte = 8 bits.

Why is my actual download speed below the theoretical maximum?

Network protocols add overhead — TCP headers, encryption (TLS), error correction, and packet framing all consume bandwidth without contributing to file data. A 100 Mbps connection might deliver 11 MB/s instead of the theoretical 12.5 MB/s because 10–15% goes to protocol overhead.
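This overhead-adjusted estimate can be sketched as a small helper. The 12% default here is an illustrative value picked from the middle of the 10–15% range quoted above, not a measured constant:

```python
def effective_mb_per_s(link_mbps: float, overhead: float = 0.12) -> float:
    """Estimate usable file throughput (MB/s) for a link rated in Mbps.

    `overhead` is the fraction of raw bandwidth consumed by protocol
    headers, TLS, and framing. 0.12 is an illustrative assumption from
    the 10-15% range, not a protocol-specified figure.
    """
    return link_mbps / 8 * (1 - overhead)

print(f"{effective_mb_per_s(100):.1f} MB/s on a 100 Mbps link")
```

Real overhead varies with packet size, encryption, and link type, so treat the result as a ballpark, not a guarantee.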

How fast is USB in bytes per second?

USB 3.0 has a theoretical maximum of 625 MB/s (5 Gbps ÷ 8), but real-world sustained transfers hit 300–400 MB/s due to protocol overhead and controller limitations. USB 3.2 Gen 2 doubles the link rate to 10 Gbps (1,250 MB/s theoretical), delivering about 700–900 MB/s in practice.

Which came first, the bit or the byte?

The bit came first: the term was coined by John W. Tukey and first appeared in print in Claude Shannon's 1948 paper "A Mathematical Theory of Communication". The byte was introduced at IBM in the mid-1950s by Werner Buchholz to describe the smallest addressable group of bits in the IBM Stretch computer. Originally it could be any size; the 8-bit byte became standard with the IBM System/360 in 1964.

© 2026 TopConverters.com. All rights reserved.