Megabyte per second to Byte per second
Quick Reference Table (Megabyte per second to Byte per second)
| Megabyte per second (MBps) | Byte per second (Bps) |
|---|---|
| 1 | 1,000,000 |
| 12.5 | 12,500,000 |
| 50 | 50,000,000 |
| 100 | 100,000,000 |
| 500 | 500,000,000 |
| 1,000 | 1,000,000,000 |
| 7,000 | 7,000,000,000 |
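Every row in the table is a single multiplication by 1,000,000, since the SI megabyte is 10^6 bytes. A minimal Python sketch (hypothetical helper name) that reproduces a few rows:

```python
MB = 1_000_000  # SI megabyte: 10^6 bytes

def mb_per_s_to_b_per_s(rate_mb_s: float) -> float:
    """Convert a rate in megabytes per second to bytes per second."""
    return rate_mb_s * MB

# Reproduce a few rows of the quick reference table above
for rate in (1, 12.5, 50, 7_000):
    print(f"{rate} MB/s = {mb_per_s_to_b_per_s(rate):,.0f} B/s")
# 1 MB/s = 1,000,000 B/s
# 12.5 MB/s = 12,500,000 B/s
# 50 MB/s = 50,000,000 B/s
# 7000 MB/s = 7,000,000,000 B/s
```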
About Megabyte per second (MBps)
A megabyte per second (MB/s or MBps) equals 8,000,000 bits per second and is the practical unit that most users encounter when watching a download progress bar. A 100 Mbps broadband connection downloads at up to 12.5 MB/s; a USB 3.0 drive transfers at 50–100 MB/s; an NVMe SSD reads at 3,000–7,000 MB/s. Understanding MB/s alongside Mbps resolves the common frustration of seeing a "1 Gbps" plan deliver "only" 125 MB/s — the two figures are consistent, not contradictory.
In practice, that same 100 Mbps broadband plan shows up to 12.5 MB/s in a download manager, and a USB 3.2 flash drive typically writes at 50–200 MB/s.
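The figures above come from a single division by 8 (8 bits per byte). A quick sketch, with a hypothetical helper name:

```python
def megabits_to_megabytes(rate_mbps: float) -> float:
    """Convert an advertised rate in megabits/s to megabytes/s (8 bits per byte)."""
    return rate_mbps / 8

print(megabits_to_megabytes(100))    # 12.5  -> what a "100 Mbps" plan shows in a download manager
print(megabits_to_megabytes(1_000))  # 125.0 -> why a "1 Gbps" plan tops out at 125 MB/s
```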
About Byte per second (Bps)
A byte per second (B/s or Bps) is the base byte-based unit of data transfer rate, equal to 8 bits per second. While ISPs advertise in bits per second, download managers, operating systems, and file transfer tools display speeds in bytes per second — a direct measure of how quickly usable file data arrives. The conversion between bits and bytes is constant: divide Mbps by 8 to get MB/s. At 1 B/s, transferring a 1 MB file would take about 11.5 days.
An old dial-up connection at 56 kbps delivered roughly 7,000 B/s (7 kB/s) of actual file data. USB 2.0's 480 Mbps ceiling works out to 60,000,000 B/s (60 MB/s), though real-world transfers land well below that.
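The 11.5-day figure above falls out of simple division; a sketch of the arithmetic, assuming a perfectly sustained rate:

```python
SECONDS_PER_DAY = 86_400

def transfer_days(file_bytes: float, rate_b_per_s: float) -> float:
    """Days needed to move file_bytes at a sustained rate in bytes per second."""
    return file_bytes / rate_b_per_s / SECONDS_PER_DAY

# 1 MB at 1 B/s: 1,000,000 seconds, matching the figure above
print(round(transfer_days(1_000_000, 1), 1))  # 11.6 days
# The same file over 56 kbps dial-up (about 7,000 B/s)
print(round(1_000_000 / 7_000))               # 143 seconds
```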
Megabyte per second – Frequently Asked Questions
Why does copying files to a USB drive slow down partway through?
Many USB drives use a small SLC cache for initial writes at high MB/s, then slow dramatically once the cache fills and data writes to slower TLC/QLC NAND. A drive that starts at 200 MB/s might drop to 20–30 MB/s after the first few gigabytes. Check sustained write speed reviews, not just peak numbers.
What MB/s do I need for video editing?
Editing 4K ProRes footage requires about 200–400 MB/s of sustained read speed, and 8K RAW can demand 1,000+ MB/s. A SATA SSD (550 MB/s) handles 4K fine, but 8K workflows really need NVMe drives at 3,000+ MB/s. How smoothly you can scrub the timeline correlates directly with sustained MB/s.
How do I tell if a speed test result is in MB/s or Mbps?
Look at the capitalization: lowercase "b" (Mbps) means megabits, uppercase "B" (MB/s) means megabytes. Most speed test websites (Speedtest by Ookla, fast.com) default to Mbps. If your result seems 8× lower than expected, you are probably reading MB/s where you expected Mbps.
What is the fastest consumer storage in MB/s as of 2026?
PCIe 5.0 NVMe SSDs hit 12,000–14,000 MB/s sequential read speeds. That is fast enough to load an entire 50 GB game in about 4 seconds. PCIe 6.0 drives, expected soon, will double this again to roughly 25,000 MB/s.
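The 4-second claim is plain arithmetic, assuming the drive sustains its sequential rating for the whole read:

```python
game_mb = 50 * 1_000    # 50 GB expressed in MB (SI units)
read_mb_per_s = 14_000  # upper-end PCIe 5.0 sequential read

print(round(game_mb / read_mb_per_s, 1))  # 3.6 seconds, i.e. "about 4 seconds"
```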
Why is network file sharing slower than local disk speed?
Network transfers add latency, protocol overhead (SMB, NFS), and are limited by the network link speed. A file on a local NVMe SSD reads at 7,000 MB/s, but sharing it over a 1 Gbps network caps throughput at 125 MB/s. Even 10 GbE only gives 1,250 MB/s — a fraction of modern SSD capability.
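In other words, throughput is capped by the slower of the disk and the link. A hedged sketch of that bottleneck rule (hypothetical function; Gbps to MB/s is a factor of 125):

```python
def effective_mb_per_s(disk_read_mb_s: float, link_gbps: float) -> float:
    """A network transfer runs at the slower of disk speed and link speed."""
    return min(disk_read_mb_s, link_gbps * 125)  # 1 Gbps = 125 MB/s

print(effective_mb_per_s(7_000, 1))   # 125.0  -> gigabit Ethernet is the bottleneck
print(effective_mb_per_s(7_000, 10))  # 1250.0 -> even 10 GbE can't keep up with NVMe
```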
Byte per second – Frequently Asked Questions
Why is a byte the fundamental unit of file storage but not of network speed?
Files are stored in bytes because CPUs address memory in byte-sized (8-bit) chunks — the smallest unit a program can read or write. Networks measure in bits because physical signals on a wire or fiber are serial: one bit at a time, clocked at a specific frequency. A 1 GHz signal produces 1 Gbps, not 1 GBps. The two worlds evolved independently and neither adopted the other's convention, leaving users to divide by 8 forever.
Is a byte always 8 bits?
In modern computing, yes — a byte is universally 8 bits. Historically, some architectures used 6, 7, or 9-bit bytes, which is why the unambiguous term "octet" exists in networking standards. But for all practical bandwidth conversions today, 1 byte = 8 bits.
Why is actual file download speed always less than the connection speed in bytes?
Network protocols add overhead — TCP headers, encryption (TLS), error correction, and packet framing all consume bandwidth without contributing to file data. A 100 Mbps connection might deliver 11 MB/s instead of the theoretical 12.5 MB/s because 10–15% goes to protocol overhead.
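The 11 MB/s figure follows directly once you pick an overhead fraction; a sketch with that fraction as a stated assumption:

```python
def usable_mb_per_s(link_mbps: float, overhead: float = 0.12) -> float:
    """Usable file throughput after protocol overhead (assumed 10-15%; 12% here)."""
    return link_mbps / 8 * (1 - overhead)

print(round(usable_mb_per_s(100), 1))  # 11.0 -> close to the 11 MB/s quoted above
```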
How many bytes per second does USB 3.0 actually transfer?
USB 3.0 signals at 5 Gbps, but its 8b/10b line encoding (10 line bits per data byte) caps usable bandwidth at 500 MB/s; the often-quoted 625 MB/s is simply 5 Gbps ÷ 8 before encoding. Real-world sustained transfers hit 300–400 MB/s due to protocol overhead and controller limitations. USB 3.2 Gen 2 signals at 10 Gbps with more efficient 128b/132b encoding and reaches about 700–900 MB/s in practice.
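The encoding overhead is easy to verify: 8b/10b sends each data byte as 10 line bits, while 128b/132b sends 128 data bits in 132 line bits. A quick check of the ceilings:

```python
# USB 3.0: 5 Gbps signaling, 8b/10b encoding (10 line bits per data byte)
usb30_ceiling = 5_000_000_000 / 10                # 500,000,000 B/s usable
# USB 3.2 Gen 2: 10 Gbps signaling, 128b/132b encoding
usb32g2_ceiling = 10_000_000_000 * 128 / 132 / 8  # ~1,212,000,000 B/s usable

print(f"{usb30_ceiling:,.0f} B/s, {usb32g2_ceiling:,.0f} B/s")
# 500,000,000 B/s, 1,212,121,212 B/s
```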
What came first — the bit or the byte?
The bit came first: John Tukey coined the term at Bell Labs in the 1940s, and Claude Shannon put it in print in his landmark 1948 paper, crediting Tukey. The byte was introduced at IBM in the mid-1950s by Werner Buchholz to describe the smallest addressable group of bits in the IBM Stretch computer. Originally it could be any size; the 8-bit byte became standard with the IBM System/360 in 1964.