Byte per second to Gigabit per second
Quick Reference Table (Byte per second to Gigabit per second)
| Byte per second (Bps) | Gigabit per second (Gbps) |
|---|---|
| 1 | 0.000000008 |
| 100 | 0.0000008 |
| 7,000 | 0.000056 |
| 125,000 | 0.001 |
| 1,000,000 | 0.008 |
| 12,500,000 | 0.1 |
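Every row in the table follows from the fixed 8-bit byte: multiply bytes per second by 8 to get bits, then divide by one billion to get gigabits. A minimal Python sketch (the function name `bps_to_gbps` is just illustrative):

```python
BITS_PER_BYTE = 8
BITS_PER_GIGABIT = 1_000_000_000

def bps_to_gbps(bytes_per_second: float) -> float:
    """Convert bytes per second to gigabits per second."""
    return bytes_per_second * BITS_PER_BYTE / BITS_PER_GIGABIT

# Reproduces the quick-reference rows above:
print(bps_to_gbps(125_000))     # 0.001
print(bps_to_gbps(12_500_000))  # 0.1
```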
About Byte per second (Bps)
A byte per second (B/s or Bps) is the base byte-based unit of data transfer rate, equal to 8 bits per second. While ISPs advertise in bits per second, download managers, operating systems, and file transfer tools display speeds in bytes per second — a direct measure of how quickly usable file data arrives. The conversion between bits and bytes is constant: divide Mbps by 8 to get MB/s. At 1 B/s, transferring a 1 MB file would take about 11.5 days.
An old dial-up connection at 56 kbps delivered roughly 7,000 B/s (7 kB/s) of actual file data. USB 2.0 maxes out at about 60,000,000 B/s (60 MB/s).
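The transfer-time figures above are straightforward to check: divide the file size in bytes by the rate in bytes per second. A quick sketch (the helper `transfer_seconds` is hypothetical):

```python
def transfer_seconds(file_bytes: int, rate_bps: float) -> float:
    """Ideal time to move a file at a given byte-per-second rate."""
    return file_bytes / rate_bps

print(transfer_seconds(1_000_000, 1) / 86_400)  # ~11.57 days: 1 MB at 1 B/s
print(transfer_seconds(1_000_000, 7_000))       # ~143 s: 1 MB over 56k dial-up
```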
About Gigabit per second (Gbps)
A gigabit per second (Gbps) equals 1,000 Mbps and represents the current frontier of consumer and enterprise networking. Gigabit fiber broadband (1 Gbps) is now available to millions of homes in the US, South Korea, Japan, and parts of Europe. Data center interconnects, server network cards, and backbone routers operate at 10, 25, 40, or 100 Gbps. At 1 Gbps, a full HD film (8 GB) downloads in about 64 seconds; at 10 Gbps it takes under 7 seconds.
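The download times quoted above can be derived in one line: convert the file size to gigabits (multiply gigabytes by 8), then divide by the link rate in Gbps. A sketch, with an illustrative function name:

```python
def download_seconds(file_gigabytes: float, link_gbps: float) -> float:
    """Ideal transfer time: file size in gigabits divided by link rate."""
    return file_gigabytes * 8 / link_gbps

print(download_seconds(8, 1))   # 64.0 — full HD film on gigabit fiber
print(download_seconds(8, 10))  # 6.4  — the same film at 10 Gbps
```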
A 1 Gbps fiber broadband connection delivers up to 125 MB/s download speed. A modern NVMe SSD reads data at 3–7 Gbps internally.
Byte per second – Frequently Asked Questions
Why is a byte the fundamental unit of file storage but not of network speed?
Files are stored in bytes because CPUs address memory in byte-sized (8-bit) chunks — the smallest unit a program can read or write. Networks measure in bits because physical signals on a wire or fiber are serial: one bit at a time, clocked at a specific frequency. A 1 GHz signal produces 1 Gbps, not 1 GBps. The two worlds evolved independently and neither adopted the other's convention, leaving users to divide by 8 forever.
Is a byte always 8 bits?
In modern computing, yes — a byte is universally 8 bits. Historically, some architectures used 6-, 7-, or 9-bit bytes, which is why the unambiguous term "octet" exists in networking standards. But for all practical bandwidth conversions today, 1 byte = 8 bits.
Why is actual file download speed always less than the connection speed in bytes?
Network protocols add overhead — TCP headers, encryption (TLS), error correction, and packet framing all consume bandwidth without contributing to file data. A 100 Mbps connection might deliver 11 MB/s instead of the theoretical 12.5 MB/s because 10–15% goes to protocol overhead.
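The 11 MB/s figure follows from applying an overhead fraction before the divide-by-8 step. A sketch assuming a flat 12% overhead (real overhead varies by protocol; the function name is illustrative):

```python
def effective_mb_per_s(link_mbps: float, overhead_fraction: float = 0.12) -> float:
    """Goodput in MB/s after protocol overhead (headers, TLS, framing)."""
    return link_mbps * (1 - overhead_fraction) / 8

print(effective_mb_per_s(100))       # 11.0 — vs the 12.5 MB/s theoretical
print(effective_mb_per_s(100, 0.0))  # 12.5 — zero-overhead ideal
```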
How many bytes per second does USB 3.0 actually transfer?
USB 3.0 has a theoretical maximum of 625 MB/s (5 Gbps ÷ 8), but real-world sustained transfers hit 300–400 MB/s due to protocol overhead and controller limitations. USB 3.2 Gen 2 doubles this to about 700–900 MB/s in practice.
What came first — the bit or the byte?
The bit came first, coined by Claude Shannon in 1948. The byte was introduced at IBM in the mid-1950s by Werner Buchholz to describe the smallest addressable group of bits in the IBM Stretch computer. Originally it could be any size; the 8-bit byte became standard with the IBM System/360 in 1964.
Gigabit per second – Frequently Asked Questions
Do I actually need gigabit internet at home?
For most households, no. A family of four streaming 4K, gaming, and video-calling simultaneously uses about 100–150 Mbps. Gigabit becomes worthwhile if you regularly transfer large files, run a home server, or have 15+ connected devices all active at once. The real benefit is future-proofing.
What is the difference between dedicated and shared bandwidth in fiber plans?
Dedicated bandwidth means your 1 Gbps line is yours alone — common in business fiber (leased lines). Residential fiber is shared: a 10 Gbps trunk splits across 32–128 homes via a passive optical splitter (GPON). During peak evening hours, your "gigabit" plan might deliver 300–600 Mbps because neighbors are all streaming. This is why business fiber costs 5–10× more for the same headline speed — you are paying for a guarantee, not just capacity.
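The peak-hour numbers can be sketched as a simple fair-share calculation. This is a rough model, not how a GPON scheduler actually allocates timeslots, and the split ratio and activity fraction below are illustrative assumptions:

```python
def per_home_mbps(trunk_gbps: float, homes: int, active_fraction: float) -> float:
    """Fair-share bandwidth per actively downloading home on a shared trunk."""
    active = max(1, round(homes * active_fraction))  # at least one active home
    return trunk_gbps * 1000 / active

# 10 Gbps GPON trunk split across 64 homes:
print(per_home_mbps(10, 64, 0.25))  # 625.0 Mbps when a quarter are busy
print(per_home_mbps(10, 64, 1.0))   # 156.25 Mbps at full contention
```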
What is the fastest internet speed available to consumers?
As of 2026, several ISPs offer 10 Gbps residential plans in select cities — Google Fiber, AT&T, and some European providers. South Korea and Japan have had multi-gigabit home connections since the early 2020s. The bottleneck is usually the home network equipment, not the ISP connection.
How does a data center use 100 Gbps connections?
Data centers connect racks of servers with 25–100 Gbps links to handle millions of simultaneous user requests. A single popular website might serve hundreds of Gbps of traffic during peak hours. Spine-leaf network architectures aggregate these links to provide non-blocking Tbps-class switching capacity.
Can my hard drive even write fast enough to use gigabit internet?
A traditional spinning hard drive writes at about 1–1.5 Gbps (125–180 MB/s), so it can just barely keep up with a 1 Gbps connection. An NVMe SSD at 3–7 Gbps handles it easily. If you have gigabit internet but an old HDD, your disk is the bottleneck, not your connection.
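The bottleneck logic above reduces to taking the minimum of the two rates once they are in the same unit. A minimal sketch (function name and sample rates are illustrative):

```python
def bottleneck_mb_per_s(link_gbps: float, disk_mb_per_s: float) -> float:
    """Sustained download speed is limited by the slower of link and disk."""
    link_mb_per_s = link_gbps * 1000 / 8
    return min(link_mb_per_s, disk_mb_per_s)

print(bottleneck_mb_per_s(1, 100))   # 100.0 — an older HDD caps a gigabit link
print(bottleneck_mb_per_s(1, 150))   # 125.0 — here the link is the limit
print(bottleneck_mb_per_s(10, 500))  # 500.0 — even an SSD can cap 10 Gbps
```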