Kilobit per second to Byte per second
Quick Reference Table (Kilobit per second to Byte per second)
| Kilobit per second (Kbps) | Byte per second (Bps) |
|---|---|
| 1 | 125 |
| 28 | 3,500 |
| 56 | 7,000 |
| 128 | 16,000 |
| 256 | 32,000 |
| 512 | 64,000 |
| 1,000 | 125,000 |
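Every row of the table follows one rule: multiply kilobits per second by 1,000 to get bits per second, then divide by 8 to get bytes per second. A minimal sketch in Python (the function name is my own):

```python
def kbps_to_bps(kbps: float) -> float:
    """Convert kilobits per second (SI: 1 kb = 1,000 bits) to bytes per second."""
    return kbps * 1000 / 8

# Reproduce the quick reference table.
for kbps in (1, 28, 56, 128, 256, 512, 1000):
    print(f"{kbps:>5} Kbps = {kbps_to_bps(kbps):>9,.0f} Bps")
```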
About Kilobit per second (Kbps)
A kilobit per second (kbps or kb/s) equals 1,000 bits per second in the SI decimal system. It was the standard unit for dial-up modem speeds throughout the 1990s — 28.8 kbps and 56 kbps modems defined home internet access for a generation. Today kbps persists in audio codec specifications: MP3 files are typically encoded at 128–320 kbps, and voice calls over IP use 8–64 kbps codecs. DSL connections still quote upstream speeds in the low hundreds of kbps for basic plans.
A 56 kbps dial-up modem could transfer about 7 kB per second — downloading a 1 MB image took around two minutes. An MP3 at 128 kbps uses 1 MB per minute of audio.
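The transfer times above follow directly from the conversion; a quick sketch of the arithmetic (the helper name is my own):

```python
def download_seconds(file_bytes: float, kbps: float) -> float:
    """Seconds to transfer file_bytes over a link rated in kilobits per second."""
    bytes_per_second = kbps * 1000 / 8
    return file_bytes / bytes_per_second

# A 1 MB (1,000,000 byte) image over a 56 kbps modem:
print(download_seconds(1_000_000, 56))  # ~143 seconds, a bit over two minutes
```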
About Byte per second (Bps)
A byte per second (B/s or Bps) is the base byte-based unit of data transfer rate, equal to 8 bits per second. While ISPs advertise in bits per second, download managers, operating systems, and file transfer tools display speeds in bytes per second — a direct measure of how quickly usable file data arrives. The conversion between bits and bytes is constant: divide Mbps by 8 to get MB/s. At 1 B/s, transferring a 1 MB file would take about 11.5 days.
An old dial-up connection at 56 kbps delivered roughly 7,000 B/s (7 kB/s) of actual file data. USB 2.0 maxes out at about 60,000,000 B/s (60 MB/s).
Kilobit per second – Frequently Asked Questions
Why are MP3 bitrates measured in kbps?
Audio codecs compress sound into a stream of bits played back in real time, so the natural unit is bits per second. At 128 kbps, an MP3 encoder allocates 128,000 bits to represent each second of audio. Higher kbps means more data per second, better quality, and larger files.
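Since the bitrate is constant, file size is just bitrate times duration. A sketch under SI assumptions (the function name is my own; real files add a little for ID3 tags and framing):

```python
def mp3_size_mb(bitrate_kbps: float, seconds: float) -> float:
    """Approximate MP3 file size in megabytes: bits/sec x duration / 8 / 1e6."""
    return bitrate_kbps * 1000 * seconds / 8 / 1_000_000

# A 4-minute track at two common bitrates:
print(mp3_size_mb(128, 240))  # 3.84 MB
print(mp3_size_mb(320, 240))  # 9.6 MB
```

This also confirms the rule of thumb earlier in the article: 128 kbps works out to about 1 MB per minute of audio.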
Can you still use a 56 kbps dial-up connection in 2026?
Technically yes — dial-up ISPs like NetZero still exist in the US, and some rural areas with no broadband rely on them. But at 56 kbps, loading a modern webpage (average 2.5 MB) would take roughly six minutes. It is functionally unusable for anything beyond basic email.
What is the difference between 128 kbps and 320 kbps MP3?
At 128 kbps, the encoder discards more audio detail — cymbals sound washy, stereo imaging narrows, and quiet passages lose nuance. At 320 kbps, most listeners cannot distinguish the MP3 from the original CD in blind tests. The file is 2.5× larger but audibly transparent to most ears.
How many kbps does a phone call use?
A standard VoIP call uses 8–64 kbps depending on the codec. The widely used Opus codec delivers excellent voice quality at 16–32 kbps. Traditional landline phone calls used 64 kbps (G.711 codec). HD Voice on modern smartphones uses about 32 kbps with the AMR-WB codec.
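Those codec bitrates translate directly into data usage per call; a sketch using the figures above (the function name is my own, and IP/RTP packet overhead is deliberately excluded):

```python
def call_data_mb(codec_kbps: float, minutes: float) -> float:
    """Approximate one-way voice data in MB for a call (codec payload only)."""
    return codec_kbps * 1000 * minutes * 60 / 8 / 1_000_000

# One hour of Opus at 24 kbps vs. uncompressed G.711 at 64 kbps:
print(call_data_mb(24, 60))  # 10.8 MB
print(call_data_mb(64, 60))  # 28.8 MB
```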
Why did dial-up internet make that screeching noise?
The screeching was the modem handshake — two modems negotiating their connection speed by exchanging test tones over the phone line. Each phase of the screech tested different frequencies and protocols. The modems were literally talking to each other in audio, finding the fastest kbps rate the line could support.
Byte per second – Frequently Asked Questions
Why is a byte the fundamental unit of file storage but not of network speed?
Files are stored in bytes because CPUs address memory in byte-sized (8-bit) chunks — the smallest unit a program can read or write. Networks measure in bits because physical signals on a wire or fiber are serial: one bit at a time, clocked at a specific frequency. A 1 GHz signal produces 1 Gbps, not 1 GBps. The two worlds evolved independently and neither adopted the other's convention, leaving users to divide by 8 forever.
Is a byte always 8 bits?
In modern computing, yes — a byte is universally 8 bits. Historically, some architectures used 6, 7, or 9-bit bytes, which is why the unambiguous term "octet" exists in networking standards. But for all practical bandwidth conversions today, 1 byte = 8 bits.
Why is actual file download speed always less than the connection speed in bytes?
Network protocols add overhead — TCP headers, encryption (TLS), error correction, and packet framing all consume bandwidth without contributing to file data. A 100 Mbps connection might deliver 11 MB/s instead of the theoretical 12.5 MB/s because 10–15% goes to protocol overhead.
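The gap between advertised and delivered speed can be sketched as follows (the overhead fraction is an assumed parameter, not a protocol constant):

```python
def effective_mb_per_s(link_mbps: float, overhead_fraction: float = 0.12) -> float:
    """Usable file throughput in MB/s after an assumed protocol overhead fraction."""
    return link_mbps / 8 * (1 - overhead_fraction)

print(effective_mb_per_s(100))       # ~11 MB/s, matching the example above
print(effective_mb_per_s(100, 0.0))  # 12.5 MB/s theoretical ceiling
```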
How many bytes per second does USB 3.0 actually transfer?
USB 3.0 has a theoretical maximum of 625 MB/s (5 Gbps ÷ 8), but real-world sustained transfers hit 300–400 MB/s due to protocol overhead and controller limitations. USB 3.2 Gen 2 doubles this to about 700–900 MB/s in practice.
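The difference between the theoretical ceiling and a sustained rate matters for transfer-time estimates; a sketch using the figures above (SI units, function name is my own):

```python
def transfer_seconds(file_gb: float, mb_per_s: float) -> float:
    """Seconds to move a file of file_gb gigabytes at a given MB/s rate (SI units)."""
    return file_gb * 1000 / mb_per_s

# Copying a 50 GB file over USB 3.0:
print(transfer_seconds(50, 625))  # 80 s at the theoretical 5 Gbps ceiling
print(transfer_seconds(50, 350))  # ~143 s at a realistic sustained rate
```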
What came first — the bit or the byte?
The bit came first, coined by Claude Shannon in 1948. The byte was introduced at IBM in the mid-1950s by Werner Buchholz to describe the smallest addressable group of bits in the IBM Stretch computer. Originally it could be any size; the 8-bit byte became standard with the IBM System/360 in 1964.