Byte per second to Bit per second

1 Bps = 8 bps



Quick Reference Table (Byte per second to Bit per second)

Byte per second (Bps)    Bit per second (bps)
1                        8
100                      800
7,000                    56,000
125,000                  1,000,000
1,000,000                8,000,000
12,500,000               100,000,000
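
The table above applies a single constant factor in both directions. A minimal sketch (function names are illustrative) that reproduces it:

```python
def bps_to_bit_per_s(bytes_per_second: float) -> float:
    """Convert bytes per second (Bps) to bits per second (bps)."""
    return bytes_per_second * 8

def bit_per_s_to_bps(bits_per_second: float) -> float:
    """Convert bits per second (bps) to bytes per second (Bps)."""
    return bits_per_second / 8

# Reproduce the quick-reference rows
for byte_rate in (1, 100, 7_000, 125_000, 1_000_000, 12_500_000):
    print(f"{byte_rate:>12,} Bps = {bps_to_bit_per_s(byte_rate):>13,.0f} bps")
```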

About Byte per second (Bps)

A byte per second (B/s or Bps) is the base byte-based unit of data transfer rate, equal to 8 bits per second. While ISPs advertise in bits per second, download managers, operating systems, and file transfer tools display speeds in bytes per second — a direct measure of how quickly usable file data arrives. The conversion between bits and bytes is constant: divide Mbps by 8 to get MB/s. At 1 B/s, transferring a 1 MB file would take about 11.5 days.
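
Both claims in the paragraph above (divide Mbps by 8, and the 1 MB transfer time at 1 B/s) can be checked with a short sketch; the helper names are illustrative:

```python
MB = 1_000_000  # decimal megabyte, in bytes

def transfer_seconds(size_bytes: float, rate_bytes_per_s: float) -> float:
    """Seconds needed to move size_bytes at a sustained byte rate."""
    return size_bytes / rate_bytes_per_s

def mbps_to_mb_per_s(mbps: float) -> float:
    """Advertised megabits per second -> usable megabytes per second."""
    return mbps / 8

days = transfer_seconds(1 * MB, 1) / 86_400  # 86,400 seconds per day
print(f"1 MB at 1 B/s: {days:.1f} days")      # ~11.6 days
print(f"100 Mbps = {mbps_to_mb_per_s(100)} MB/s")
```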

An old dial-up connection at 56 kbps delivered roughly 7,000 B/s (7 kB/s) of actual file data. USB 2.0 maxes out at about 60,000,000 B/s (60 MB/s).

About Bit per second (bps)

A bit per second (bps) is the base unit of data transfer rate, representing one binary digit transmitted every second. It is the foundation from which all larger bandwidth units are built. In practice, raw bps figures are useful only for extremely low-speed links — early telegraph systems, narrowband IoT sensors, and some serial control lines operate at tens to thousands of bps. Modern connections are described in kbps, Mbps, or Gbps, making raw bps a reference unit rather than a practical measurement for everyday networking.

Early Morse code telegraph lines transmitted at roughly 10–50 bps. Modern IoT sensors on LoRaWAN networks communicate at 250–50,000 bps.


Byte per second – Frequently Asked Questions

Why are file sizes measured in bytes but network speeds in bits?

Files are stored in bytes because CPUs address memory in byte-sized (8-bit) chunks — the smallest unit a program can read or write. Networks measure in bits because physical signals on a wire or fiber are serial: one bit at a time, clocked at a specific frequency. A 1 GHz signal produces 1 Gbps, not 1 GBps. The two worlds evolved independently and neither adopted the other's convention, leaving users to divide by 8 forever.

Is a byte always 8 bits?

In modern computing, yes — a byte is universally 8 bits. Historically, some architectures used 6-, 7-, or 9-bit bytes, which is why the unambiguous term "octet" exists in networking standards. But for all practical bandwidth conversions today, 1 byte = 8 bits.

Why is my real download speed slower than my advertised bandwidth?

Network protocols add overhead — TCP headers, encryption (TLS), error correction, and packet framing all consume bandwidth without contributing to file data. A 100 Mbps connection might deliver 11 MB/s instead of the theoretical 12.5 MB/s because 10–15% goes to protocol overhead.
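
The 100 Mbps example can be sketched with a flat overhead fraction — an assumption, since real overhead varies with packet size and protocol:

```python
def effective_mb_per_s(link_mbps: float, overhead_fraction: float) -> float:
    """Estimate usable file throughput after protocol overhead.

    overhead_fraction is an assumed flat fraction (typically 0.10-0.15
    for TCP/TLS framing); real overhead depends on packet size, MTU,
    and the protocols in use.
    """
    usable_mbps = link_mbps * (1 - overhead_fraction)
    return usable_mbps / 8  # bits -> bytes

print(effective_mb_per_s(100, 0.0))   # theoretical ceiling: 12.5 MB/s
print(effective_mb_per_s(100, 0.12))  # ~11 MB/s with 12% overhead
```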

How fast is USB 3.0 in bytes per second?

USB 3.0 has a theoretical maximum of 625 MB/s (5 Gbps ÷ 8), but real-world sustained transfers hit 300–400 MB/s due to protocol overhead and controller limitations. USB 3.2 Gen 2 doubles the signaling rate to 10 Gbps, which works out to about 700–900 MB/s in practice.

Which came first: the bit or the byte?

The bit came first: the term was coined by John Tukey and popularized by Claude Shannon in 1948. The byte was introduced at IBM in the mid-1950s by Werner Buchholz to describe the smallest addressable group of bits in the IBM Stretch computer. Originally it could be any size; the 8-bit byte became standard with the IBM System/360 in 1964.

Bit per second – Frequently Asked Questions

Why is bps the smallest unit of data transfer?

A bit represents a single binary choice — 0 or 1 — which is the fundamental quantum of digital information. Every larger unit (byte, kilobit, megabit) is just a multiple of bits. You cannot meaningfully subdivide a binary digit, so bps is the floor of data rate measurement.

What still operates at raw bps speeds today?

LoRaWAN IoT sensors, some RFID readers, and legacy serial ports (RS-232 at 300–9600 baud) still deal in raw bps ranges. Satellites communicating with deep-space probes also use very low bps — NASA's Voyager 1 transmits at about 160 bps from interstellar space.

Is baud the same as bps?

Not exactly. Baud measures symbol changes per second, while bps measures bits per second. If each symbol encodes one bit, they are equal. But modern modems encode multiple bits per symbol — a 2400-baud modem using 16-QAM transmits 9600 bps because each symbol carries 4 bits.
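
The baud-to-bps relationship is a simple multiplication by bits per symbol; a minimal sketch (function name is illustrative):

```python
import math

def bps_from_baud(baud: int, constellation_points: int) -> int:
    """Bit rate = symbol rate x bits per symbol.

    A constellation of M points carries log2(M) bits per symbol
    (e.g. 16-QAM has 16 points -> 4 bits per symbol).
    """
    bits_per_symbol = int(math.log2(constellation_points))
    return baud * bits_per_symbol

print(bps_from_baud(2400, 16))  # 2400 baud x 4 bits = 9600 bps
print(bps_from_baud(2400, 2))   # one bit per symbol: baud equals bps
```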

How much information does human speech carry in bps?

Research suggests human speech carries about 39 bits per second of actual information content, regardless of language. Italian speakers talk faster but convey less information per syllable than Japanese speakers, balancing out to roughly the same bps across all studied languages.

Why were dial-up modems limited to 56 kbps?

The 56 kbps limit came from the Shannon–Hartley theorem applied to analogue phone lines. The roughly 3.1 kHz bandwidth of a voice telephone channel, combined with its signal-to-noise ratio, creates a theoretical ceiling near 56 kbps. FCC power regulations further capped actual downstream speeds at 53.3 kbps.
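
The Shannon–Hartley ceiling mentioned above is C = B · log2(1 + S/N). A sketch for a ~3.1 kHz voice channel, with the SNR values chosen purely for illustration:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A voice telephone channel is about 3.1 kHz wide; SNRs are illustrative.
for snr_db in (40, 45, 50):
    snr = 10 ** (snr_db / 10)  # convert decibels to a linear ratio
    capacity = shannon_capacity_bps(3100, snr)
    print(f"SNR {snr_db} dB -> capacity ~{capacity / 1000:.1f} kbps")
```

Capacity grows only logarithmically with SNR, which is why modem speeds stalled in the tens of kbps on analogue lines.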

© 2026 TopConverters.com. All rights reserved.