Megabyte to Byte
Quick Reference Table (Megabyte to Byte)
| Megabyte (MB) | Byte (B) |
|---|---|
| 1 | 1,000,000 |
| 3 | 3,000,000 |
| 5 | 5,000,000 |
| 10 | 10,000,000 |
| 50 | 50,000,000 |
| 100 | 100,000,000 |
| 700 | 700,000,000 |
About Megabyte (MB)
A megabyte (MB) equals 1,000,000 bytes (10⁶ bytes) in the SI decimal system. It is the standard unit for file sizes in everyday computing: digital photos (2–8 MB), MP3 audio files (3–10 MB), and small software applications. Network data usage on mobile plans was once tracked in megabytes; today gigabytes are more common. A megabyte holds approximately one million characters of text — about 500 pages of an average novel. The binary equivalent, the mebibyte (MiB = 1,048,576 bytes), is used internally by operating systems and differs from the decimal MB by about 4.9%.
A typical JPEG photo from a smartphone is 3–6 MB. A 3-minute MP3 song at 128 kbps is about 2.8 MB. A 20-page Microsoft Word report with a few embedded images is roughly 1–2 MB; a text-only document is far smaller.
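The decimal conversion itself is a single multiplication by 10⁶. A minimal Python sketch (the helper name `mb_to_bytes` is just for illustration) that reproduces a few rows of the quick reference table above:

```python
def mb_to_bytes(megabytes: float) -> int:
    """Convert decimal (SI) megabytes to bytes: 1 MB = 10**6 B."""
    return int(megabytes * 1_000_000)

for mb in (1, 5, 700):
    print(f"{mb} MB = {mb_to_bytes(mb):,} B")
# 1 MB = 1,000,000 B
# 5 MB = 5,000,000 B
# 700 MB = 700,000,000 B
```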
About Byte (B)
A byte (B) is a unit of digital information equal to 8 bits and is the fundamental unit of memory addressing in virtually all modern computer architectures. Characters, integers, pixels, and audio samples are all expressed in bytes or multiples thereof. The byte is the minimum addressable storage unit in most CPUs — even a single boolean value occupies a full byte of RAM. All file sizes, RAM capacities, and storage device capacities are expressed in bytes or their multiples (kilobytes, megabytes, gigabytes). The byte is to data storage what the meter is to distance — the practical base unit from which all others scale.
One byte stores a single ASCII text character (the letter "A" = byte value 65). A typical English word averages 5 bytes including the space. A 1,000-word article takes about 5 kilobytes.
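These byte values are easy to inspect directly. A short Python sketch using the built-in `ord` and `str.encode` (the article estimate simply reuses the rough 5-bytes-per-word figure above):

```python
# One byte stores a single ASCII character; ord() gives its numeric value.
print(ord("A"))                      # 65

sentence = "The quick brown fox jumps over the lazy dog."
encoded = sentence.encode("ascii")   # plain ASCII text: 1 byte per character
print(len(sentence), len(encoded))   # 44 44

# Rough article size, assuming ~5 bytes per word (space included).
print(1_000 * 5)                     # 5000 bytes, i.e. about 5 kB
```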
Etymology: The term "byte" was coined by Werner Buchholz in 1956 at IBM during the design of the Stretch supercomputer. The deliberate misspelling (from "bite") was intended to prevent accidental abbreviation to "b", which was reserved for "bit".
Megabyte – Frequently Asked Questions
How many megabytes is a typical photo?
A JPEG photo from a modern smartphone is typically 3–8 MB depending on resolution and compression settings. A RAW format photo from a DSLR or mirrorless camera is 20–50 MB per shot. A PNG screenshot at full HD (1920×1080) is about 1–3 MB; a compressed JPEG screenshot may be under 200 kB.
How many megabytes does streaming video use?
Video data usage depends heavily on quality: SD video uses roughly 700 MB per hour; HD (1080p) uses 1.5–3 GB per hour; 4K uses 7–20 GB per hour. These are byte-based measurements. In terms of bitrate: SD ≈ 1.5 Mbps, HD ≈ 5–8 Mbps, 4K ≈ 15–25 Mbps — where the "b" is bits, requiring division by 8 to convert to MB/s.
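The bits-versus-bytes step is where most mistakes happen. A small sketch of the arithmetic, using the approximate bitrates quoted above:

```python
def mbps_to_mb_per_hour(megabits_per_second: float) -> float:
    """Convert a bitrate in Mbps (megabits per second) to MB per hour."""
    mb_per_second = megabits_per_second / 8   # 8 bits in a byte
    return mb_per_second * 3600               # 3600 seconds in an hour

for label, mbps in [("SD", 1.5), ("HD", 5), ("4K", 20)]:
    print(f"{label}: {mbps} Mbps ≈ {mbps_to_mb_per_hour(mbps):,.0f} MB/hour")
# SD: 1.5 Mbps ≈ 675 MB/hour
# HD: 5 Mbps ≈ 2,250 MB/hour
# 4K: 20 Mbps ≈ 9,000 MB/hour
```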
How does file compression work and what are typical compression ratios in MB?
Compression algorithms like ZIP, GZIP, and ZSTD find and eliminate redundancy in data. Typical ratios vary dramatically by file type: plain text compresses to 20–30% of original size (a 10 MB log file becomes 2–3 MB); source code compresses to 25–35%; office documents (DOCX, XLSX) are already ZIP-compressed internally, so re-compressing gains little. JPEG, MP3, and H.264 video are already lossy-compressed and typically shrink by less than 5% with ZIP. A 100 MB folder of mixed files typically compresses to 40–60 MB. The key principle: compression removes statistical redundancy, so already-compressed or random data cannot be reduced further.
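A quick way to see this principle is to compress redundant versus random data with Python's standard-library `zlib`. The sample data below is synthetic, so the exact ratios will differ from real files, but the contrast is the point:

```python
import os
import zlib

# Highly repetitive text (like a log file) compresses very well...
redundant = b"GET /index.html HTTP/1.1 200 OK\n" * 10_000
# ...while random (or already-compressed) data barely shrinks at all.
incompressible = os.urandom(len(redundant))

for name, data in [("repetitive text", redundant), ("random bytes", incompressible)]:
    packed = zlib.compress(data, level=9)
    print(f"{name}: {len(data):,} B -> {len(packed):,} B "
          f"({len(packed) / len(data):.0%} of original)")
```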
What is the difference between MB and MiB?
MB (megabyte) = 1,000,000 bytes (SI decimal). MiB (mebibyte) = 1,048,576 bytes (IEC binary). The difference is about 4.9%. Windows reports file and drive sizes using the binary calculation but labels the result "MB" and "GB", which is why a 500 GB drive appears as roughly 465 GB. macOS switched to SI decimal units in OS X 10.6 Snow Leopard (2009), matching the way hard drive manufacturers measure capacity.
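The 4.9% figure follows directly from the two definitions, and the same arithmetic explains the shrinking-drive effect:

```python
MB = 1_000_000       # SI decimal megabyte
MiB = 1_048_576      # IEC binary mebibyte (2**20)

print(f"{(MiB - MB) / MB:.1%}")          # 4.9%

# Why a "500 GB" drive is shown as about 465 GB by a binary-unit OS:
drive_bytes = 500 * 10**9
print(f"{drive_bytes / 2**30:.1f} GiB")  # 465.7 GiB (labelled "GB" by Windows)
```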
How many megabytes of mobile data do common apps use?
Approximate data consumption per hour: web browsing = 60–100 MB, social media scrolling = 100–300 MB, music streaming (Spotify standard) = 40–50 MB, video calls (Zoom standard quality) = 300–500 MB, YouTube HD = 1,500–3,000 MB. These are rough averages and vary by content, settings, and network conditions.
Byte – Frequently Asked Questions
How many bits are in a byte?
A byte contains exactly 8 bits. This is the universal modern standard, though early computing used variable byte sizes (5, 6, or 7 bits). The 8-bit byte became universal with the IBM System/360 in 1964. Eight bits allow 256 possible values (0–255), sufficient to encode all ASCII characters with room for control codes.
Why is a byte 8 bits and not some other number?
Eight bits became standard because it is the smallest power of two that can encode all 128 ASCII characters (7 bits) with a spare bit for parity checking or extended character sets. It also maps cleanly to two hexadecimal digits (0x00–0xFF), making it convenient for low-level programming and hardware design. Earlier systems used 6-bit or 7-bit bytes; 8-bit won due to IBM's dominance in the 1960s–70s.
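Both facts, 256 values per byte and two hex digits per byte, can be checked in one line each:

```python
print(2 ** 8)              # 256 distinct values per byte (0-255)
print(f"{255:#04x}")       # 0xff -> the highest byte value is two hex digits
print(f"{ord('A'):#04x}")  # 0x41 -> the ASCII letter "A" as a byte, in hex
```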
What is a nibble?
A nibble (also spelled nybble) is 4 bits — half a byte. A nibble represents exactly one hexadecimal digit (0–F). The term is used in low-level programming, embedded systems, and BCD (binary-coded decimal) encoding. It is not an SI unit and rarely appears in general computing contexts outside of hardware and systems programming.
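Splitting a byte into its two nibbles is a standard mask-and-shift operation in low-level code; a minimal sketch:

```python
value = 0xA7                      # one byte holds two nibbles: 0xA and 0x7

high_nibble = (value >> 4) & 0xF  # upper 4 bits
low_nibble = value & 0xF          # lower 4 bits

print(hex(high_nibble), hex(low_nibble))  # 0xa 0x7
```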
How many bytes does a single Unicode character use?
It depends on the character and encoding. In UTF-8 (the dominant web encoding): ASCII characters (A–Z, 0–9) use 1 byte; common European accented characters use 2 bytes; most Asian scripts (Chinese, Japanese, Korean) use 3 bytes; emoji and rare characters use 4 bytes. A plain English text file is efficiently encoded as 1 byte per character in UTF-8.
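The variable-width behaviour is easy to verify with `str.encode` in Python (the spaced hex output requires Python 3.8+):

```python
samples = ["A", "é", "中", "😀"]   # ASCII, accented Latin, CJK, emoji

for ch in samples:
    encoded = ch.encode("utf-8")
    print(f"{ch!r}: {len(encoded)} byte(s) -> {encoded.hex(' ')}")
# 'A': 1 byte(s) -> 41
# 'é': 2 byte(s) -> c3 a9
# '中': 3 byte(s) -> e4 b8 ad
# '😀': 4 byte(s) -> f0 9f 98 80
```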
What is the difference between byte and octet?
In most modern usage, byte and octet are synonymous — both mean 8 bits. "Octet" is preferred in networking standards (RFC documents, ITU specifications) to avoid ambiguity from early computing where byte sizes varied. Internet protocol headers are specified in octets; operating systems and storage devices use bytes. In practice you will encounter "octet" mainly in formal networking documentation.