Coulomb per second to EMU of current
Quick Reference Table (Coulomb per second to EMU of current)
| Coulomb per second (C/s) | EMU of current (EMU) |
|---|---|
| 0.1 | 0.01 |
| 1 | 0.1 |
| 5 | 0.5 |
| 10 | 1 |
| 20 | 2 |
| 100 | 10 |
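The table above follows a single fixed factor: 1 EMU of current equals exactly 10 A, and 1 C/s equals 1 A. A minimal sketch of the conversion in both directions (function names are illustrative):

```python
def cps_to_emu(cps: float) -> float:
    """Convert coulombs per second (numerically equal to amperes)
    to EMU of current. Since 1 EMU = 10 A, divide by 10."""
    return cps / 10.0

def emu_to_cps(emu: float) -> float:
    """Convert EMU of current back to coulombs per second."""
    return emu * 10.0
```

For example, `cps_to_emu(20)` reproduces the table row 20 C/s → 2 EMU.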
About Coulomb per second (C/s)
The coulomb per second (C/s) is a derived SI expression for electric current that makes the physical definition explicit: one ampere is exactly one coulomb of charge passing a point per second. The relationship I = Q/t links current (A), charge (C), and time (s). While C/s and A are numerically identical and dimensionally equivalent, the C/s form appears in physics textbooks and dimensional analyses where deriving current from charge and time is more instructive than treating the ampere as a primitive unit. In calculations tracking charge accumulation — capacitor discharge, electroplating, or battery coulomb-counting — expressing current in C/s clarifies the unit chain.
A capacitor delivering 1 C of charge over 1 second discharges at exactly 1 C/s = 1 A. A 500 mA USB charger transfers 0.5 C of charge each second.
About EMU of current (EMU)
The electromagnetic unit (EMU) of current equals exactly 10 amperes, numerically identical to the biot. It is the current unit native to the CGS electromagnetic (CGS-EMU) system, which dominated electrical physics from the mid-19th century until SI adoption in 1960. In CGS-EMU, the permeability of free space is defined as 1, giving the electromagnetic subsystem its characteristic form where magnetic force between parallel currents is expressed purely in dynes. The EMU of current appears in classical electrodynamics texts, historical measurement standards, and theoretical physics work using CGS-EMU conventions. All practical electrical measurement now uses SI amperes.
1 EMU of current = 10 A. A 50 A arc welding process carries 5 EMU. The unit is encountered primarily in pre-1960 scientific literature.
Coulomb per second – Frequently Asked Questions
Why bother writing coulombs per second when it is just amperes?
In dimensional analysis and physics derivations, C/s makes the relationship between charge and current explicit. When you are computing how much silver an electroplating bath deposits (Faraday's law), writing current as C/s reminds you that charge = current × time, which directly gives the mass deposited.
How many electrons is one coulomb?
One coulomb is approximately 6.242 × 10¹⁸ electrons — about 6.2 quintillion. At 1 C/s (1 A), that many electrons pass a point in your wire every single second. A USB cable charging your phone at 2 A carries 12.5 quintillion electrons per second. The numbers are staggering but the charges are tiny.
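Since the 2019 SI revision the elementary charge is fixed exactly, so the electron count per coulomb is just its reciprocal. A short sketch of the arithmetic behind the figures above:

```python
E_CHARGE = 1.602176634e-19  # elementary charge in coulombs (exact since the 2019 SI revision)

def electrons_per_second(current_amperes: float) -> float:
    """Number of elementary charges passing a point each second
    at the given current (amperes, i.e. C/s)."""
    return current_amperes / E_CHARGE
```

At 1 A this gives about 6.24 × 10¹⁸ electrons per second; at 2 A, about 1.25 × 10¹⁹ (the "12.5 quintillion" above).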
Is coulombs per second used in any real-world instrument or specification?
Not directly — every instrument reads in amperes or milliamperes. But coulomb-counting battery fuel gauges internally track charge in coulombs by integrating current over time: ∫I dt. The C/s framing appears in battery management system firmware and electrochemistry literature where charge balance matters.
How does Faraday's law of electrolysis use coulombs to predict metal deposition?
Faraday discovered that the mass of metal deposited at an electrode is directly proportional to the total charge passed (in coulombs). For silver, 107.87 grams deposit per 96,485 C (one faraday). So a 10 A electroplating bath running for 1 hour passes 36,000 C and deposits about 40 g of silver. Thinking in C/s makes the calculation transparent: mass = current × time × atomic weight / (valence × 96,485).
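The silver example works out directly from Faraday's first law, m = Q·M / (z·F). A small sketch (constants as given above):

```python
FARADAY = 96485.0  # Faraday constant, coulombs per mole of electrons

def deposited_mass_g(current_a: float, seconds: float,
                     molar_mass_g: float, valence: int) -> float:
    """Mass of metal deposited by electrolysis (Faraday's first law):
    m = Q * M / (z * F), with Q = I * t in coulombs."""
    charge_c = current_a * seconds  # total charge passed
    return charge_c * molar_mass_g / (valence * FARADAY)
```

For silver (M = 107.87 g/mol, z = 1) at 10 A for one hour, this returns roughly 40.2 g, matching the figure above.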
How does coulomb counting work in battery management systems?
A shunt resistor or Hall sensor continuously measures current flowing in and out of the battery. The BMS integrates this current over time (summing C/s × Δt) to track net charge. Drift and measurement errors accumulate, so smart BMS designs periodically recalibrate against voltage-based state-of-charge estimates.
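The integration step described above — summing C/s × Δt — can be sketched as a minimal accumulator (class and method names are illustrative, not any particular BMS API):

```python
class CoulombCounter:
    """Minimal coulomb-counting sketch: integrate measured current
    over time to track net charge (positive current = charging)."""

    def __init__(self, initial_charge_c: float = 0.0):
        self.charge_c = initial_charge_c

    def sample(self, current_a: float, dt_s: float) -> None:
        # Rectangle-rule integration: add I (C/s) * dt (s) coulombs.
        self.charge_c += current_a * dt_s
```

A real BMS would also correct for sensor offset and periodically recalibrate against a voltage-based state-of-charge estimate, as noted above.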
EMU of current – Frequently Asked Questions
What does EMU stand for and why was it created?
EMU stands for "electromagnetic unit." In the 1860s–1870s, physicists needed separate unit systems for electrostatic and electromagnetic phenomena because they had not yet unified them. The EMU system was built around magnetic force between currents, while the ESU system was built around Coulomb's electrostatic force. The ratio between them turned out to be the speed of light — a clue that led to Maxwell's equations.
Is the EMU of current the same as a biot?
Yes, exactly. Both equal 10 amperes. The biot is the named unit; "EMU of current" is the generic label. It is like saying "SI unit of force" versus "newton" — same thing, different label. The CGS-EMU system also has named units for other quantities: the gauss (magnetic field), the oersted (magnetising field), and the maxwell (magnetic flux).
Why did physics abandon the EMU system?
The EMU system was awkward for practical electrical engineering — 1 EMU of resistance (the abohm) equals 10⁻⁹ ohms, making everyday values absurdly large numbers. The SI system, adopted in 1960, unified mechanical and electrical units into one coherent framework with human-scale values. Practicality won over tradition.
Where might I encounter EMU of current in old scientific papers?
Pre-1960 physics journals, particularly in geomagnetism, plasma physics, and early electrical standards work, routinely use EMU. Geophysicists measuring Earth's magnetic field historically reported results in CGS-EMU units (gauss, oersted, EMU). Some geophysics reference data still has not been converted to SI.
How did the speed of light connect the EMU and ESU systems?
Weber and Kohlrausch discovered in 1856 that the ratio between the electromagnetic and electrostatic units of charge was approximately 3 × 10¹⁰ cm/s — the speed of light. This was no coincidence: Maxwell showed that light is an electromagnetic wave, and the unit ratio reflects the fundamental coupling between electric and magnetic fields. One of the greatest insights in physics history, hidden in a unit conversion.