
Is there a physical reason behind the frequency and voltage of mains electricity? I do not want to know why exactly a certain value was chosen; I am rather interested in why that range/order of magnitude was selected, i.e., why 50 Hz and not 50000 Hz or 0.005 Hz?

For example, is 50 Hz the actual frequency at which a turbine rotates, and is it not practical to build one that rotates much faster or slower?

  • Lots of information here.
    – DanielSank
    Commented Oct 7, 2015 at 17:25
  • I think I have a solution - I've edited the question accordingly (and taken the opportunity to make it internationally relevant while I was at it). SuperCiocia, if you don't like where this is going, feel free to change it back, but I think this will help stave off the objections (which I would have agreed with) about this not being a physics question.
    – David Z
    Commented Oct 7, 2015 at 19:46
  • @DavidZ I'd like to rollback (but I cannot); here's why: by saying 220 (or 230, or 240, or whatever it is), answers shall (hopefully) contrast with "why US supplies don't have an earth pin", and there's a good opportunity for a very informative answer contrasting some of the "subjective optima" different nations went for.
    – Alec Teal
    Commented Oct 7, 2015 at 19:48
  • @DavidZ what? Closed questions and closing questions are unrelated. A closed question has a "yes or no" answer.
    – Alec Teal
    Commented Oct 7, 2015 at 20:10
  • The main reason is to avoid visible flickering of light. Another reason is transformer cores: a lower frequency would require more iron and thus larger transformers. With higher frequencies, everything electrical would make a (more) audible hum (50 Hz is near the lower limit of human hearing). Transformers would also have problems keeping up (they would require special construction and/or materials, e.g. ferrites). TBC
    Commented Oct 8, 2015 at 10:21

6 Answers


Why is mains frequency 50Hz and not 500 or 5?

Engine efficiency, rotational stress, flicker, the skin effect, and the limitations of 19th-century materials engineering.

50 Hz corresponds to 3000 RPM in a simple two-pole generator. That range is a convenient, efficient speed for the steam turbines which power most generators, and it avoids a lot of extra gearing.
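To make the correspondence explicit, via the standard relation for synchronous machines: a generator with $p$ pole pairs spinning at $N$ revolutions per minute produces

$$f = \frac{p N}{60}\,,$$

so a single pole pair at $N = 3000$ RPM gives exactly $50\,\text{Hz}$, while adding pole pairs lets a slower shaft produce the same frequency.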

3000 RPM is fast, but not so fast that it puts too much mechanical stress on the rotating turbine or AC generator. 500 Hz would mean 30,000 RPM, and at that speed your generator would likely tear itself apart. Here's what happens when you spin a CD at that speed, and for funsies at 62,000 FPS and 170,000 FPS.

Why not slower? Flicker. Even at 40 Hz an incandescent bulb cools slightly on each half cycle, reducing brightness and producing a noticeable flicker. Transformer and motor size is also inversely proportional to frequency: higher frequency means smaller transformers and motors.

Finally there is the skin effect. At higher frequencies AC current tends to travel near the surface of a conductor. This reduces the effective cross-section of the conductor and increases its resistance, causing more heating and power loss. There are ways to mitigate this effect, and they're used in high-tension lines, but they are more expensive and so are avoided in home wiring.
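For a sense of scale, a minimal worked example (standard formula, copper values assumed): the skin depth in a conductor of resistivity $\rho$ and permeability $\mu$ at angular frequency $\omega = 2\pi f$ is

$$\delta = \sqrt{\frac{2\rho}{\mu\,\omega}}\,.$$

For copper ($\rho \approx 1.7 \times 10^{-8}\,\Omega\,\text{m}$, $\mu \approx \mu_0$) this gives $\delta \approx 9\,\text{mm}$ at $50\,\text{Hz}$ but only about $3\,\text{mm}$ at $500\,\text{Hz}$, since $\delta \propto 1/\sqrt{f}$; a conductor much thicker than $\delta$ wastes its inner cross-section.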

Could we do it differently today? Probably. But these standards were laid down in the late 19th century and they were convenient and economical for the electrical and material knowledge of the time.

Some systems do run at an order of magnitude higher frequency than 50 Hz. Many enclosed systems, such as ships, computer server farms, and aircraft, use 400 Hz. They have their own generators, so the transmission loss due to the higher frequency is of less consequence. At higher frequencies transformers and motors can be made smaller and lighter, which is of great consequence in an enclosed space.

Why is mains voltage 110-240V and not 10V or 2000V?

Higher voltage means lower current for the same power. Lower current means less loss due to resistance. So you want to get your voltage as high as possible for efficient power distribution and less heating with thinner (and cheaper) wires. For this reason, power is often distributed over long distances in dozens to hundreds of kilovolts.
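As a back-of-the-envelope illustration (numbers invented for the example): to deliver power $P$ over a line of resistance $R$ at voltage $V$, the current is $I = P/V$ and the line loss is

$$P_{\text{loss}} = I^2 R = \frac{P^2 R}{V^2}\,.$$

Delivering $10\,\text{kW}$ through a $1\,\Omega$ line at $240\,\text{V}$ wastes about $1.7\,\text{kW}$ in the wire; stepping up to $24\,\text{kV}$ cuts the waste to roughly $0.2\,\text{W}$.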

Why isn't it lower? For a given current, the power an AC supply delivers is directly proportional to its voltage. At 10 volts you would have trouble running higher-power household loads like lights, heating or a refrigerator's compressor motor. At the time this was being developed, the chosen voltage was a compromise among the voltages needed to run lights, motors and appliances.

Why isn't it higher? Insulation and safety. High-voltage AC wires need additional insulation to make them safe to touch and to avoid interference with other wiring or radio receivers. The cost of home wiring was a major concern in the early adoption of electricity. Higher voltages would make home wiring bulkier, more expensive and more dangerous.

  • Great concise answer. The point about 500Hz destroying a spinning generator is really nice.
    – DanielSank
    Commented Oct 7, 2015 at 18:34
  • Can't you just up the number of poles in your motor to "gear it down"? I can't imagine large industrial motors (especially turbine generators) spinning at 3000/3600 RPM. If you increase the number of poles (wired in parallel as you don't want more phases) by n, the speed should drop by a factor of n.
    – Nick T
    Commented Oct 7, 2015 at 20:04
  • @NickT You can do it of course; you do it for example in hydro generators, which spin at as low as 300 rpm (with 10 pole pairs) to create 50 Hz. These generators have larger diameters to accommodate all the poles. On the other hand, the 3000/3600 rpm generators are called turbogenerators; they really spin at this speed. They are long and have a smaller diameter. The stresses limit the maximum diameter of these generators; it is a materials problem. It has to do with the medium driving the turbine: steam is concentrated energy, hydro gets its energy from a large volume.
    – WalyKu
    Commented Oct 7, 2015 at 20:47
  • @NickT Most modern AC generators do have multiple poles, but they use them to produce three phase power which can be distributed more efficiently. When you say "why can't they just", remember these standards were being developed in the 1890s when there was no "just" for anything to do with electricity.
    – Schwern
    Commented Oct 7, 2015 at 22:38
  • "The Slow Mo Guys" made a nice video about CDs shattering at high speed. The best footage is available here
    – Svj0hn
    Commented Oct 8, 2015 at 7:26

In the end, the choice of a single specific number comes from the necessity to standardize. However, we can make some physical observations to understand why that final choice had to fall in a certain range.

Frequency

Why a standard?

First of all, why do we even need a standard? Can't individual appliances convert the incoming electricity to whatever frequency they want? Well, in principle it's possible, but it's rather difficult. Electromagnetism is fundamentally time invariant and linear; the differential equations we use to describe it (Maxwell's equations) are such that a system driven by a sinusoidal input at frequency $\omega$ responds only at that same frequency. In order to get out a frequency different from $\omega$, the electromagnetic fields have to interact with something else, notably charged matter. This can come in the form of a mechanical gear box or nonlinear electrical elements such as transistors. Nonlinear elements can generate harmonics of the input, i.e. frequencies $2 \omega$, $3 \omega$, etc. However, in any case, frequency conversion introduces efficiency loss, cost, and bulkiness to the system.
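To see why a nonlinearity is needed, here is a minimal sketch: feed $x(t) = \cos(\omega t)$ through a device with a small quadratic term, $y = a_1 x + a_2 x^2$. Since $\cos^2(\omega t) = \tfrac{1}{2}\left[1 + \cos(2\omega t)\right]$, the output is

$$y(t) = \frac{a_2}{2} + a_1 \cos(\omega t) + \frac{a_2}{2} \cos(2\omega t)\,,$$

which contains a DC offset and a new component at $2\omega$ that no purely linear, time-invariant system could produce.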

In summary, because of the time invariance and linearity of electromagnetism, it is considerably more practical to choose a single frequency and stick to it.

Light flicker

A historical note by E. L. Owen (see references) observes that the final decision between 50 and 60 Hz was somewhat arbitrary, but based partially on the consideration of light flicker.

During the lecture, while Bibber recounted Steinmetz's contributions to technical standards, he briefly repeated the story of the frequencies. By his account, "the choice was between 50- and 60-Hz, and both were equally suited to the needs. When all factors were considered, there was no compelling reason to select either frequency. Finally, the decision was made to standardize on 60-Hz as it was felt to be less likely to produce annoying light flicker."

The consideration of light flicker comes up elsewhere in historical accounts and explains why very low frequencies could not be used. When we drive a pure resistance with an AC current $I(t) = I_0 \cos(\omega t)$, the instantaneous power dissipation is proportional to $I(t)^2$. This signal oscillates in time at frequency $2\omega$ (remember your trig identities). Therefore, if $\omega$ is lower than around $40 \, \text{Hz}^{[a]}$, the power dissipated varies slowly enough that you can perceive it as a visual stimulus. This sets a rough lower limit on the frequency you can use for driving a light source. Note that the arc lamps in use when electrical standards were developed may not have had a purely resistive electrical response (see Schwern's answer, where cooling on each cycle is mentioned), but the source frequency is always present in the output, even in nonlinear and filtered systems.
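Spelling out the trig identity behind the $2\omega$:

$$P(t) \propto I_0^2 \cos^2(\omega t) = \frac{I_0^2}{2}\left[1 + \cos(2\omega t)\right]\,,$$

a steady term plus an oscillation at twice the line frequency; the oscillating part is the flicker in question.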

Reflections / impedance matching

Alternating current signals travelling on a wire exhibit wave-like behavior. In a rough sense, the higher the frequency, the wavier the signal. A good rule of thumb is that if the length of the wires is comparable to or longer than the wavelength of the signal, then you have to worry about wave-like phenomena such as reflection. The wavelength $\lambda$ of an electrical signal is roughly $$\lambda = c / f$$ where $c$ is the speed of light and $f$ is the frequency. Suppose we'd like to transmit electricity from a substation to a house, and we want to keep the wavelength big enough to avoid reflection physics without having to deal with careful impedance matching. Let's put in a length of $1000 \, \text{m}$ to be conservative. Then we need $$f \leq c / 1000 \, \text{m} = 300 \, \text{kHz} \, .$$

Voltage

We're talking about the voltage inside the building here; note that power is transmitted at much higher voltage and then stepped down near the end point. The 120 V choice apparently comes from the fact that electricity was originally used for lighting, and the first lamps were most efficient at around 110 V. The value 120 V may have been chosen to offset the voltage drop in the wires running to the lighting sources.

Further reading

Detailed document by E. L. Owen with references

$[a]$: I'm not an expert in human flicker perception. This number is a rough guess based on personal experience and some literature.

P.S. I consider this answer a work in progress and will add more as I learn more.

  • Same comment as on the other answer -- this addresses the frequency, but not the 230V (and 120V in the US) and so it's only answering half of the question.
    – tpg2114
    Commented Oct 7, 2015 at 18:37
  • @tpg2114 Yup. As I said, it's a work in progress.
    – DanielSank
    Commented Oct 7, 2015 at 18:44
  • Just a gentle reminder!
    – tpg2114
    Commented Oct 7, 2015 at 18:50
  • For a "symmetric" device like an incandescent light, doesn't 40 Hz AC (with negligible DC offset) really mean 80 Hz? 80 Hz sounds fast...
    – Nick T
    Commented Oct 7, 2015 at 20:09
  • @NickT: Yes, see the "2ω" part.
    – MSalters
    Commented Oct 8, 2015 at 8:37

The two other answers address the frequency issue. The voltage issue is much simpler.

If the voltage is too high, you run the risk of arcs between conductors. The gap an arc can bridge grows roughly in proportion to voltage; at 240 V an arc can be sustained across a few millimeters in air, depending on humidity. Much more voltage clearly gets impractical...

If the voltage gets lower, on the other hand, you need more current for a given power. But the heating of wires is proportional to the current squared: this means one needs thicker wire, with lower resistance. That's cumbersome, expensive and stiff (as an example, 32 A-rated wire is barely bendable enough for wall corners).

So the chosen 120/240V reflects this balance between arcing concerns (especially around connections) and wire heating.

I have also heard that safety dictates a highish voltage so that muscle spasms give you a chance to drop whatever you're touching before you get burnt to the core. I don't know to what extent this is true...

  • I have never understood this argument that high voltage is more efficient. You say power dissipation goes as current squared, but it also goes as voltage squared. There is probably a simple explanation if one considers the circuit theory properly, but I've never seen this explained in a convincing way.
    – DanielSank
    Commented Oct 7, 2015 at 19:22
  • @DanielSank: If you want a device with a specific power rating, say 1000 W, you need 8.3 A at 120 V, or 4.34 A at 230 V, corresponding to 14.4 Ω and 52.9 Ω of resistance in your device. Now if your wires have 0.1 Ω (much lower than your device, low enough to not change the current significantly), they will dissipate 0.1*8.3^2 = 6.9 W in the first case, and 0.1*4.34^2 = 1.9 W in the second case. Which means you lose about 4 times as much with 120 V, and your wires heat 4 times as much.
    Commented Oct 7, 2015 at 20:07
  • @DanielSank: The key point is to distinguish between "useful voltage" and "unwanted but unavoidable voltage drop". Useful power to the load is the product of the useful voltage and the in-phase current. Waste power is proportional to the product of the total current and its associated in-phase unwanted but unavoidable voltage drop. In general, the current is chosen to give the desired amount of useful power, and the unwanted voltage drop is proportional to the current, thus making waste proportional to current squared.
    – supercat
    Commented Oct 7, 2015 at 21:31
  • @GuntramBlohm Ah yes, of course.
    – DanielSank
    Commented Oct 7, 2015 at 23:15

The disadvantage of having too low a frequency is that the mains transformers become very large.
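A minimal sketch of why, via the standard transformer EMF equation: a winding of $N$ turns on a core of cross-sectional area $A$, driven at frequency $f$ to peak flux density $B_{\text{max}}$, sustains an RMS voltage

$$V_{\text{rms}} \approx 4.44 \, f N A B_{\text{max}} \,.$$

Since $B_{\text{max}}$ is capped by core saturation, halving $f$ forces you to double $N$ or $A$ (more copper or more iron) to hold the same voltage.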

However, there have been lower-frequency standards (25 Hz, 16 2/3 Hz, etc.). These are used by railways (mostly legacy systems).

  • Could you add some references and an explanation of why transformers become larger as frequency drops? Something to do with the number of windings?
    – Schwern
    Commented Oct 8, 2015 at 0:43
  • Airplanes use 400 Hz for this reason; 50 Hz transformers are too heavy.
    – MSalters
    Commented Oct 8, 2015 at 8:38
  • @Schwern: Somewhat simplified, for a fixed size transformer the energy converted per cycle is a constant. More cycles per second is more energy. Keeping cycles constant, OTOH, we see that energy converted scales with size. Combining the two, we see that at lower frequencies we need to increase the size to keep the power constant.
    – MSalters
    Commented Oct 8, 2015 at 8:41
  • @Schwern First, what happens at 0 Hz? A short, therefore infinite magnetic flux. Now think of what happens as you lower the frequency: you'll approach this limit, all else being equal. Therefore the magnetic flux in the core becomes larger, and so to avoid saturating the core you have to make it larger.
    – user16035
    Commented Oct 9, 2015 at 19:29
  • Nowadays we see many switching-mode power supplies. Internally they create a high frequency, allowing them to transform the voltage with much lighter transformers. They are much smaller, much lighter and more efficient than mains-frequency transformers, and they produce a stabilized output voltage.
    Commented Oct 10, 2015 at 19:59

Practical reasons include the skin effect (you do not want your frequency to exceed more than a few kHz unless you are willing to use something akin to Litz wire to transfer large currents) and the size of magnetic cores for transformers, which must be able to magnetically store more than the maximum energy to be transmitted in each cycle, so that their volume grows with the cycle period. However, these physical constraints do not define a sharp optimum; as such, 10 Hz or 500 Hz would be just as reasonable, and similar values are used in practice even today: modern jet planes have 400 Hz power supplies whilst, at least in Germany, the power supply for electric trains is standardized at 16 2/3 Hz.
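In symbols, a rough version of the scaling argument this answer sketches: a transformer passing average power $P$ must handle an energy of

$$E \approx \frac{P}{f}$$

every cycle, and since the energy a core can process per cycle scales with its volume at a fixed flux density, the required core volume scales roughly as $P/f$: halving the frequency roughly doubles the core.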

There is obviously a similar trade-off between voltage and current, but at least as long as your chosen frequency allows you to compensate for a lower voltage with thicker wires and for a higher voltage with thicker insulation, you might argue that this is more of an economic or safety trade-off. After all, for long distances we transform to achieve a better compromise (and must use AC rather than DC to always be able to do that, even with purely passive, historically old techniques). Hence I suspect, without actually knowing, that historic reasons, such as the maximum practical voltage for which light bulbs could be made at the time of standardization, or perhaps accompanying ideas about what might still not be too dangerous for factories and homes, play a role.


It seems like 60 Hz may have been selected instead of 55 or 75 simply because there are 60 seconds in a minute and so 60 cycles per second seemed a comfortable number.

During the early days of distributed power transmission the frequencies and voltages would have been all over the place. The limits of what was safe and convenient would have been developed through practical experience.

The materials used for transformers would have favoured low frequencies; the mass of transformers would have favoured high frequencies. The range of 50-60 Hz was the sweet spot, and 50 and 60 are both 'round' numbers that divide well for timing purposes.

The voltages would have standardised somewhat with the equipment supplied: light bulbs, motors and such would have been sold to match a local supply, and vendor voltage ranges would have promoted generation-voltage optimisation.

  • The first paragraph is speculation and doesn't explain 50Hz or 45 or 400 or the other frequencies being played with in the 19th century. The second paragraph needs to define why certain frequencies and voltages are "safe and convenient". Some citation on the effect of frequencies on transformer material and mass would help. The final paragraph about voltages does not address that electrical items at the time came in wildly varying voltages, why did they converge to the 110/240 range rather than 10-100 or 200-1000? All of this needs citations or equations.
    – Schwern
    Commented Oct 9, 2015 at 20:02
