PIE. P = IE. Power = Current times Voltage. So if the voltage is lower in a brownout, a power supply has to pull more current from the mains to maintain the same power. So while the voltage stress is indeed lower during a brownout, the current stress to the power supply increases to compensate.
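To see that in numbers, here's a minimal Python sketch of I = P/E for a fixed power draw (the 350W is just a round illustrative figure, close to what the detailed examples below work out to):

```python
# Minimal sketch: P = IE rearranged to I = P / E.
# For the same power, the current must rise as the mains voltage falls.
POWER_W = 350.0  # illustrative wall-side draw, roughly the figure worked out below

for volts in (230.0, 120.0, 90.0):
    amps = POWER_W / volts
    print(f"{POWER_W:.0f} W at {volts:.0f} V needs {amps:.1f} A from the mains")
```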

Here's the short answer: In a brownout, power supplies need to draw more current to compensate for the lower supply voltage, which is very stressful for transistors, wires, diodes, etc. They also become less efficient, which makes them draw even more current, aggravating the problem.

Here's the long answer: Most PCs (if not all) use switching power supplies. If all the elements of the supply (the transistors, transformers, capacitors, diodes, etc.) were completely ideal, a supply could take any input voltage and produce the desired power at the desired voltage (as long as there was enough current at the input to maintain P=IE).

But those elements are all far from ideal, so all real-world power supplies are designed to operate inside a certain range, say 80 to 240V. Even inside the range they are designed for, the efficiency (the percentage of power at the output of the supply compared to the power needed at the input) tends to fall off as the input voltage gets lower. Anandtech has a good example graph. The X-axis is the power at the output of the supply (the load) and the Y-axis is the efficiency. So this supply is most efficient at around 300W.

For a 120V input, it's about 85% efficient, so it draws about 300W/0.85 = 353W from the wall to get you 300W at the output. The "missing" 53W is dissipated in the power supply circuitry (that's why your PCs have fans - it's like your power supply has a 50W bulb in a little box and it needs to get the heat out). Since P=IE, we can calculate the current it needs from the wall plug to produce 300W output from 120V: I = P/E = 353W/120V = 2.9A. (I'm ignoring power factor to keep this explanation simple.)

For a 230V input, the efficiency is 87%, so it only pulls 344W from the wall, which is nice. Because the voltage is so much higher, the current draw is much lower: 344W/230V = 1.5A.

But in a 90V brownout condition, the efficiency is even worse than at 120V: 83.5%. So now the supply is pulling 300W/0.835 = 359W from the wall. And it's pulling even more current: 359W/90V = 4A!

Now that probably wouldn't stress this power supply much since it's rated at 650W. So let's have a quick look at what happens at 650W. For 120V, it's 82% efficient -> 793W and 6.6A from the wall. But the efficiency is even worse at high loads, so for 90V we see 78.5% efficiency, which means 828W and 9.2A! Even if the efficiency stayed at 78.5%, if the brownout went to 80V it would need to pull 10.3A. That's a lot of current; things start to melt if they aren't designed for that sort of current.
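If you'd like to check that arithmetic, here's a small Python sketch that reproduces the wall-power and wall-current figures above (the efficiency values are the ones read off the example graph, and power factor is ignored as before):

```python
# Wall power = output power / efficiency; wall current = wall power / mains voltage.
# Efficiencies are the figures quoted above for this example supply.
scenarios = [
    # (output W, mains V, efficiency, note)
    (300, 120, 0.850, "300W load, normal 120V"),
    (300, 230, 0.870, "300W load, 230V mains"),
    (300,  90, 0.835, "300W load, 90V brownout"),
    (650, 120, 0.820, "650W full load, 120V"),
    (650,  90, 0.785, "650W full load, 90V brownout"),
    (650,  80, 0.785, "650W full load, 80V (efficiency assumed unchanged)"),
]

for out_w, volts, eff, note in scenarios:
    wall_w = out_w / eff   # power drawn from the wall
    amps = wall_w / volts  # current drawn from the wall
    print(f"{note}: {wall_w:.1f} W and {amps:.1f} A from the wall")
```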

So that's why brownouts are bad for power supplies. They need to draw more current to compensate for the lower supply voltage, which is very stressful for transistors, wires, diodes, etc. They also become less efficient, which makes them draw even more current, aggravating the problem.

Bonus example: Here's a quick explanation of why power supplies get less efficient as the supply voltage decreases. All electronic components (transistors, transformers, even the traces on the printed circuit board) have some sort of equivalent resistance. When a power transistor is switched "on", it has an "on resistance", let's say 0.05 ohms. So when 3A of current flows through that transistor, it sees 3A * 0.05 ohms = 0.15V across its leads. That 0.15V * 3A = 0.45W of power that is now being dissipated in that transistor. That's waste power - it's heat in the power supply, not power to the load. That's our 300W, 120V scenario.

In the 90V brownout 300W scenario, the transistor has the same 0.05 ohm on resistance, but now there's 4A of current going through it, so it drops 4A * 0.05 ohms = 0.2V across its leads. That 0.2V * 4A = 0.8W of power that is now being dissipated in that transistor. So each device (and there are a lot of them) in the power supply that has an on resistance/voltage drop across it will generate more heat (wasted power) when the supply voltage drops. So in general and within reason, higher voltages give you higher efficiencies.
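And here's the same conduction-loss arithmetic as a tiny Python sketch (the 0.05 ohm on resistance and the 3A/4A currents are the example values from above):

```python
# Conduction loss in the switching transistor: the drop across its on
# resistance is V = I * R, and the heat it dissipates is P = V * I = I**2 * R,
# so the waste heat grows with the square of the current.
R_ON = 0.05  # ohms, example on resistance from above

for amps, case in ((3.0, "300W at 120V"), (4.0, "300W at 90V brownout")):
    drop = amps * R_ON  # volts across the transistor
    loss = drop * amps  # watts turned into heat inside the supply
    print(f"{case}: {drop:.2f} V drop, {loss:.2f} W wasted as heat")
```

Because the loss goes as the square of the current, doubling the current quadruples the heat in every one of those lossy elements.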