I have an AVR with my PC plugged into it. I live in Europe, where the mains voltage is around 240 V. The AVR has an LCD display showing the current input (mains) as 237 V and the output as 210-217 V. My question is, because I'm not too familiar with electricity: what is the deal with that? Is something wrong with my AVR, or do they all work like that?
While I was searching for an answer to this, I also found some other information online, like: "...So in general and within reason, higher voltages give you higher efficiencies." And: "Here's a quick explanation of why power supplies get less efficient as the supply voltage decreases. All electronic components (transistors, transformers, even the traces on the printed circuit board) have some sort of equivalent resistance..."
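If I understand that quote correctly, here's a rough back-of-envelope sketch of the math behind it (my own illustration, assuming a fixed power draw and a fixed equivalent resistance, which is a simplification):

```python
# Sketch of why lower input voltage means more PSU losses
# (my own numbers/assumptions, not from the quote):
# for a fixed power draw P, the supply pulls current I = P / V,
# and resistive losses scale as I^2 * R, so a lower input
# voltage means higher current and more heat in the PSU.

def relative_loss(v_low: float, v_high: float) -> float:
    """Ratio of I^2 * R losses at v_low vs. v_high for the same power."""
    return (v_high / v_low) ** 2

# e.g. 210 V out of the AVR vs. 237 V straight from the wall:
print(relative_loss(210, 237))  # ~1.27, i.e. roughly 27% more resistive loss
```

So if the numbers on my AVR's display are right, the PSU's internal resistive losses would go up by roughly a quarter, though I don't know how much that matters in absolute terms.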
Anyway, am I putting more stress on the PSU by using the AVR, or is this normal? Thanks.