I recently bought a power meter to measure how much power my devices draw. To test it, I measured an IKEA LED bulb rated at 6 watts (the meter read 6 watts) and a few laptop adapters. The meter seems to work.
Then I tested my work-in-progress home server. To my surprise, it drew 25 watts while it was turned off. After some searching, this appears to be normal: desktop machines draw some standby power for features like Wake-on-LAN. Still, 25 watts seems like an awful lot.
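For context on what that standby power is for: with Wake-on-LAN the NIC stays powered while the machine is off, listening for a "magic packet" (6 bytes of 0xFF followed by the target MAC repeated 16 times, usually sent as a UDP broadcast). A minimal Python sketch of sending one, with a placeholder MAC address:

```python
import socket

def send_magic_packet(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Send a Wake-on-LAN magic packet: 6 bytes of 0xFF followed by
    the target MAC address repeated 16 times, as a UDP broadcast."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast, port))

# Placeholder MAC address; replace with the server NIC's actual MAC.
send_magic_packet("00:11:22:33:44:55")
```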
Then I flipped the hard switch on the power supply itself, so the server could no longer be turned on at all. I expected 0 watts, but the meter still showed a steady 13 watts.
I tried all sorts of BIOS tweaks I found online to reduce the power draw while the server is off, but I can't seem to get below 25 watts.
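For reference, Wake-on-LAN can also be toggled at the OS level, not only in the BIOS, so a BIOS setting alone may not stick. A minimal Linux-only sketch using ethtool; the interface name is an assumption, check yours with `ip link`:

```python
import subprocess

# Disable Wake-on-LAN on the NIC at the OS level ("d" = disabled).
# "eth0" is a placeholder interface name.
iface = "eth0"
subprocess.run(["ethtool", "-s", iface, "wol", "d"], check=True)

# Verify: the "Wake-on:" line in the output should now read "d".
result = subprocess.run(["ethtool", iface], capture_output=True, text=True)
print(result.stdout)
```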
What is going on? So far my only idea is a faulty power supply with some kind of internal short circuit.
What can I do to bring the power usage down to something acceptable (~5 watts) when the server is turned off?
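To put a number on why this matters, here is a quick back-of-the-envelope calculation of what the standby draw costs per year; the price per kWh is an assumed example, adjust it to your local tariff:

```python
# Rough annual cost of the measured standby draw.
standby_watts = 25                  # measured while the server is "off"
hours_per_year = 24 * 365
kwh_per_year = standby_watts * hours_per_year / 1000   # -> 219 kWh
price_per_kwh = 0.25                # assumed EUR/kWh, adjust for your rate
print(f"{kwh_per_year:.0f} kWh/year, ~{kwh_per_year * price_per_kwh:.0f} EUR/year")
```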
Some details on my current setup and BIOS settings: the power supply is a Cooler Master RS-520-ASAA-A1.