9
\$\begingroup\$

I used my simple digital multimeter to measure the actual output of several small PSUs I have at home. They should all deliver pure direct current, and they do show the expected values in VDC mode, but the multimeter also shows nonzero values in VAC mode - though not for all power supplies. Why is this?

For example, one 12 V PSU shows 12 VDC but also 25 VAC, while another 12 V PSU shows just 0.00 in VAC mode with only a slowly alternating +/- sign. Is this a sign of an imperfect PSU, a bad output capacitor, or is it that an SMPS always shows something in VAC mode because of its switching noise, while a linear PSU shows nothing?

Update:

I did more tests with various PSUs. In VAC mode my multimeter reads around 215% of the DC voltage. This applies to all the PSUs I've tested except one, which shows 0.00 VAC - maybe that was caused by reversed polarity; it shows 0.00 when I switch the probes.

I also tested a wall outlet - it shows a solid 225 VAC, so the multimeter seems to work correctly when real AC is present.

\$\endgroup\$

5 Answers

5
\$\begingroup\$

The problem is that the multimeter is an ordinary inexpensive meter. It measures AC internally by rectifying the signal to DC (using a diode or a bridge) and then measuring that DC. It then applies a "fudged" calibration factor to convert the DC number back into an AC value.

You can see this by taking a battery, which you know is pure, constant DC - an AA cell, say - and measuring it on the AC range. It will show an "AC voltage" that is higher than the DC voltage. The problem occurs when the signal is not pure AC. In that case the meter "thinks" it is rectifying an AC signal (usually a sine wave) and getting its DC equivalent, which it reports back, with the adjusting "fudge factor", as AC. But of course the signal really was DC (or a mix of DC and AC), and the DC part should not have been scaled and reported as AC.

So what gives? Nothing - it is just a shortcut. Inexpensive multimeters do not correctly measure signals with mixed AC and DC when on the AC setting. They do measure DC correctly, even for mixed signals.

If you need to make this measurement, buy a more expensive multimeter that measures true-RMS AC. Or, if you are lucky enough to have access to an oscilloscope, look at the waveform on the oscilloscope. You can google this problem and find more detailed explanations of this normal limitation of inexpensive meters. Cheers.
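A rough numeric sketch of this behavior (assuming a full-wave, average-responding meter; real meters differ in detail):

```python
import math

# Model of an average-responding (non-true-RMS) meter on its AC range:
# full-wave rectify, average, then multiply by the sine form factor
# RMS/average = pi / (2 * sqrt(2)) ~= 1.11, so a pure sine reads correctly.
def meter_ac_reading(samples):
    rectified_avg = sum(abs(s) for s in samples) / len(samples)
    return math.pi / (2 * math.sqrt(2)) * rectified_avg

n = 100000
sine_1v_rms = [math.sqrt(2) * math.sin(2 * math.pi * i / n) for i in range(n)]
aa_cell = [1.5] * n  # pure DC, like an AA cell

print(round(meter_ac_reading(sine_1v_rms), 2))  # 1.0  -> correct for a sine
print(round(meter_ac_reading(aa_cell), 2))      # 1.67 -> pure DC misread as "AC"
```

The calibration factor is chosen so a pure sine wave reads its true RMS value; anything that isn't a pure sine, including plain DC, gets scaled by the same factor and comes out wrong.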

\$\endgroup\$
3
\$\begingroup\$

A DC power supply that shows a greater AC voltage than the rated DC is probably not very well-designed, or due for replacement.

That apart, many SMPS designs will show some AC ripple (which your multimeter reads as an AC signal), and in some cases this ripple is higher when the supply is unloaded or lightly loaded.

\$\endgroup\$
2
\$\begingroup\$

Reading 25 VAC on the output of a 12 VDC power supply is definitely wrong. Unfortunately, from what you tell us it's hard to determine what exactly is wrong. Perhaps that supply is just broken.

The best thing to do would be to look at its output voltage on a scope; then you can see for sure what is going on. There are other ways to get some idea about an AC signal. For example, put a speaker in series with a 1 kΩ resistor on the supply output. If it really has that much AC and it is in the audible range, you'll definitely hear it. If the supply is really putting out 25 VAC RMS (which I have a hard time believing), then a 1 kΩ resistor will dissipate over 600 mW, which will make an ordinary "1/4 W" resistor get very hot quickly. If the voltage is really that large, you'll hear something with a 10 kΩ resistor in series with the speaker too.
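The dissipation figures follow directly from P = V²/R:

```python
# Power in the series resistor if the supply really put out 25 V RMS.
v_rms = 25.0
print(v_rms**2 / 1e3)   # 1 kOhm:  0.625 W, far above a "1/4 W" rating
print(v_rms**2 / 10e3)  # 10 kOhm: 0.0625 W, safe for the same resistor
```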

You can also try putting some capacitance on the supply output and see what that does to the meter reading. To be safe, get a capacitor rated for at least 50 V. You will probably need tens of µF before much happens. If this supply is truly broken, it could blow up the cap, though. Again, a scope would tell us what's really going on.

Added:

I just had another thought about what is going on. 25 VAC from a 12 VDC supply seems a bit unbelievable, even for a busted one. I'm guessing your meter isn't really connected across the supply output properly. Does this supply possibly have three terminals? I have seen some where it's a bit confusing which two are actually the supply output and that the third is the wall-plug ground. There is usually a strap you can clamp between the wall ground and one of the two supply ends. When this is not strapped and you put the meter between either output and the wall ground, you can get exactly what you are seeing. There will be some capacitance inside the supply to the hot side of the AC line, and this will add a common-mode signal onto the supply output. It is high impedance, so not really a problem. If you put a 10 kΩ resistor across the meter when reading the AC voltage and the reading drops a lot, then that's what's happening.
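A back-of-the-envelope sketch of that common-mode divider (the 100 pF coupling capacitance and 10 MΩ meter input impedance are assumed, illustrative values, not measurements of any particular supply):

```python
import math

f = 50.0        # mains frequency, Hz
c = 100e-12     # assumed stray capacitance from mains hot to the output
v_mains = 230.0
z_meter = 10e6  # assumed meter input impedance

z_cap = 1.0 / (2 * math.pi * f * c)  # ~32 MOhm reactance at 50 Hz

def divider(z_load):
    # Voltage across the load in the capacitive divider (magnitudes only,
    # ignoring phase - good enough for an order-of-magnitude estimate).
    return v_mains * z_load / (z_cap + z_load)

print(round(divider(z_meter), 1))                 # tens of volts, meter alone
z_shunted = 1.0 / (1.0 / z_meter + 1.0 / 10e3)    # 10 kOhm across the meter
print(round(divider(z_shunted), 2))               # collapses to well under 1 V
```

The high-impedance meter alone sees a sizable fraction of the mains voltage, but a 10 kΩ shunt loads the weak capacitive source down to almost nothing, which is exactly the diagnostic test suggested above.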

Added 2:

From your latest experiments, it sounds like the DC-blocking cap isn't in series when you are doing the AC tests. Look closely at your meter. Does it have only two places to plug the leads into, or are there two or more jacks for the red lead depending on what you are trying to measure? When taking AC measurements, make sure not only that the dial is on AC volts, but also that the leads are plugged into the correct jacks for AC voltage measurement.

\$\endgroup\$
2
  • \$\begingroup\$ If I understand it correctly, I need a bipolar capacitor with a capacity of tens of microfarads and a high voltage rating. That's definitely out of my scope. I did more tests - see the update in my question - it seems to me that it's all caused by the multimeter itself, not a problem with my PSUs. \$\endgroup\$
    – Al Kepp
    Commented Dec 29, 2012 at 20:09
  • 1
    \$\begingroup\$ I wonder if the meter might be measuring AC by half-wave rectifying the input signal and measuring the linear (non-RMS) average voltage, and scaling the result? Feeding 1 volt RMS into such a system would yield about .46 volts after filtering, so if the meter assumes AC voltage is 2.1x the measured value that could explain the result. \$\endgroup\$
    – supercat
    Commented Oct 31, 2013 at 20:40
2
\$\begingroup\$

Cheap multimeters usually can't accurately measure an AC signal with a large DC component. A reading of roughly twice the DC voltage is a common erroneous result. It has nothing to do with your power supply, but with the circuit the multimeter uses to measure AC voltages. See this question and this post on Adafruit, which also describe this problem.

I don't know exactly what causes the 2× factor, but maybe someone with more knowledge of the methods entry-level meters use to measure AC voltage can help.
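One plausible model, suggested in a comment above: the meter half-wave rectifies, averages, and scales so that a pure sine reads its RMS value. A numeric sketch under that assumption (not the actual circuit of any particular meter):

```python
import math

def half_wave_meter(samples):
    # Keep only the positive half, average, then scale so a 1 V RMS sine
    # (half-wave average sqrt(2)/pi) reads 1 V: scale = pi/sqrt(2) ~= 2.22.
    rectified_avg = sum(max(s, 0.0) for s in samples) / len(samples)
    return math.pi / math.sqrt(2) * rectified_avg

n = 100000
sine_1v_rms = [math.sqrt(2) * math.sin(2 * math.pi * i / n) for i in range(n)]

print(round(half_wave_meter(sine_1v_rms), 2))  # 1.0  -> sine reads correctly
print(round(half_wave_meter([12.0] * n), 1))   # 26.7 -> ~222% of 12 VDC
print(half_wave_meter([-12.0] * n))            # 0.0  -> zero with probes swapped
```

This would match both the ~215% readings and the 0.00 reading with the probes swapped; a real diode's forward drop would pull the factor somewhat below the ideal 222%.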

\$\endgroup\$
2
\$\begingroup\$

I have the same problem and believe it is due to cheap or missing blocking capacitors in the AC function of your multimeter. If you look at a voltage-doubler circuit, it is a full-wave bridge rectifier with capacitors. Since every DC supply has a full bridge rectifier, it's acting as a doubler. Oddly enough, if you reverse the leads on that AC range you'll get zero volts. I like the comment above (which I will try): place a capacitor with a proper voltage rating across the output of your DC supply and then remeasure. And if you're really feeling lucky, maybe a capacitor pi filter on your DC supply output.

\$\endgroup\$
