6

I have an old Fluke 79-III and I'm trying to check the resistance of some resistors which should be 0.22 Ω. Unfortunately, they all read 0.6 Ω, so I'm wondering how accurate I should expect the main resistance function of my meter to be.

The specs from the Fluke 79-II user manual (p. 39), which agree with the Fluke 79-III instruction sheet (p. 11), are:

| Function | Range    | Resolution | Accuracy        | Burden Voltage (Typical) |
|----------|----------|------------|-----------------|--------------------------|
| Ohm      | 400.0 Ω  | 0.1 Ω      | ±(0.4% + 2)     | Not applicable           |
|          | 4.000 kΩ | 0.001 kΩ   | ±(0.4% + 1)     |                          |
| 40 Ω     | 40 Ω*    | 0.01 Ω     | 5% typical***   | Not applicable           |

\* In the 40 Ω and 40 mV ranges, thermals may introduce additional errors. To maximize accuracy, keep both probe tips at similar temperatures.

\*\*\* Accuracy applies after lead resistance compensation.

I believe I understand the resolution to mean that while I shouldn't expect a reading of exactly 0.22 Ω, I should at least expect the meter to read 0.2 or 0.3 Ω.

Edit 1) I don't understand the accuracy rating, though. Does the +2 mean that I should expect it to read high by up to two counts in the least significant digit? And is the 0.4% taken as 0.4% of the 400 Ω range (i.e. 1.6 Ω), or as 0.4% of the current reading?

Looking closer, the spec from the Fluke 79-II user manual confirms that:

Accuracy specifications are given as: ± ([% of reading] + [number of least significant digits])

So B Pete's answer of ±0.2 ohms looks good.

Edit 2) Measuring the resistance of the leads by shorting them together gives 0.3 Ω, so the 0.6 Ω I originally measured is well within the accuracy envelope of (0.22 ± 0.2 Ω) + (0.3 ± 0.2 Ω).

Edit 3) Also, I did originally try the meter's lead resistance compensation (40 Ω) mode (instruction sheet p. 6) and measured 0.15 Ω, so I discounted it as also being inaccurate.

With only 5% accuracy, that reading should have been correct to ±0.011 Ω.

Checking again now, however, I see that when I do a lead resistance compensation, the longer I keep the range button pressed, the more stable the reading becomes, converging on 0.05 Ω. Measuring the resistor then shows 0.17 Ω; sum these and (ta-da) we get 0.22 Ω.

Not bad for a 10+ year old meter that hasn't been calibrated in at least 6 years. *8')


Ultimately, should I expect my multimeter to get close to being able to measure 0.22 Ω, or should I not expect much accuracy below a few ohms?


4 Answers

5

The Fluke meters I am familiar with are specified as ±(given percent of measurement + given number of least significant digits). For the given info, using the 400 Ω range, this yields ±(0.004 × 0.22 + 2 × 0.1) = ±0.20088, or approximately ±0.2 Ω, if the meter is accurately calibrated.
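If it helps to see it spelled out, here is that spec arithmetic as a couple of lines of Python; the function name and default arguments are illustrative, not anything from Fluke's documentation:

```python
def ohms_uncertainty(reading, pct=0.4, counts=2, resolution=0.1):
    """Accuracy envelope: +/-(percent of reading + counts in the last digit)."""
    return (pct / 100.0) * reading + counts * resolution

# A 0.22 ohm resistor on the 400 ohm range (0.1 ohm resolution):
print(f"{ohms_uncertainty(0.22):.5f}")   # -> 0.20088, i.e. roughly +/-0.2 ohm
```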

When measuring low resistance values, especially at the sub-1 Ω level, you will need to consider the resistance of your probes and of the connections to the meter and the resistor. If you make a resistance measurement with your probes shorted together, you can get an idea of the resistance of your probe connections.

To more accurately measure sub 1 ohm resistors without using a Wheatstone bridge or other indirect measurement methods, I would recommend using a bench DVM with a 4-wire (Kelvin) resistance measurement capability. This will remove the resistance of your probes from the measurement.

0
6

To get a good resistance reading, it's necessary to connect a precision current source to the resistor using one set of leads, and a precision high-impedance voltmeter using another set of leads. If this is done, one can use R = E/I to compute the resistance. For example, with a current of 1 mA one will read 1 mV/Ω; with 10 mA, 10 mV/Ω; with 100 mA, 100 mV/Ω; and with 1 A, 1 V/Ω. (Note that if one is trying to measure a resistance of e.g. 0.1 Ω, putting 1 A through it will only dissipate 0.1 W.)

Using separate leads for the current source and the voltage measurement will avoid measurement errors that would otherwise result from imperfect connections, and will allow measurements precise to 0.01 ohms or better, even if the connections have a resistance of a noticeable fraction of an ohm (with the caveat that if one is pushing 1A through a connection and it has too much resistance, the connection itself may get hot).
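As a rough numeric sketch of the four-wire idea (all values below are illustrative, not from the answer):

```python
# Force leads carry a known current; sense leads read the voltage right
# at the resistor, so lead and contact resistance drop out of R = E/I.
I_force = 0.1            # amps through the force leads
V_sense = 0.022          # volts read across the resistor itself
R = V_sense / I_force    # -> 0.22 ohm
P = I_force**2 * R       # power dissipated in the part: 2.2 mW
print(f"R = {R:.2f} ohm, P = {P * 1000:.1f} mW")
```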

5

Summary

  • A method is given that uses a voltmeter and a known higher-value resistor; this will give better results than the available ohms ranges will.

  • Probe resistance and contact resistance need to be allowed for if using the meter.

  • Meter errors on the 400 Ω range are in excess of the value to be measured, but a 'rough idea' can be established.


What Chris says, plus:

  • Rated accuracy is with respect to full scale :-(

    = about 1.6 Ω on the notional 400 Ω range

    AND ±2 digits of LSD flicker as well.

  • Twist the plugs to and fro in their sockets to make good contact.

  • Short the lead tips together, rub them against each other, and note the lowest reading.

  • THEN measure the component, rubbing the probe tips against it and noting the lowest reading. This, minus the shorted-probe-tips reading, should give an approximately OK result (see the sketch below).
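A tiny sketch of that subtraction, using the numbers from the question (0.3 Ω shorted, 0.6 Ω measured):

```python
# Two-wire workaround: subtract the shorted-probes reading from the
# in-circuit reading. Numbers are the question's own measurements.
r_shorted = 0.3                        # lowest reading, probe tips shorted
r_with_leads = 0.6                     # reading across resistor plus leads
r_estimate = r_with_leads - r_shorted  # -> 0.3 ohm; crude, but the ~0.4 ohm
                                       # envelope easily covers the real 0.22
print(f"~{r_estimate:.1f} ohm")
```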

BUT at 0.22 Ω you want a better method. Chris's series-current method is good. If you don't have a variable current supply, use a variable voltage plus a larger resistor, or even just a battery and a resistor.

e.g. to get 0.22 Ω to give 0.22 V you need 1 A - probably too high. To get 22 mV you need 100 mA. To get 2.2 mV (which may be OK if you have a 200 mV range) you need 10 mA. A 9 V battery and a 1 kΩ resistor gives 9 mA.

One method using a voltmeter and a known larger-value resistor:

This works by passing the same current through a known large resistor and the target resistor; the voltage drops are then proportional to the resistances. The example below uses a current of ABOUT 10 mA, for a resolution of 10 mV per ohm. Using a 100 Ω resistor in place of the 1 kΩ below gives a resolution of 100 mV/Ω. This allows e.g. a nominally 9 V battery and a known resistor to be used; the exact battery voltage does not matter as long as it does not "droop" (change) during measurement. Accuracy of measurement depends on the voltmeter accuracy and the resistor accuracy. Be aware of the possible effects of resistor heating: at 1 kΩ and 9 V, power = V^2/R = 0.081 W, but at 100 Ω and 9 V, power = 0.81 W! The change of resistance with heating may be significant depending on the resistor used.

  • Measure the 1 kΩ resistor as accurately as you can = R_1k. Say you get 975 Ω, for example's sake.

  • Place the 1 kΩ and the 0.22 Ω resistors in series with the 9 V battery.

  • Measure the voltage across the 1 kΩ - say 8.85 V as an example.

  • Measure the voltage across the 0.22 Ω - say 1.7 mV as an example.

Then R_0.22 / R_1k = V_0.22 / V_1k

So R_0.22 = R_1k × V_0.22 / V_1k

Here R_0.22 = 975 × 0.0017 / 8.85 = 0.187 Ω.
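The same calculation in Python, using the example numbers above (the variable names are mine):

```python
# Ratio method: the same current flows through both resistors, so
# R_target / R_ref = V_target / V_ref. Values from the worked example.
R_ref = 975.0        # measured value of the nominal 1 kohm resistor, ohms
V_ref = 8.85         # volts across the reference resistor
V_target = 0.0017    # volts across the ~0.22 ohm target

R_target = R_ref * V_target / V_ref   # -> ~0.187 ohm
I_series = V_ref / R_ref              # series current, ~9.1 mA
print(f"R_target = {R_target:.3f} ohm at {I_series * 1000:.1f} mA")
```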

2
  • At least in this case (support.fluke.com/find-sales/Download/Asset/…), Fluke specifies that the accuracy is with respect to the meter reading rather than full scale.
    – B Pete
    Commented Aug 9, 2011 at 3:22
  • Use this method (R_0.22 = R_1k × V_0.22 / V_1k) in every case; I gave examples of the battery voltage, the large resistor value, and the measurements so that a worked example could be done. As I can't read your values from here, you will have to measure them for yourself :-). This will allow you to measure the target resistor with your present meter to tolerably good [tm] accuracy.
    – Russell McMahon
    Commented Aug 9, 2011 at 3:26
4

It's not just your meter; you have probes involved as well. These not only have resistance, but also mean that your circuit is completed through two locations where you are holding likely somewhat oxidized metal surfaces in contact.

Analog meters let you zero the scale with the probes shorted; common DMMs don't seem to let you do that.

Perhaps you could solder one of the resistors between two banana plugs and directly connect it to the DMM.

You could also try constructing a Wheatstone bridge, using either larger resistors on the reference side or perhaps a length of wire of known resistance per unit length, and using your meter to measure the imbalance current. Or you could try driving a substantial current through the test resistor (perhaps with a voltage source and a known resistor in series) and measuring the voltage drop across it.
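For what it's worth, a minimal sketch of the bridge balance condition (this assumes one common arm arrangement; the component values are purely illustrative):

```python
# Wheatstone bridge: R1 and R2 form the reference side, R3 is the
# adjustable arm in series with the unknown R_x. At balance (zero
# meter current), R_x / R3 = R2 / R1, so R_x = R3 * R2 / R1.
R1, R2 = 1000.0, 10.0   # 100:1 reference ratio
R3 = 22.0               # adjustable arm value found at balance
R_x = R3 * R2 / R1      # -> 0.22 ohm
print(f"R_x = {R_x:.2f} ohm")
```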

2
  • Thanks, I tinned the leads and tried extending them down into the Fluke (it was a nice chunky power resistor with long leads), but the resistance measured the same.
    – Mark Booth
    Commented Aug 9, 2011 at 9:56
  • \$\begingroup\$ Re "Analog meters let you zero the scale with the probes shorted, common DMM's don't seem to let you do that.": The 121GW has it, page 51: "If you are measuring something relative to another measurement or need to calibrate any non-zero error out you can press the REL button to zero, the currently displayed value.". But not the Fluke 175 (except perhaps for some very involved calibration procedure). \$\endgroup\$ Commented Apr 19, 2019 at 21:00
