Measuring the resistance of an LED -> getting conflicting values

I couldn't find a resistance value for an LED, so I tried to measure it myself and ended up much more confused than before:

I have a plain red LED. Initially I used a multimeter to measure its resistance, but it didn't show any value. I tried two multimeters with the same result, and I also tried connecting the LED the other way around, with no improvement. (I do know the LED isn't broken; it emits light in a circuit.)

So I went another route and added it to a series circuit to measure it from there. I have a 3 V battery and a 100 ohm resistor; if I measure the current with just those two, I get 30 mA, which checks out: $$ I = \frac{V}{R} = \frac{3}{100} = 0.03\,\text{A} = 30\,\text{mA} $$
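For reference, here is that same Ohm's-law arithmetic as a small Python sketch (the helper name `ohms_law_current` is just for illustration):

```python
def ohms_law_current(voltage_v: float, resistance_ohm: float) -> float:
    """Current through a plain resistor, I = V / R."""
    return voltage_v / resistance_ohm

# 3 V battery across the 100 ohm resistor, no LED in the loop yet
i = ohms_law_current(3.0, 100.0)
print(f"{i:.3f} A = {i * 1000:.0f} mA")  # prints: 0.030 A = 30 mA
```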

But when I add the LED to this circuit (also in series), I measure a current of about 11 mA, meaning $$ R = \frac{V}{I} = \frac{3}{0.011} \approx 273\,\Omega $$ That would be the total resistance, and since I know the resistor is 100 ohm, the LED should have a resistance of about 173 ohm. This seems like a lot; from reading online I usually see 75 given as the default value.

So to double-check, I swapped the 100 ohm resistor for a 200 ohm one, but that confused me a lot: if I measure the current without the LED I get 15 mA, which matches the calculation $$ I = \frac{V}{R} = \frac{3}{200} = 0.015\,\text{A} = 15\,\text{mA} $$

But when I add the LED, the current drops to about 6 mA. Calculating the resistance from there I get $$ R = \frac{V}{I} = \frac{3}{0.006} = 500\,\Omega $$ so the resistance of the same LED would now be 300 ohm.
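Here are both of those inferences side by side in the same sketch style, just to show the arithmetic I did (treating the whole loop as ohmic, which is exactly the step I'm unsure about):

```python
def apparent_led_resistance(v_supply: float, i_measured: float, r_series: float) -> float:
    """Treat the whole series loop as ohmic (R_total = V / I) and
    subtract the known series resistor to get the LED's share."""
    return v_supply / i_measured - r_series

print(apparent_led_resistance(3.0, 0.011, 100.0))  # ~172.7 ohm with the 100 ohm resistor
print(apparent_led_resistance(3.0, 0.006, 200.0))  # 300.0 ohm with the 200 ohm resistor
```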

I also tried another approach, measuring the voltage drops, and the results got even more confusing: the voltage dropped by 1.9 V across the LED and 1.1 V across the resistor. I got the same two numbers with both the 100 ohm and the 200 ohm resistor, and identical drops across two different resistors give me very different LED resistances when I apply Ohm's law.
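Spelling that third approach out the same way (these are my measured drops, rounded; the loop and names are again just illustrative):

```python
v_led = 1.9  # measured drop across the LED, in volts

for r_series in (100.0, 200.0):
    v_resistor = 1.1           # I measured the same drop for both resistors
    i = v_resistor / r_series  # current implied by the resistor's drop
    r_led = v_led / i          # apparent LED "resistance" at that current
    print(f"{r_series:.0f} ohm resistor -> I = {i * 1000:.1f} mA, "
          f"apparent LED resistance = {r_led:.0f} ohm")
```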

I know I'm doing something wrong, but I don't see where my mistake is. Am I messing up the measurement with the multimeter, am I misunderstanding the formulas, or am I getting something more fundamental wrong?

This is how I measured the current with the 100 ohm resistor: [image: current measurement with the 100 ohm resistor]. Thanks so much to anyone reading all of this!