\$\begingroup\$

I was trying to understand the usage for the word "current limiting resistor." Taking an example of a simple LED with a resistor in series, the way I understand this is we are trying to limit the amount of current entering into the LED.

I'll start with two situations.

Situation 1. An LED connected to a bench power supply. The supply is set to 3V with the output current limit set to 2A. If I connect an LED whose Vf is 3V to this supply, the LED draws its rated current and doesn't consume the full 2A.

Situation 2. The power supply is set to 5V and the maximum output current is again set to 2A. Connecting an LED directly would burn it out, because Vs is greater than Vf, so excessive current would flow through the LED. Whereas as soon as I add a resistor in series, we drop the excess voltage and thereby limit the current.

So with this understanding, can we say that the current limiting resistor is not only a current limiting but also a voltage dropping resistor? In other words, the resistor is primarily there to reduce the voltage fed to the LED, and the current drawn by the circuit is set by the total resistance, i.e. the series resistance plus the LED's effective resistance at its normal operating point.
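
For a concrete example of this (assuming, say, a typical 20 mA indicator LED): with Vs = 5V and Vf = 3V, a series resistor of R = (Vs - Vf) / I = (5 - 3) / 0.02 = 100 Ω would drop the excess 2V and set the current to 20 mA.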

\$\endgroup\$
  • \$\begingroup\$ that is exactly how a constant current power supply would keep the output current at a constant value, by adjusting the output voltage \$\endgroup\$
    – jsotola
    Commented Jan 18, 2021 at 21:11
  • \$\begingroup\$ It really doesn't matter whether you call the resistor a "voltage dropping" or a "current limiting" device. A resistor is going to do what a resistor does, which is follow Ohm's Law. \$\endgroup\$ Commented Jan 18, 2021 at 21:42

3 Answers

\$\begingroup\$

Two things. First, Vf is not a static number. Vf changes with If. Vf changes with temperature. Vf varies from one LED to the next, even if you do your comparison at the same temperature and same If.

Second, and what makes this particularly troublesome: a small change in Vf can cause a large change in If. So if you are building 10,000 devices and trying to drive each LED with a specific voltage, some of them will get too much current and fail in seconds, minutes, hours, or days; some will get too little current and be too dim; and maybe some might actually work OK for a long time (or maybe not).

So the basic idea with LEDs is that you must control the current rather than the voltage to get acceptable results over the long run and across production variation. So, in my opinion, the best terminology for the resistor in series with the LED is "current limiting resistor." It would also be OK to say "current setting resistor."

By placing a resistor in series with the LED you are approximating a current source (assuming the supply voltage is fixed; if the voltage varies, it may be better to use an actual current source of some sort). Hope that helps you think about this in the right way. You will surely find people (usually novices) who get away with driving LEDs without a current limiting resistor in certain circumstances, for various reasons. But it is best to have a well-defined limit that you choose as part of the design process, to avoid surprises.
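
To see the voltage sensitivity numerically, here is a minimal sketch using the Shockley diode equation, I = Is * (exp(V / (n * Vt)) - 1). The parameter values below (I_SAT, N_VT) are made-up, illustrative numbers chosen to put the curve near Vf = 3V at 20 mA; they are not the datasheet of any real LED:

```python
import math

# Illustrative diode-equation parameters (assumed values, not a real LED):
I_SAT = 1e-18  # saturation current, A
N_VT = 0.08    # ideality factor times thermal voltage, V

def led_current(vf):
    """Approximate LED forward current (A) at forward voltage vf (V)."""
    return I_SAT * (math.exp(vf / N_VT) - 1)

# A +/-0.1 V spread in Vf (well within production variation) swings
# the current by more than a factor of ten:
for vf in (2.9, 3.0, 3.1):
    print(f"Vf = {vf:.1f} V -> If = {led_current(vf) * 1e3:6.1f} mA")
```

By contrast, with a series resistor in place, the same +/-0.1 V shift in Vf changes the current by only about 0.1 V / Rs, a far gentler dependence.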

\$\endgroup\$
\$\begingroup\$

The reason we refer to it as a current limiting resistor is that an LED can experience thermal runaway and runaway current at constant voltage, due to its negative temperature coefficient. Even if an LED is driven at constant voltage, it is unwise to drive it without regulating the current. In the case of the resistor, the resistor has a positive temperature coefficient that balances out the negative coefficient of the LED: as the LED tries to go into thermal runaway and draw more current, that same current heats up the resistor, increasing its resistance and voltage drop and thereby limiting the current.

So the name "current limiting resistor" emphasizes the point of the part. Note that current limiting resistors on high-power LEDs would waste too much power to be viable, but the LED still needs current control, so constant current regulators are used instead of a resistor. In a quality constant current LED driver, the output voltage is adjusted as necessary to keep the output current constant.

A current limiting resistor is the cheapest and most primitive form of current protection for an LED. When one is not present in the circuit, sometimes you will find that this is because the manufacturer relied on the other limitations of the circuit (the series resistance of the batteries, for example) to provide current control.
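
As a rough illustration of how much power a series resistor wastes at these levels (all figures are assumed for the sake of the example: a hypothetical 700 mA power LED on a 12 V supply):

```python
V_SUPPLY = 12.0  # supply voltage, V (assumed)
V_F = 3.4        # LED forward voltage, V (assumed)
I_LED = 0.7      # target LED current, A (assumed)

rs = (V_SUPPLY - V_F) / I_LED            # required series resistance
p_resistor = (V_SUPPLY - V_F) * I_LED    # power burned in the resistor
p_led = V_F * I_LED                      # power delivered to the LED

print(f"Rs = {rs:.1f} ohm")
print(f"Resistor dissipates {p_resistor:.1f} W; LED receives {p_led:.1f} W")
```

Here roughly 70% of the input power ends up heating the resistor, which is why constant current drivers take over at these power levels.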

\$\endgroup\$
  • \$\begingroup\$ The temperature coefficient of a resistor isn't enough to have a noticeable effect on the current. It is linear, while the changing current through the LED is exponential. The resistor is sized so that the current is low enough that the LED will never get hot enough for thermal runaway to happen. \$\endgroup\$
    – JRE
    Commented Jan 19, 2021 at 8:59
  • \$\begingroup\$ @JRE The runaway current is on a curve, because the resistance of the LED continues to drop as the LED heats up due to the increased current. In my course we were taught that there is a minimum amount of "waste" voltage for the LED to achieve current regulation, and that this waste voltage can be calculated based on the wattage heating the resistor and its positive temperature coefficient. We were taught that if you have, for example, a 3.3V LED and an available 3.5V constant voltage source, it may not be enough to size a resistor to drop 0.2V and put it in series. \$\endgroup\$
    – K H
    Commented Jan 19, 2021 at 9:20
  • \$\begingroup\$ Minimum waste voltage refers specifically to the case of a current regulating resistor. If the available waste voltage is too small, especially for power LEDs, it's necessary to use a linear current regulator (or a switching regulator, if you think you can beat 94% efficiency, which I couldn't come close to). \$\endgroup\$
    – K H
    Commented Jan 19, 2021 at 9:29
\$\begingroup\$

Current limiting or voltage dropping? The answer is, yes. It's both.

The LED and series resistor (Rs) form a voltage divider.

With supply voltage (Vdd) applied:

  • IR drop across the LED is fixed (more or less) at the Vf threshold,
  • IR drop across the resistor Rs varies with supply, as (Vdd - Vf)

Since they're in series, the LED and resistor carry the same current, which scales with Vdd. Given a fixed Vdd, we can set the LED current by choosing Rs to achieve the right (Vdd - Vf) drop across it, as follows:

  • I(Rs) = I(led) = (Vdd - Vf) / Rs [LED and Rs are in series]

Rearranging for Rs:

  • Rs = (Vdd - Vf) / I(led)
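
As a worked example (values assumed for illustration): with Vdd = 5 V, Vf = 2 V, and a target LED current of 20 mA, Rs = (5 - 2) / 0.02 = 150 Ω.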

On the other hand, if the supply voltage Vdd varies, the LED will need an active circuit to ensure a constant current. There are specialty ICs for this purpose; this can also be done with a current source built from discrete components.

\$\endgroup\$
