\$\begingroup\$

I am currently attempting to build a DIY driver board to power the backlight of an LM270WQ1 panel.

Per the datasheet, there are six LED strips, each of which takes 36V @ 350mA.

The adapter board that provides HDMI->eDP comes with an LED driver board, but it only provides up to ~32V @ 350mA, so the panel is quite dim even at maximum brightness (around 150 nits by my estimate, compared to a rated maximum of 420 nits).

I am interested in an efficient way to build a current-limiting circuit that provides 350mA to each of the six LED strips in parallel. The easiest solution seems to be to simply put a ~103 ohm resistor in series with each strip, but per my math that would lead to ~75W of heat loss (edit: this is incorrect).

Admittedly, I am new to the world of circuit design, so I may be misinterpreting something.

I do not need support for PWM dimming, as I plan to use the panel at max brightness. I am mainly interested in a balance of efficiency and cost (ideally under $20 or so). I can make the board and solder myself. I already have a boost converter that can boost the 12V for the stock LED board up to 36V.

I have attached the relevant datasheet portions below. Thanks for looking.

LED Datasheet

LED Pinout

\$\endgroup\$
  • \$\begingroup\$ The most cost efficient would be to just buy one. Otherwise there is any number of led driver ics that would fit this standard setup. Look at TI's led driver options. Many include the boost circuit as well. \$\endgroup\$
    – Passerby
    Commented Oct 7, 2022 at 2:57
  • \$\begingroup\$ Buy an external LED driver which fits your needs and interface it to your controller? \$\endgroup\$
    – winny
    Commented Oct 7, 2022 at 9:20
  • \$\begingroup\$ My main question/problem in that case is that there are six independent LED channels. If I power them off one 36V 2.1A controller then one strip could potentially be over-driven, so I'd need six 350mA controllers (or one 6-channel controller), which adds up quickly in cost and size. I may look into one of those TI LED drivers, thank you. \$\endgroup\$ Commented Oct 7, 2022 at 13:10
  • \$\begingroup\$ Also, I don't mind just having a fixed voltage, current limited channel, as I think that will work to protect the LEDs, but I'm not sure if there's a way to do that efficiently. \$\endgroup\$ Commented Oct 7, 2022 at 13:20
  • \$\begingroup\$ Your resistor calculation must be wrong. Those 36V are obviously very close to the desired LED voltage, so you'll only need to drop a few volts across the resistor. The power dissipation at 350mA will be around 1W, quite doable and certainly the cheapest solution (if one resistor can't stand that power, just use two or more). \$\endgroup\$
    – Sim Son
    Commented Oct 7, 2022 at 16:49

1 Answer

\$\begingroup\$

Your calculation for the resistors needed is incorrect.

To limit the current in an LED string you need to know three things: the voltage across the string, the supply voltage, and the current the string takes.

The resistor needs to drop the difference between the supply voltage and the LED voltage at the LED current. To be able to regulate current you need a supply voltage higher than the required load voltage, so there is some headroom. For example, to get 36 V at the load you might use a 40 V supply; to find the resistance needed you use the difference in voltages:

$$R = \frac{40V-36V}{0.35A} = 11.428\Omega$$

And the power dissipated in the resistor would be: $$ P = (0.35A)^2\times11.428\Omega = 1.4W $$
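The same resistor sizing can be sketched in a few lines of Python (the 40 V supply and 36 V string values are from the worked example above):

```python
# Ballast resistor sizing for an LED string (values from the worked example).
V_SUPPLY = 40.0   # supply voltage, V
V_LED = 36.0      # LED string forward voltage, V
I_LED = 0.35      # target string current, A

# The resistor drops the difference between supply and string voltage.
R = (V_SUPPLY - V_LED) / I_LED   # ohms
P = I_LED ** 2 * R               # watts dissipated in the resistor

print(f"R = {R:.2f} ohm, P = {P:.2f} W")  # R = 11.43 ohm, P = 1.40 W
```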

The value you calculated was the power the LED strings would dissipate, \$36V\times0.35A=12.6W\$ per string.

A resistor by itself won't actually 'limit' current to a specific value unless the resistance is a significant portion of the total load.

Let's look at what happens if we use a 40 V supply and 12\$\Omega\$ resistors. We'll look at the current for 3 strings, one with the specced 36 V drop, and two that are a bit off, 35 V and 37 V. Ideally you want the resistor to equalize the current through each string so they have the same brightness.

@35 V \$I = (40V-35V)/12\Omega = 417mA\$
@36 V \$I = (40V-36V)/12\Omega = 333mA\$
@37 V \$I = (40V-37V)/12\Omega = 250mA\$

Hardly the results we want. This is because the LED strip impedance is around 103\$\Omega\$ so 12\$\Omega\$ is a small fraction of the total.

If we upped the resistor values to 120\$\Omega\$ it will be more than half the load, but at 350mA they're going to drop much more voltage, around 42 V. That makes the required supply voltage \$36V + 42V = 78V\$.

Using that in our previous calculations we get:

@35 V \$I = (78V-35V)/120\Omega = 358mA\$
@36 V \$I = (78V-36V)/120\Omega = 350mA\$
@37 V \$I = (78V-37V)/120\Omega = 342mA\$

This is much better current regulation, but at the expense of a much higher supply voltage and ~15 W dissipation per resistor.
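The trade-off between the two scenarios above can be seen with a quick sketch (string voltages and resistor values as in the text):

```python
# Current spread across LED strings for the two ballast-resistor scenarios above.
def string_current(v_supply, v_string, r_ballast):
    """Current through one string: the resistor drops the remaining voltage."""
    return (v_supply - v_string) / r_ballast

# (supply voltage, ballast resistance) for the 12 ohm and 120 ohm cases
for v_supply, r in [(40.0, 12.0), (78.0, 120.0)]:
    currents = [string_current(v_supply, v_str, r) for v_str in (35.0, 36.0, 37.0)]
    spread = max(currents) - min(currents)
    print(f"{v_supply:.0f} V / {r:.0f} ohm: "
          + ", ".join(f"{i * 1000:.0f} mA" for i in currents)
          + f" (spread {spread * 1000:.0f} mA)")
```

The 12 ohm case shows a spread of about 167 mA between strings; the 120 ohm case narrows that to about 17 mA, at the cost of a 78 V supply and ~15 W per resistor.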

Resistor current balancing for LEDs works pretty well when dealing with single low current LEDs such as in a 7 segment display, where the LED impedance is around 70\$\Omega\$ and you use 180\$\Omega\$ resistors @ 20mA from a 5 V supply. For high current LED strings, not so much.

A better method than a simple resistor is a current regulator or constant-current source (CCS). This can be made with a couple of transistors, or, for more control, an op-amp reading the voltage across a sense resistor and driving a transistor. You can see this Wikipedia page to get some ideas of how to do that.

Here's a simulation of a basic current-limiting circuit, just to give an idea of how it works. I'm not guaranteeing it will work in practice without some modification. (Images: LED CCS schematic and simulation plot.)

The voltage across each 5.6\$\Omega\$ resistor is held constant by the voltage reference at approximately \$2.5V - V_{be}\$. Since the voltage and resistance are constant the current must be constant too to satisfy Ohm's Law. The voltage sources V2 to V4 represent 3 LED strips with the maximum range of voltage drop from the datasheet. The supply voltage is swept from 44 V to 52 V. You can see that the currents stay pretty close to 350 mA. 48 V supplies are common, so one of those could be used to power it. The transistors would need to be on appropriate heat sinks.
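As a rough check of the set point described above (the 2.5 V reference is from the simulation; the base-emitter drop of roughly 0.6 V is my assumption, not a value from the answer):

```python
# Rough set-point estimate for the transistor CCS described above.
V_REF = 2.5      # reference voltage from the simulation, V
V_BE = 0.6       # ASSUMED base-emitter drop of the pass transistor, V
R_SENSE = 5.6    # sense resistor from the simulation, ohms

# The reference holds V_REF - V_BE across the sense resistor,
# which fixes the string current by Ohm's law.
i_set = (V_REF - V_BE) / R_SENSE
print(f"I_set ~ {i_set * 1000:.0f} mA")  # in the neighborhood of the 350 mA target
```

The exact current depends on the real \$V_{be}\$ at the operating point, which is why the simulated currents sit close to, but not exactly at, 350 mA.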

\$\endgroup\$
  • \$\begingroup\$ Thank you for this, very useful information. I will look into these transistor-based solutions. I do have a question though: if the 36V LED strip has an impedance of 103 ohms, why can't I just attach 36V to it and per V=IR get 350mA current? I know it will burn out, but I'm not sure how LEDs work scientifically. I thought LEDs had negligible impedance, but this doesn't seem to be the case. \$\endgroup\$ Commented Oct 7, 2022 at 20:35
  • \$\begingroup\$ An LED has a forward voltage drop, putting many in series adds these drops. For your strips they add up to ~36V. The problem is that LEDs are very sensitive to voltage, a little bit higher and they draw more current, a little bit lower they draw less. If you get one strip that's a bit low (the spec sheet says as low as 31V) it can draw excessive current and burn out. This is if they are just LEDs wired in series with no built in resistors or current limiting (I don't know if yours would have that). When I talk about their impedance I mean the equivalent resistance that would draw 350mA @ 36V. \$\endgroup\$
    – GodJihyo
    Commented Oct 7, 2022 at 21:06
  • \$\begingroup\$ Thank you, I figured that might be the case. I don't believe there are any built-in resistors or current limiting, but I will check whether there are. \$\endgroup\$ Commented Oct 7, 2022 at 21:14
  • \$\begingroup\$ @TheForgottenKing I added a circuit to demonstrate one possible way of doing it. \$\endgroup\$
    – GodJihyo
    Commented Oct 7, 2022 at 21:33
  • \$\begingroup\$ Thank you, I will try to design something like that when I can and report back. \$\endgroup\$ Commented Oct 8, 2022 at 3:23
