0
\$\begingroup\$

I am wondering why 12V is the standard voltage for LED strips (RGB/W) when each SMD LED, whether it be 3535 or 5050, is usually rated for ~3V.

A lot of energy is wasted as heat through the SMD resistors for each LED.

Wouldn't it be beneficial if LED strips/drivers were 3V?

\$\endgroup\$
1
  • \$\begingroup\$ Note that 5V strips are also widely available. 12 is common but not completely universal. \$\endgroup\$ Commented Oct 12, 2019 at 19:00

5 Answers

7
\$\begingroup\$

12V might be the de facto standard because of the 12V car batteries/electrical systems that were around years before LED strips.

Of course no one is driving a single 3V LED with 12V and a resistor directly. That would be too much energy loss.

Since there exist DC/DC converters which can transform one voltage to another with good efficiency, it does not matter so much what voltage is used as the primary source.
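
As a rough back-of-the-envelope illustration of that point (the Vf of 3.0V, 20mA current and 90% converter efficiency are assumed figures, not from this answer):

```python
# One LED driven from 12 V: series resistor vs. DC/DC (buck) converter.
# Assumed values: Vf = 3.0 V, 20 mA, 90% converter efficiency.
VF, I, V_IN, EFF_BUCK = 3.0, 0.020, 12.0, 0.90

p_led = VF * I                   # power delivered to the LED itself
p_resistor_drive = V_IN * I      # resistor scheme draws the full 20 mA from 12 V
p_buck_drive = p_led / EFF_BUCK  # buck draws only the LED power plus its own losses

print(f"LED power:      {p_led * 1000:.0f} mW")
print(f"Resistor drive: {p_resistor_drive * 1000:.0f} mW drawn ({p_led / p_resistor_drive:.0%} efficient)")
print(f"Buck drive:     {p_buck_drive * 1000:.0f} mW drawn ({p_led / p_buck_drive:.0%} efficient)")
```

With those numbers the resistor scheme is only about 25% efficient for a single LED, which is why no one does it that way.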

Another thing is that you can put the LEDs in series until you get the required voltage (4x 3V = 12V).

\$\endgroup\$
2
  • 1
    \$\begingroup\$ "... no one is driving the 3V led with 12V and a resistor directly. That would be too much loss. Since there exist DC/DC converters ..." - Are you saying LED strips are using switch-mode converters to power the LEDs from 12V? Because all LED strips I've seen do use resistors to limit the current, although often they'll have several LEDs in series to use a better proportion of the 12V supply. \$\endgroup\$
    – marcelm
    Commented Jan 15, 2020 at 22:07
  • \$\begingroup\$ I was referring to one single LED. I will update my answer accordingly... \$\endgroup\$ Commented Jan 16, 2020 at 11:18
6
\$\begingroup\$

There are lots of issues with using 3V directly. For starters, the forward voltage of an LED is related to the wavelength of light it produces, ranging from about 2V or less for red to about 3.6V or more for blue. So no single voltage is going to work for all three colours, and devices that run from a single supply would have to be given enough voltage for the blue LED and regulate it down for the others anyway.
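
For a rough sense of those numbers (the forward voltages below are assumed ballpark figures; they vary by part and current), a quick sketch:

```python
# Typical LED forward voltages by colour (assumed ballpark values).
typical_vf = {"red": 2.0, "green": 3.1, "blue": 3.3, "white": 3.1}

# Even a rail sized for the blue LED leaves a lot to drop for the red one.
v_rail = 3.6
for colour, vf in typical_vf.items():
    drop = v_rail - vf
    print(f"{colour:>5}: Vf ~ {vf} V, {drop:.1f} V still has to be dropped somewhere")
```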

Second, LED current varies drastically with supply voltage and temperature, so trying to use exactly the right voltage supply and trying to keep it correct through wiring, connectors, switches and controllers is a non-starter, and on top of that, variations between individual devices would produce uneven illumination. So instead, they're driven with constant current and that requires a higher supply voltage that can be regulated down as needed to cope with the variations.

So given that a higher voltage is needed, it's just a case of choosing an appropriate one. 6V or 9V could work, but 12V is a widely used standard, especially in the automotive world; it was already in common use for low-voltage lighting, and it allows plenty of headroom for voltage drops in distribution and regulation. And if the regulation is done using buck converters, the higher voltage allows for lower supply current and reduces the power loss through distribution still further.
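
A first-order sketch of that distribution argument, with assumed numbers (24W of strip, 0.5Ω of round-trip wiring, and ignoring how the drop itself changes the load current):

```python
# Same LED load power, same wiring, different supply voltages.
# Assumed values: 24 W load, 0.5 ohm round-trip wiring resistance.
P_LOAD = 24.0
R_WIRE = 0.5

for v in (5.0, 12.0, 24.0):
    i = P_LOAD / v              # supply current needed for that load power
    p_wire = i ** 2 * R_WIRE    # I^2 R loss in the wiring
    print(f"{v:4.0f} V supply: {i:4.1f} A, {p_wire:5.2f} W lost in the wiring "
          f"({p_wire / (P_LOAD + p_wire):.0%} of the input)")
```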

\$\endgroup\$
3
\$\begingroup\$

In a simple configuration like this, there is a tradeoff between efficiency and accuracy. The forward voltage of each LED is slightly different, and it changes with temperature.

If you were to run the LED strip at 3V, all LEDs would have to be in parallel, with no individual current limiting for each LED. That would lead to uneven lighting, and probably thermal runaway and failing LEDs.
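
To see why the current sharing goes wrong, here is a minimal sketch using an idealized exponential diode model (the effective n·Vt of 50 mV is an assumption, not a figure from this answer):

```python
import math

# With an exponential diode model, current depends exponentially on forward voltage.
N_VT = 0.050   # assumed ideality factor times thermal voltage, in volts

# Two LEDs from the same reel, forward voltages only 60 mV apart at rated
# current, wired directly in parallel across one fixed 3 V rail:
delta_vf = 0.060
ratio = math.exp(delta_vf / N_VT)
print(f"The lower-Vf LED carries roughly {ratio:.1f}x the current of the other one")
```

And since the LED that hogs current heats up and its Vf drops further, the imbalance only grows, which is the thermal runaway mechanism mentioned above.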

One alternative is to have a resistor for each LED, but that gets more costly. You would also still need to supply a little more than the forward voltage of the LED in order for current to flow.

A cheap and simple solution to this is to group the LEDs in series. With 3 x 3V LEDs in each group, you can have a common current limiting resistor for the group.

Burning about a quarter of the energy in the current limiting resistor seems to be the preferred "industry standard" tradeoff for cheap LED strips like this. Note that if the power supply sags under load and supplies less than 12V, the efficiency of the LED strip will increase slightly, although that may not necessarily be true for the PSU.

It would be more efficient to use a smaller resistor and run the LED strip at, say, 9.5V instead of 12V, but again, that would give you less accurate control of the current.
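
As a rough illustration of that tradeoff (assumed values: 3 LEDs per group at Vf of 3.1V each, and the resistor sized for 20mA in both cases):

```python
# Efficiency vs. accuracy for a resistor-limited group of 3 LEDs.
N, VF, I = 3, 3.1, 0.020
V_STRING = N * VF   # 9.3 V across the LEDs

for v_supply in (12.0, 9.5):
    r = (v_supply - V_STRING) / I            # resistor value for 20 mA
    loss = (v_supply - V_STRING) / v_supply  # fraction of input power burned in R
    di = 0.1 / r                             # current shift if the string Vf moves 0.1 V
    print(f"{v_supply:4.1f} V: R = {r:5.1f} ohm, {loss:.0%} lost in R, "
          f"a 0.1 V Vf shift moves the current by {di * 1000:.1f} mA ({di / I:.0%})")
```

The lower-voltage design wastes far less in the resistor, but a small shift in forward voltage (from temperature or part spread) now swings the current wildly, which is exactly the accuracy problem described above.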

I have myself run 12V LED strips at 9.5V, with the original resistors, and they still produce somewhat even light, albeit at reduced light intensity. I have then compensated by adding more LED strips. It is slightly more power efficient, but whether it is cost-effective or practical is a different question.

\$\endgroup\$
3
\$\begingroup\$

I am wondering why 12V is the standard voltage for LED strips (RGB/W) when each SMD LED whether it be 3535 or 5050 is usually rated for ~3V.

This will be a bit long but it turns out that this standard is a natural fit for the use case of strips.

12V white LED strips use groups of 3 LEDs and a resistor in series. With 3-3.4V on each LED, this leaves 1.8-3V on the resistor, wasting 15-25% of total power. 24V strips use groups of 6 LEDs instead of 3, and the power lost in the resistors is the same; however, the voltage drop in the copper of the flex PCB strip is much lower because the current is halved relative to 12V.
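
The arithmetic behind those percentages, as a quick sketch (the Vf range per LED is assumed to match the 3-3.4V quoted above):

```python
# Resistor loss fraction for a 12 V strip (groups of 3) vs. a 24 V strip (groups of 6).
for v_supply, n_per_group in ((12.0, 3), (24.0, 6)):
    for vf in (3.0, 3.4):
        v_resistor = v_supply - n_per_group * vf
        print(f"{v_supply:4.0f} V strip, Vf = {vf} V: {v_resistor:3.1f} V on the "
              f"resistor, {v_resistor / v_supply:.0%} of the power wasted")

# Same wasted fraction either way, but for the same watts per metre the 24 V
# strip carries half the current, so the I^2*R loss in the copper traces is a
# quarter of the 12 V strip's.
```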

Now, why 12V or 24V? It's a compromise between convenience, cost, standardization, and efficiency.

There are two ways to make LED lights.

First, you can use a small number of high-power LEDs, or a COB LED, to make a point light source: a spotlight with a nicely controlled beam. However, all the heat comes from a small area, which complicates thermal management. In standard household lightbulbs there is very little space, and manufacturing cost is kept to the absolute minimum, so the heat sink is tiny and inefficient and LED temperatures can reach 80-100°C.

The second option is to use lots of low-power LEDs over a large area to make a diffuse light, like strips, LED tubes, or LED panels. This makes thermal management much easier, since the heat is generated over a large area and there is also a large contact area with ambient air for cooling. This has implications for efficiency: a cool LED is more efficient than a hot one, so compared to a cheap lightbulb running very hot, this makes up for part of the power lost in the resistors of a strip.

Then... how to drive these LEDs?

In the first case, a small number of LEDs can be wired in series and driven by a constant-current AC-DC switcher. But for a diffuse light with perhaps 50-100 LEDs or more, this is not that user-friendly. There are AC-DC switching constant-current drivers designed to output high voltages like 200 volts, but these are absolutely not DIY-friendly for obvious reasons: the LEDs then have dangerous voltage on them and must be isolated from fingers, etc. This type of driver targets a completely different customer (i.e., industrial light manufacturers) than LED strips, which aim at DIYers, interior decoration, etc. An LED assembly inside a T8 LED tube can be at high voltage since it is inside an insulated enclosure. Strips you can glue under your kitchen cabinets in DIY aluminium profiles have to be low voltage!

So how do you drive lots of LEDs without high voltage? With series-parallel groups. However, these need a way to balance the current. Resistors can be reduced or omitted if the LEDs are binned by Vf and kept at the same temperature, which results in arrangements like Zhaga. You're supposed to drive those with a constant-current driver with an output voltage around 40V. It's a metal-core PCB, so pretty good for thermal management and high output, and if you mount it on an aluminium heat sink the LEDs will stay at reasonably the same temperature, so current-sharing resistors can be omitted. That makes slick linear lights, but it won't work for interior decoration or under kitchen cabinets; it needs a non-standard power supply, and it is not flexible...

If we want a cheap flexible strip for the non-specialist market, to let people and interior designers make cool stuff with, we're running out of options pretty quickly.

The LEDs won't be binned, the temperature won't be uniform, and there will be voltage drop along the length of the strip, so there has to be a current-setting device for each series group of LEDs and the whole strip has to be voltage-driven. This means a standard voltage, so 12V or 24V to keep costs down.

So, as we've seen, the strip format pretty much forces constant-voltage drive; the only choice left is what device sets the current. It can be a resistor, which is cheap.

It can also be a constant-current linear driver. This will waste voltage as heat, but... on a 24V strip, if you have a current regulator with low voltage headroom, you can put 7 LEDs in series (7 × 3.2V = 22.4V) instead of 6, which roughly halves the losses: pretty good at only 7-12% now. However, it is a bit more expensive, and it prevents the use of flicker-free voltage/current-regulated dimming; PWM dimming has to be used instead, so if we want it flicker-free it has to be high-frequency PWM. So, a compromise on convenience, but on the other hand it won't care about voltage drop.
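
A quick check of those loss figures (the Vf spread of 3.0-3.2V per LED is an assumption; the 3.2V figure is the one used above):

```python
# 6 vs. 7 LEDs per group on a 24 V strip with a linear constant-current
# regulator per group. Whatever voltage is left over is dropped in the regulator.
V_SUPPLY = 24.0
for n in (6, 7):
    for vf in (3.0, 3.2):
        headroom = V_SUPPLY - n * vf   # voltage left across the regulator
        print(f"{n} LEDs, Vf = {vf} V: {headroom:3.1f} V on the regulator, "
              f"{headroom / V_SUPPLY:5.1%} lost there")
```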

But are we going to use a switching constant-current driver for even more efficiency? Hell no! With 7 LEDs per group, losses are already pretty low. With a constant-current LED strip it would be a much cheaper and better choice to adjust the supply voltage down so the current regulators have just enough headroom to work, and you'll get very low losses.

A 120 LEDs/m strip using resistors, with excellent CRI95 LEDs, costs less than USD 6/meter, and much less if you accept garbage CRI. This means each group of 7 LEDs costs about 33 cents. Adding a switching constant-current driver for each LED series string would double the price of the strip, and no one would buy it. It would also probably not work with high-frequency PWM drive and would be an EMI nightmare, for no efficiency gain relative to a linear constant-current driver.

So we have either resistors or linear current regulators.

And if you worry about efficiency...

A GU10 LED spot gets about 70 lm/W because it runs extremely hot and the power supply is too tiny to be optimized for efficiency. A standard bulb will be about 100 lm/W for the same reasons. A strip can have 100 lm/W efficiency too with efficient LEDs and a decent power supply, and everything runs much cooler, so it will last longer.

\$\endgroup\$
2
  • \$\begingroup\$ Excellent answer, but where are you buying high CRI strips for "USD 6/meter"? I'd like to get in on that. I'm seeing prices that are double yours (though I'm only looking at strips that have a photometric report. Advertised specs tend to be lies) \$\endgroup\$
    – Navin
    Commented Apr 12, 2021 at 5:16
  • \$\begingroup\$ @Navin These strips are excellent, independently tested by a guy with a spectrophotometer on budgetlightforum.com, who confirmed they deliver the promised specs. \$\endgroup\$
    – bobflux
    Commented Apr 12, 2021 at 6:50
0
\$\begingroup\$

One more reason worth pointing out is that the strips themselves are quite resistive (being "thin" strips of copper) and are usually run in long lengths.

That causes a considerable voltage drop, so with 12V you have plenty of margin for it. Granted, you can see the voltage drop on very long runs (i.e. brightness or color shift), but it's better than simply not having enough voltage to work after a few meters.
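
A rough one-end-fed model with assumed numbers (5m of strip, 0.3Ω/m of combined trace resistance, about 0.4A drawn per metre; none of these figures come from this answer):

```python
# Voltage along a strip fed from one end, with a roughly uniform load.
# Assumed values: 5 m strip, 0.3 ohm/m (+12 V and GND traces combined), 0.4 A/m.
LENGTH_M = 5
R_PER_M = 0.3
I_PER_M = 0.4
V_IN = 12.0

v = V_IN
for m in range(LENGTH_M):
    # everything beyond this metre still has to be fed through it
    i_through = I_PER_M * (LENGTH_M - m)
    v -= i_through * R_PER_M
    print(f"end of metre {m + 1}: {v:5.2f} V")
```

With these numbers the far end still sees roughly 10V, enough to keep the 3-LED groups lit (if dimmer), which is the margin a 12V supply buys you; a supply barely above the string voltage would leave nothing after a few metres.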

\$\endgroup\$
