2
\$\begingroup\$

I have some extra 1N4007 diodes, and I was wondering: instead of using a power supply to drop the voltage, could I power LEDs by just building a full-wave rectifier, using a capacitor to smooth out the power, and driving n LEDs in series?

Here is the spec sheet: http://www.jameco.com/Jameco/Products/ProdDS/2095171.pdf

These LEDs are rated to go as high as 2.6 V (red) or down to 1.9 V. Could I create a series string of 120 / 2.1 ≈ 57 of them, say 58? Even with the line voltage going as high as 128 V, that's still about 2.2 V per LED, well within spec.

This would mean that each LED would be less bright than maximum but would run cooler. Would that result in less efficiency? From the spec sheet, it looks like within a fairly large range, light output is pretty linear.

The current for this would be somewhere in the 20-30 mA range. Aside from the potential for one failed LED in the series to kill the whole light, is this a reasonable hack? It seems to me that it's at least very efficient. The only losses I can see in this circuit would be in the diodes. I should see perhaps a 1 V drop across each 1N4007?
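Here is a rough sanity check of those numbers (the 2.1 V per LED, 25 mA current, and ~1 V per 1N4007 are assumed figures, not measurements):

```python
# Rough sanity check of the string sizing and the rectifier losses.
# Assumed values: 2.1 V per red LED, 25 mA string current, ~1 V drop
# per 1N4007 (two diodes conduct at any instant in a full-wave bridge).

VF = 2.1            # assumed forward voltage per LED
N_LEDS = 58         # proposed string length
I = 0.025           # middle of the 20-30 mA range
V_BRIDGE = 2 * 1.0  # two conducting diodes in the bridge

per_led_high_line = 128.0 / N_LEDS        # ~2.21 V, still under the 2.6 V max
p_leds = N_LEDS * VF * I                  # ~3.0 W delivered to the LEDs
p_bridge = V_BRIDGE * I                   # ~0.05 W lost in the bridge diodes

print(f"{per_led_high_line:.2f} V per LED at a 128 V line")
print(f"bridge loss: {100 * p_bridge / (p_leds + p_bridge):.1f}% of the total")
```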

Is there anything more efficient I can do?

\$\endgroup\$

4 Answers

3
\$\begingroup\$

What you have just described is how most, if not all, cheap LED Christmas lights work.

A diode bridge rectifier, a capacitor, and a few parallel strings of multiple LEDs in series. Maybe a fuse. That's literally it.
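If you do include the smoothing capacitor, a rough sizing rule is C ≈ I / (f_ripple · ΔV). A minimal sketch, assuming a 25 mA string and 10 V of allowed ripple (neither figure is from the post above):

```python
# Rough smoothing-capacitor sizing for a bridge-rectified LED string.
# Assumed: 25 mA string current, 120 Hz ripple (full-wave rectified
# 60 Hz mains), and 10 V of allowed peak-to-peak ripple.

I = 0.025         # string current in amps
F_RIPPLE = 120.0  # ripple frequency in Hz
DV = 10.0         # allowed ripple in volts

c = I / (F_RIPPLE * DV)              # C ~ I * dt / dV with dt ~ 1/f
print(f"C ~= {c * 1e6:.0f} uF")      # about 21 uF
```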

See http://www.ciphersbyritter.com/RADELECT/LITES/LEDLITES.HTM for a full primer.

\$\endgroup\$
  • \$\begingroup\$ Most of them skip the capacitor, and as a result, they flicker like crazy. \$\endgroup\$
    – Dave Tweed
    Commented Apr 17, 2014 at 0:56
  • \$\begingroup\$ I noticed that my LED bulbs from CREE get very hot at the base. Obviously their white modules are the latest thing, much more efficient than my LEDs, but why wouldn't a bigger commercial light do this to avoid dissipating so much heat? It seems to me that at least 50% of the energy is being wasted in the circuitry designed to protect the LEDs in these compact bulbs. \$\endgroup\$
    – Dov
    Commented Apr 17, 2014 at 1:30
2
\$\begingroup\$

Your hack might work. Then again, you may kill some LEDs. The problem is called "thermal runaway". Look at the data sheet: as the LED gets hotter, its forward voltage drops. At a fixed voltage, that means the current goes up (and it goes up fast), which means the LED dissipates more power and gets hotter still, which drops the forward voltage further, and so on. That's why you need a resistor or a controlled current source. As described, with no resistor, I suspect you'd kill some LEDs. Of course, one advantage of putting LEDs in series is that, with a little luck, you'd only kill one at a time.
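A minimal numeric sketch of that feedback loop (the tempco, ideality factor, and thermal resistance below are made-up illustrative values, not datasheet figures):

```python
# Illustrative sketch of thermal runaway in a voltage-driven LED with no
# series resistance. All constants are assumed values, not datasheet figures.
import math

VT = 0.026        # thermal voltage at room temperature, ~26 mV
N = 2.0           # assumed ideality factor
I_NOM = 0.020     # 20 mA at the nominal operating point
VF_NOM = 2.1      # assumed forward voltage at 20 mA and 25 C
TC = -0.002       # assumed Vf temperature coefficient, -2 mV/C
R_TH = 400.0      # assumed junction-to-ambient thermal resistance, C/W
T_AMB = 25.0

v_applied = 2.1   # stiff supply set to exactly the nominal Vf, no resistor
t_j = T_AMB
for step in range(6):
    vf = VF_NOM + TC * (t_j - T_AMB)                   # Vf falls as the die heats
    i = I_NOM * math.exp((v_applied - vf) / (N * VT))  # exponential I-V law
    t_j = T_AMB + R_TH * v_applied * i                 # hotter junction next pass
    print(f"pass {step}: Tj = {t_j:6.1f} C, I = {i * 1000:7.1f} mA")
```

The iteration diverges within a few passes; with even a small series resistor, the extra voltage drop pushes back against the rising current and the loop settles instead of running away.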

\$\endgroup\$
1
\$\begingroup\$

I've built some ornaments that run directly off of AC. Basically, you need a full-wave rectifier (unless you want a lot of flicker), the LEDs, and then a dropping resistor (or resistors) to handle the current. You don't want just a diode for that; you need something that actually sets the current.

Here's how to do it:

  1. Take 110 volts and divide it by the forward voltage of the LEDs, then drop any fraction; that's your LED count.
  2. Multiply the number of LEDs by the forward voltage to get the total voltage dropped by the LEDs.
  3. Subtract this number from 120 volts. This is the voltage that you need to drop in the resistor.
  4. Figure out the current you want from the datasheet.
  5. Figure out the resistance you need, using V=IR ==> R = V / I. If you put this resistance in line with the series LEDs, you will get the proper current.
  6. Figure out the power dissipated by the resistor, using P = VI. You will likely have to use multiple resistors so that you don't exceed the wattage rating of any one of them. (A worked sketch of these steps follows this list.)
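A worked sketch of the six steps, assuming 2.1 V red LEDs and a 20 mA target (both numbers are assumptions, not from this answer):

```python
# Worked example of the six steps above. The 2.1 V forward voltage and
# 20 mA target current are assumed values for a generic red LED.

LINE_V = 120.0     # nominal RMS line voltage
BUDGET_V = 110.0   # step 1: leave ~10 V of headroom for the resistor
VF = 2.1           # assumed LED forward voltage
I = 0.020          # step 4: assumed target current, 20 mA

n_leds = int(BUDGET_V // VF)     # step 1: 110 / 2.1 -> 52 LEDs
v_leds = n_leds * VF             # step 2: total LED drop, ~109.2 V
v_res = LINE_V - v_leds          # step 3: ~10.8 V across the resistor
r = v_res / I                    # step 5: R = V / I, ~540 ohms
p = v_res * I                    # step 6: ~0.22 W, split across resistors
print(f"{n_leds} LEDs, R = {r:.0f} ohm, P = {p:.2f} W")
```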

The reason for using 10 volts for the resistor instead of making it as small as possible is to make the current more constant. If the resistor is small, the current gets much more sensitive to manufacturing differences in the forward voltage of the LEDs and the actual line voltage.
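To see how much that 10 V margin buys, compare the current swing for a 2% shift in the total LED drop with a 10 V budget versus a 2 V budget (same assumed 2.1 V LEDs and 20 mA nominal current as above):

```python
# Sensitivity comparison: how much the string current moves when the total
# LED forward drop comes out 2% higher than expected. Assumed 120 V line.

def string_current(v_line, v_leds, r):
    return (v_line - v_leds) / r

cases = [
    ("10 V budget: 52 LEDs, 540 ohm", 109.2, 540.0),
    (" 2 V budget: 56 LEDs, 120 ohm", 117.6, 120.0),
]
for label, v_leds, r in cases:
    i_nom = string_current(120.0, v_leds, r)          # 20 mA in both cases
    i_off = string_current(120.0, v_leds * 1.02, r)   # Vf runs 2% high
    print(f"{label}: {i_nom*1e3:.1f} mA nominal, {i_off*1e3:.1f} mA at +2% Vf")
```

With the small budget, the current essentially collapses (and it would spike just as hard if the line ran high or the LEDs ran low), which is why roughly 10 V is left for the resistor.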

Oh, and obviously, you need to do a very good job of insulating these things, because you're dealing with possibly fatal voltage.

\$\endgroup\$
  • \$\begingroup\$ What's wrong with the plan I described, given that the max voltage for each LED is much higher? If the voltage went to 130 V, would the current go too high? I purposely derated to avoid needing any resistor. \$\endgroup\$
    – Dov
    Commented Apr 17, 2014 at 4:54
0
\$\begingroup\$

I bought some LED icicle-type strings powered by a controller. I reverse engineered the controller, and basically it was a full-wave bridge rectifier turning 120 V AC into DC. Some gating ICs controlled thyristors that pulsed two separate 75-LED series strings from the unfiltered 120 V DC. I saw no fuse or current-limiting resistor, so it's probably not the safest design should something fail by shorting out.

I did not like the LEDs being pulsed in any manner, so I removed the thyristors and replaced them with 1200 ohm, 1 watt resistors. This limited the current in the strings to about 10 mA and produced decent light at night without the annoying pulsating. It has been operating for several days now with no issues. I'm thinking of adding a 1/8 watt, 1 ohm resistor to each side of the AC feed, which would likely act as a fuse.
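As a rough check on that resistor choice (assuming about 2 V per LED, which the post doesn't state):

```python
# Rough check of the 1200 ohm, 1 W resistor swap. The ~2 V per LED figure
# is an assumption; the post does not give the actual forward voltage.
import math

V_PEAK = 120.0 * math.sqrt(2)   # ~170 V crest of the rectified line
N_LEDS = 75
VF = 2.0                        # assumed forward voltage per LED
R = 1200.0

i_peak = (V_PEAK - N_LEDS * VF) / R   # ~16 mA at the crest of the waveform
p_peak = i_peak ** 2 * R              # ~0.32 W peak in the resistor
print(f"peak current ~{i_peak*1e3:.0f} mA, peak dissipation ~{p_peak:.2f} W")
```

Since the string only conducts near the crest, the average current and resistor dissipation sit well below those peaks, which is consistent with the ~10 mA observed and comfortably inside the 1 W rating.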

\$\endgroup\$
