Short version: I'm thinking about running LEDs in a potentially over-current setup, but with a PWM-based supply. Will the LEDs suffer an early death? Put another way: do LEDs die of overheating, or of the over-current itself?
Long version: My setup consists of the following parts, all from various low-cost Internet shops, so no datasheet or reliable spec is available for any of them.
- One LED strip, composed of 12 modules in parallel. Each module contains 4 components in series: one 39 ohm resistor and three 5630 white LEDs. My understanding is that it's meant to be fed from a 12 V DC supply. When doing so (with a bench power supply), each LED sits at roughly 3.3 V / 52 mA, for a total of about 7.6 W for the whole strip (resistor dissipation included).
- One LED PWM driver: a small $1 module that can generate blink patterns or dim the LEDs via a 1.5 kHz PWM (confirmed with an oscilloscope).
- One crappy wall wart, rated 12 V / 1.5 A, which actually outputs ~17 V with no load and around 13.25 V when feeding the LED strip, putting each LED at around 3.6 V / 60 mA (a quick sanity check of both operating points is sketched right after this list).
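For reference, here's that sanity check in Python; it's a minimal sketch where the only inputs are the 39 ohm resistor, the LED count per module, and the forward voltages I measured, and everything else is Ohm's law:

```python
# Rough check of the strip's operating points from the 39 ohm resistor alone.
def string_current(v_supply, v_led, r=39.0, leds_per_string=3):
    """Current through one module: the resistor drops whatever the LEDs don't."""
    return (v_supply - leds_per_string * v_led) / r

for v_supply, v_led in [(12.0, 3.3), (13.25, 3.6)]:
    i = string_current(v_supply, v_led)      # A through each of the 12 modules
    p_strip = 12 * v_supply * i              # W for the whole strip, resistors included
    print(f"{v_supply} V supply: {i * 1000:.0f} mA per string, {p_strip:.1f} W total")
# -> ~54 mA / 7.8 W and ~63 mA / 10.0 W, close to what I measured
```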
I've failed to locate any datasheet for a 5630 LED with that kind of operating point. In the 12 V setup, each LED dissipates ~175 mW, which doesn't line up with any standard power rating I know of. Assuming they're really 1/4 W LEDs (even though most 5630s appear to be 1/2 W), the 3.6 V / 60 mA operating point would still be under that rating, albeit marginally.
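Putting numbers on that margin (this just multiplies the figures above; the 1/4 W and 1/2 W ratings are guesses on my part, since I have no datasheet):

```python
# Per-LED dissipation in each scenario against the assumed package ratings.
scenarios = {"12 V bench supply": (3.3, 0.052), "13.25 V wall wart": (3.6, 0.060)}
for name, (v_f, i_f) in scenarios.items():
    p_led = v_f * i_f                        # W dissipated in one LED
    print(f"{name}: {p_led * 1000:.0f} mW per LED, "
          f"{p_led / 0.25:.0%} of 1/4 W, {p_led / 0.5:.0%} of 1/2 W")
# -> ~172 mW (69% / 34%) and ~216 mW (86% / 43%)
```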
Back to the question: I'm trying to estimate how long these would live if I were to run them at, say, 80% duty cycle, which would probably push the on-state voltage/current a bit higher than the numbers above, since the wall wart's output rises as the load drops. Can anyone provide some background on the physics of LED wear?
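For what it's worth, my naive model (and part of what I'm asking about) is that the average thermal load scales with the duty cycle while the peak current stress does not. A sketch of that, assuming the on-state point stays at the wall-wart numbers (in reality it would creep a bit higher):

```python
# Average per-LED dissipation under PWM, assuming dissipation simply scales
# with duty cycle and the on-state point stays at ~3.6 V / 60 mA.
duty = 0.80
v_f, i_f = 3.6, 0.060
p_peak = v_f * i_f           # W while the PWM output is on
p_avg = duty * p_peak        # W averaged over the 1.5 kHz cycle
print(f"peak {p_peak * 1000:.0f} mW, average {p_avg * 1000:.0f} mW at {duty:.0%} duty")
# -> peak 216 mW, average ~173 mW: thermally close to the plain 12 V case,
#    but the LEDs still see the full 60 mA (or more) whenever they're on.
```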
Obviously, adding a ~1 ohm, 1 watt resistor in series would solve everything, but where would the fun be in that...