I'm revisiting an old project, and I want to see if there is anything I could have done differently, or if I missed any important details or considerations.
Basically, I used a constant current driver (designed for LEDs) to power and dim a halogen bulb. The driver is a MeanWell HLG-100H-20AB, and the bulb is a 35W, 12V halogen. This driver can be dimmed via variable resistance (10kOhm at minimum brightness, 100kOhm at maximum brightness), and its maximum current output can be adjusted as low as 3A. I calculated that the bulb needs 2.917A to shine at full brightness, so I set the max current on the MeanWell to ~3A and arranged my potentiometer and resistors so that the maximum resistance delivered would be ~97.5kOhm, meaning that, in theory, the MeanWell unit should never output more than 2.917A.
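For anyone checking my numbers, here's the arithmetic as a quick script. Note the assumption (mine, not verified against the datasheet) that the "AB" dimming input scales output current linearly with resistance, so 100kOhm corresponds to 100% of the adjusted maximum current:

```python
# Sanity-check the setup numbers. Assumption: the resistive dimming
# input maps linearly, 100 kOhm -> 100% of the adjusted max current.

BULB_POWER_W = 35.0     # bulb rating
BULB_VOLTAGE_V = 12.0   # bulb rating
DRIVER_MAX_A = 3.0      # adjusted maximum current on the MeanWell

# Full-brightness current from P = V * I  ->  I = P / V
i_full = BULB_POWER_W / BULB_VOLTAGE_V
print(f"Full-power current: {i_full:.3f} A")  # ~2.917 A

# Dimming resistance that caps output at i_full, under the linear
# resistance-to-percent assumption above
r_cap_kohm = (i_full / DRIVER_MAX_A) * 100.0
print(f"Resistance cap: {r_cap_kohm:.1f} kOhm")  # ~97.2 kOhm
```

Under that linear assumption I get ~97.2kOhm, close to the ~97.5kOhm I actually wired in.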
Here are my questions:
- This worked well from what I saw. The light dimmed as I expected, and the bulb didn't burst. But I'm wondering if I missed anything crucial. I don't generally see people dimming halogen or incandescent bulbs with constant-current drivers, and there must be a reason for this. Any ideas?
- I'm assuming that if I had set the MeanWell to output, let's say, 4A, the bulb would have failed, since my understanding is that the unit is capable of "force feeding" current. Is this actually true? Or does the unit just modulate voltage to indirectly control current?
- I noticed that the spec sheet lists a "constant current region" between 10V and 20V. I don't understand what this means, or how it would affect the halogen bulb. For instance, when running only 1.5A to the bulb, it's a safe bet that the voltage across the bulb's terminals is less than 10V. Yet the MeanWell unit still supplied steady current. What gives?
- Am I correct in thinking that this setup negates voltage-drop considerations for the bulb over a long cable run?
Thanks!