I am powering a 12V, 35W halogen bulb with a MeanWell HLG-100H-20B constant-current power supply. The "B" version of this power supply has a really neat dimming feature which lets one control the output current via a potentiometer, a variable voltage source, or PWM. At its maximum setting, this MeanWell unit can pump out 4.8A, which is obviously too much for my 35W, 12V bulb. To reach full brightness on my bulb, I only need roughly 3A (about 62.5% of the MeanWell unit's maximum current). Unfortunately, the "B" version of this unit does not have an adjustable maximum output current like the "AB" version does, and the "AB" is not in stock anywhere.
So, regardless of how I control the output current, I can't let it get past 3A. This is pretty easy to do (I can just limit my PWM pulse duration or pot resistance, etc.), but my fear is that a failure or a goof-up in the control circuit could make the output current higher than 3A. I do have DC fuses to prevent the lights from blowing, but I'd like to avoid replacing fuses as much as possible.
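For what it's worth, the deliberate cap itself is simple enough. Here's a minimal sketch of how I'd clamp the duty cycle in software, assuming the B-type dimming input maps duty cycle roughly linearly to output current (so 3A out of 4.8A is a duty cap of about 62.5%). This obviously doesn't protect against the failure case I'm worried about, since a fault could bypass it entirely:

```python
# Clamp a requested PWM duty cycle so the supply is never commanded
# above ~3A. Assumes the dimming input is roughly linear:
# 100% duty -> 4.8A, so 3A corresponds to a duty cap of 0.625.
MAX_CURRENT_A = 3.0
SUPPLY_MAX_A = 4.8
MAX_DUTY = MAX_CURRENT_A / SUPPLY_MAX_A  # = 0.625

def clamped_duty(requested_duty: float) -> float:
    """Return the requested duty cycle, capped to the safe range."""
    return min(max(requested_duty, 0.0), MAX_DUTY)
```

So a runaway request like `clamped_duty(0.9)` would still only command 0.625, i.e. ~3A.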
Is there any sort of module or circuit for limiting the current to a specified value? If my PWM source goofs up and the MeanWell unit starts outputting 4A, for example, I'd want something to cap the current going to the lights and fuses at 3A. Perhaps some sort of current divider with a transistor or two to open an additional current path?
Thanks in advance for any advice. By the way, I'm a mechanical design engineer with limited electronics experience (if you couldn't tell) so please let me know if I'm spewing nonsense here.
EDIT: I think I figured it out, like 10 minutes after posting this question. See my crappy paint drawing below. Basically, I'll put 6.87 ohms of resistance in parallel with the bulb + fuse. By current division, when the supply is at max output (4.8A) the bulb's path only sees ~3A. Since the divider ratio is set by the two fixed resistances, the bulb current scales linearly with the supply's output current.
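As a quick sanity check on those numbers (a rough calculation, assuming the bulb behaves like its nominal hot resistance of 12²/35 ≈ 4.11 ohms; a real halogen filament's resistance changes with temperature, so this is only approximate):

```python
# Sanity-check the current divider: a shunt resistor in parallel with
# the bulb + fuse path. Assumes the bulb looks like its nominal hot
# resistance, R = V^2 / P = 12^2 / 35 ≈ 4.11 ohms.
R_BULB = 12.0**2 / 35.0   # ≈ 4.11 ohm, nominal hot resistance
R_SHUNT = 6.87            # ohm, the proposed parallel shunt
I_SUPPLY = 4.8            # A, MeanWell unit at max output

# Current divider: the bulb carries the fraction R_shunt / (R_shunt + R_bulb)
# of the total supply current.
i_bulb = I_SUPPLY * R_SHUNT / (R_SHUNT + R_BULB)
print(f"Bulb current at max output: {i_bulb:.2f} A")
```

Running this gives ~3.00 A through the bulb path at the supply's 4.8A maximum, which matches the target.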