I'm attempting to build a near-infrared (850 nm) lamp using high-power LEDs, and I'm struggling to design a suitable way to power them directly from the 220 V AC mains.
I have 10 high-power LEDs with the following specs:
VF = 1.4-1.8 V;
IF = 400-700 mA.
I was thinking about powering them with a simple USB phone charger (5 V, 1 A), as in the following diagram:
[CircuitLab schematic: the 10 LEDs arranged as two parallel strings of five in series, connected directly across the 5 V charger with no series resistors]
In this setup, each LED should receive 1 V (below the 1.4 V minimum specified in the datasheet) and 500 mA (within the LED's rated current range). However, I'm not sure this is the most appropriate configuration, and I wonder whether I should use a different adapter (e.g. I have a 12 V, 2 A one available, but that is probably too much). I have two main doubts (a sketch of my arithmetic follows them):
1. Is it a problem to supply each LED with less voltage than specified (1 V instead of 1.4 V)? Should I consider reducing the number of LEDs?
2. I've seen that it's good practice to insert an additional current-limiting resistor in series with high-power LEDs (even though this causes a further voltage drop). What resistance should these resistors have?
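To make my reasoning concrete, here is the arithmetic I'm doing, plus the standard series-resistor formula I found. This is a rough sketch: the two-strings-of-five layout matches my schematic, but the 1.5 V typical forward voltage and the 500 mA target are my own assumptions, not datasheet values.

```python
# Sanity check of my arithmetic (assumptions, not a verified design).
SUPPLY_V = 5.0        # USB charger output voltage
SUPPLY_A = 1.0        # USB charger rated current
LEDS_PER_STRING = 5   # series LEDs per string, as in my schematic
STRINGS = 2           # parallel strings (10 LEDs total)

VF_TYP = 1.5          # ASSUMED typical forward voltage (datasheet range: 1.4-1.8 V)
IF_TARGET = 0.5       # target current per string in amps (within 400-700 mA)

# The naive split I used above: divide the supply voltage over the series
# LEDs, divide the supply current over the parallel strings.
v_per_led = SUPPLY_V / LEDS_PER_STRING
i_per_string = SUPPLY_A / STRINGS
print(f"Naive split: {v_per_led:.2f} V per LED, {i_per_string * 1000:.0f} mA per string")

# Standard series ("ballast") resistor formula for one string:
#   R = (V_supply - n * VF) / I_target
headroom = SUPPLY_V - LEDS_PER_STRING * VF_TYP
if headroom <= 0:
    print(f"No headroom: {LEDS_PER_STRING} LEDs at {VF_TYP} V each need "
          f"{LEDS_PER_STRING * VF_TYP:.1f} V, more than the {SUPPLY_V:.1f} V supply.")
else:
    r = headroom / IF_TARGET
    print(f"Series resistor per string: {r:.2f} ohm, "
          f"dissipating {headroom * IF_TARGET:.2f} W")
```

With my actual numbers (five LEDs per string at a typical 1.5 V), the formula gives no headroom at all, which is part of why I'm unsure the naive 1 V-per-LED reasoning holds.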
I'm also open to regulating the brightness of the LEDs, for example with a potentiometer downstream of the current source, but I guess this would require changing the driver circuit. I'm just starting out in electronics.
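For the dimming idea, I tried a rough estimate of how a series potentiometer would change the current. This is again a sketch under assumptions of mine: a fixed 1.5 V forward voltage per LED (real VF varies with current, so these numbers are ballpark only), plus a hypothetical shorter string of three LEDs and a hypothetical 1 ohm ballast resistor, just to have headroom at 5 V.

```python
# Rough look at how a series potentiometer would change the LED current.
SUPPLY_V = 5.0
VF_TYP = 1.5      # ASSUMED typical forward voltage per LED
N_LEDS = 3        # HYPOTHETICAL shorter string, so there is headroom at 5 V
R_FIXED = 1.0     # HYPOTHETICAL fixed ballast resistor, in ohms

for r_pot in (0.0, 1.0, 2.0, 5.0, 10.0):
    i = (SUPPLY_V - N_LEDS * VF_TYP) / (R_FIXED + r_pot)
    print(f"pot = {r_pot:4.1f} ohm -> ~{i * 1000:.0f} mA")
```

Even small pot values seem to cut the current a lot, which makes me suspect this is a crude way to dim.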