I am designing a simple three-phase inverter as my first project in power electronics. I have no experience in power electronics and only minor experience designing circuits. While working on it, I ran into a gap in my understanding of a fundamental concept: the current-limiting resistor.
This is the first phase of my inverter; it runs on 12 V and supplies 2 A. I am having trouble with resistor R7. During my perf-board test, I eyeballed the resistor value at 100 ohms, and the resistor burned up. This made me wonder whether the resistor was carrying the complete 2 A load (since there was no load attached to the output).

By Ohm's law, R = (12 − 2)/2 = 5 Ω, which says I should use a 5-ohm resistor. That seemed far too low for a circuit that was already burning up a 100-ohm resistor. After reading a few blog posts and some StackExchange questions, I realized my approach was wrong and that I was supposed to size the resistor from the LED's forward current instead (right?). Redoing the calculation that way, I arrived at approximately 500 ohms.

But the topic of power dissipation keeps throwing me off. Won't the resistor still burn up because of the 2 A flowing out of the inverter? In a high-current circuit, how can you add an indicator LED, and what should a designer keep in mind?
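For reference, here is my redone calculation, assuming a generic red indicator LED with a forward voltage of roughly 2 V and a forward current of roughly 20 mA (both numbers are assumptions pulled from typical datasheets, not from my actual part):

$$
\begin{aligned}
R &= \frac{V_{\text{supply}} - V_f}{I_f} = \frac{12\ \text{V} - 2\ \text{V}}{0.02\ \text{A}} = 500\ \Omega \\
P_R &= I_f^2\, R = (0.02\ \text{A})^2 \times 500\ \Omega = 0.2\ \text{W}
\end{aligned}
$$

If this is right, a standard 1/4 W resistor has some margin, because the LED branch only draws the ~20 mA the resistor sets for it, not the inverter's full 2 A. That last step is exactly what I would like someone to confirm or correct.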