
I am designing a simple three-phase inverter to begin working in power electronics. I have no experience in power electronics and minor experience designing circuits. While working on it, I encountered a problem in my understanding of a fundamental concept: the current limiting resistor.

[Schematic: one phase of the inverter, showing the IR2110 driver U1 driving two MOSFETs, with the indicator LED and its series resistor R7]

This is the first phase of my inverter; it runs on 12 V and 2 A. I am having trouble with the resistor R7. During my perf-board test, I eyeballed the resistor at 100 ohms, and it burned up. This made me wonder whether the resistor was taking the complete 2 A load (there was no load attached to the output). According to Ohm's law, R = (12 − 2)/2, I should use a 5 ohm resistor. That seemed low for a circuit that had just burned a 100 ohm resistor.

After reading a few blog posts and some Stack Exchange questions, I realized I had the wrong approach and was supposed to work from the forward current of the LED (right?). Redoing my calculations, I came up with approximately 500 ohms. But the topic of power dissipation keeps throwing me off: won't the resistor burn up because of the 2 A flowing out of the inverter? When high current is available, how can you add an LED as an indicator, and what should designers keep in mind?
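Roughly, assuming the LED drops about 2 V and I target something like 20 mA of forward current, my calculation was:

$$ R = \frac{12V - 2V}{20mA} = 500\Omega $$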

  • 1k ohms seems like an appropriate value for that supply voltage. What power resistor were you using? 12V over 1k ohms is 144mW, and it'll be less after the LED takes its forward voltage out. If it's burning up, then I would check that you have read the value correctly (brown, black, red, gold [5%] or red [1%]).
    – vir
    Commented Apr 12, 2023 at 17:44
  • Initially, I used 100 ohms. I changed it as soon as possible, so the schematic now shows 1k. I am not sure about the power rating, as I bought the first available resistor from the shopkeeper.
    – Momobear
    Commented Apr 12, 2023 at 18:10

2 Answers


Your problem seems to be that you assume the output will always be 2A, but that assumption is wrong; I don't even know where that figure of 2A came from. What you have built is not outputting 2A, it's only capable of outputting 2A.

In fact, all it's doing is applying either 0V or 12V at the junction between the two MOSFETs (your "output"), depending on the state of their gates, as controlled by the IR2110 driver IC U1. It's not up to those MOSFETs (or anything controlling their gates) to decide what current actually flows into the load. That's decided by the load.

So, if you connect a 100Ω resistor from that output to ground, and you set the output high (+12V), it's Ohm's law that decides the current that flows in the load:

$$ I = \frac{V}{R} = \frac{12V}{100\Omega} = 120mA $$

Not 2A, just 120mA. If you connect a series combination of an LED (which drops 2V) and a 1kΩ resistor, the current that flows will be:

$$ I = \frac{V}{R} = \frac{12V-2V}{1000\Omega} = 10mA $$

Again, it's not 2A. The current is determined by the load, which draws only 10mA in this instance.

Now, if you want 2A to flow, you have to attach a load that draws 2A when it has 12V across it. For example, a single 6Ω resistor, between output and ground, will do exactly that:

$$ I = \frac{V}{R} = \frac{12V}{6\Omega} = 2A $$

I reiterate: you have created an output capable of very high current using those MOSFETs, but the MOSFETs themselves only decide what voltage gets applied at the output, not what current will be drawn from it. That current is decided by the load you connect to it.

You say that a 100Ω resistor burned up. Well, if you connect a 100Ω resistor and a 2V LED in series, between output and ground, the resulting current through them will be:

$$ I = \frac{V}{R} = \frac{12V-2V}{100\Omega} = 100mA $$

That's enough to destroy a small ¼W resistor, since the power it dissipates will be:

$$ P = I^2R = (100mA)^2 \times 100\Omega = 1W $$

\$\endgroup\$
  • I understand it now. Thank you for your time and efforts.
    – Momobear
    Commented Apr 13, 2023 at 6:09
  • @GodJihyo by "this guy", do you mean me? Commented Apr 13, 2023 at 13:25
  • @GodJihyo I start writing, spend an age formatting and clarifying, and submit. There's no plagiarism of your own answer. If it looks similar, it's because this question is so basic there are only a few things this answer can possibly say. If this has happened before, it has to be for the same reason, or because I'm writing concurrently with other answerers. And if I got rep and the green tick, it's because of the work I put into the answer, not because I've copied anything. I haven't. Commented Apr 13, 2023 at 13:36
  • @GodJihyo For the record, you openly accused me of plagiarising your own answers three times, for reputation, then deleted that comment when I objected. No worries? Yes worries, I bloody well worry about that. Commented Apr 13, 2023 at 13:45
  • @SimonFitch I apologize, maybe it's just coincidence and I overreacted. I don't always say things in the most diplomatic ways. Sorry.
    – GodJihyo
    Commented Apr 13, 2023 at 14:29

A common mistake for people new to electronics is to confuse constant-voltage and constant-current supplies. A lot of people think that because a supply is rated at a certain current, it will force that current through anything you connect to it. This is only true if it is a constant-current supply (and the load is within the supply's capabilities).

Most of the supplies you will deal with are of the constant-voltage type. These output a rated voltage at up to a rated current, the key phrase being up to. With a constant-voltage supply, the load draws a current that depends on its impedance. For example, from a 12 V / 2 A supply, a 500 \$\Omega\$ load will draw 12 V / 500 \$\Omega\$ = 24 mA; the resistance dictates the current, not the supply. In your case you have 12 V and a 2 V LED, so the voltage across the resistor will be 10 V. If you want 20 mA through the resistor and LED, that calls for a 500 \$\Omega\$ resistor, as you correctly calculated. Any other load connected in parallel draws its own separate current: for example, if you also had a 12 \$\Omega\$ resistor connected, it would draw 1 A, the LED branch would draw 20 mA, and the supply would output 1.02 A.
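As a quick check, those figures written out in the same style as the other answer:

$$ I_{LED} = \frac{12V - 2V}{500\Omega} = 20mA \qquad I_{12\Omega} = \frac{12V}{12\Omega} = 1A \qquad I_{supply} = 1A + 20mA = 1.02A $$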

The power dissipated in the resistor will be I\$^2\$R, or V\$^2\$/R (using the 10 V across the resistor). That comes out to 0.2 W, so you could use a 1/4 W resistor, or a 1/2 W one if you want to leave some extra safety margin.
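Spelled out with the 20 mA and 500 \$\Omega\$ values from above:

$$ P = I^2R = (20mA)^2 \times 500\Omega = 0.2W $$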

Using a 100 \$\Omega\$ resistor would have given you 100 mA and a power dissipation of 1 W, so it's no surprise your resistor burned up if it was rated for less than that.
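That is, assuming the same 2 V LED drop:

$$ I = \frac{12V - 2V}{100\Omega} = 100mA \qquad P = I^2R = (100mA)^2 \times 100\Omega = 1W $$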

  • Okay, I see. You are right. I had been thinking that a supply forces its rated current through whatever circuit is connected. Thank you for your answer and your time.
    – Momobear
    Commented Apr 13, 2023 at 6:25
