$\begingroup$

I have this question:

An electric bulb is rated 220 V and 100 W. When it is operated on 110 V, the power consumed will be: (a) 100 W (b) 75 W (c) 50 W (d) 25 W

I use P=VI to find Current when Power is 100 W and Voltage is 220 V, then I find Current = 5/11 A. And then I use P=VI to find Power when Voltage is 110 V and Current is 5/11 A. I find that the Power is 50 W.

But when I use P=V²/R to find Resistance when Power is 100 W and Voltage is 220 V, then I find Resistance to be 484 Ω. Then I use P=V²/R to find Power when Voltage is 110 V and Resistance is 484 Ω. I find that the Power is 25 W.

Our teacher used the latter method. Why can't we use the first one, and why is it wrong?
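The arithmetic of the two methods, written out as a minimal sketch (values taken directly from the question):

```python
# Rated values of the bulb
V_rated = 220.0   # V
P_rated = 100.0   # W
V_new = 110.0     # V, the new operating voltage

# Method 1: keep the current fixed at its rated value
I_rated = P_rated / V_rated          # 5/11 A ~ 0.4545 A
P_method1 = V_new * I_rated          # gives 50.0 W

# Method 2: keep the resistance fixed at its rated value
R = V_rated**2 / P_rated             # 484 ohm
P_method2 = V_new**2 / R             # gives 25.0 W
```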

$\endgroup$
  •
    $\begingroup$ Actually, all the answers are wrong for a filament light bulb, because the resistance changes with temperature. But I suppose the question pretends you don't know that. $\endgroup$
    – alephzero
    Commented Jul 11, 2021 at 15:53
  •
    $\begingroup$ You can use $P=IV$, but you must use the correct $I$ and the correct $V$ ! $\endgroup$ Commented Jul 11, 2021 at 18:15

2 Answers

$\begingroup$

When you connect the bulb to 110 V, the current $I$ will not be 5/11 A.

We know $I = V/R$, so when the voltage changes, the current changes with it.

For this problem it is better to use $P = V^2/R$, because the resistance $R$ is taken as fixed, leaving only two variables ($P$ and $V$). In $P = VI$, both $V$ and $I$ change, so we have three variables.
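A short sketch of this point (assuming, as the problem intends, that the resistance stays at its rated value of 484 Ω): $P = VI$ gives the same 25 W answer once the current is recomputed for 110 V.

```python
R = 220.0**2 / 100.0   # 484 ohm, from the rated values
V = 110.0
I = V / R              # the correct current at 110 V: 5/22 A ~ 0.227 A
P = V * I              # 25.0 W, matching P = V^2 / R
```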

$\endgroup$
  •
    $\begingroup$ Thanks, it helped a lot! $\endgroup$
    – Arunima
    Commented Jul 11, 2021 at 16:01
  • $\begingroup$ The bulb resistance when cold is very very different when hot. So if you try to use the resistance measured when the bulb is off to determine the power when it is on, your answer is going to be wrong. In a real light bulb anyways. It's just a badly formulated exam question. It should have just said "resistor" instead of light bulb. $\endgroup$
    – DKNguyen
    Commented Jul 11, 2021 at 18:21
$\begingroup$

If the bulb obeyed Ohm's law, the resistance would not change when the voltage is changed. So when the voltage is halved the current halves. Method 2 would then be correct, as would use of $P=IV$ with both $I$ and $V$ half their value at 220 V.

If your bulb is a filament lamp, the resistance increases with applied voltage, as the filament gets hot. [The increase can be of the order of 10-fold as the filament gets white hot.] This implies that a graph of current against voltage rises less and less steeply as the voltage increases. So when the voltage is halved from the usual working voltage, the current decreases, but only to (say) 80% rather than 50% of its previous value. The lamp is seriously 'non-Ohmic'.

You have assumed in your Method 1 that the current does not change at all when the voltage is halved. This is wrong, but your answer of 50 W may be closer to the actual power at 110 V than the 25 W based on assuming Ohm's law!

I've just done the experiment (using two old filament lamps, both rated at 230 V, 60 W). Putting 230 V across either lamp gave a current of (0.240 ± 0.005)A, so a power of (55 ± 2) W. Putting 230 V across the two lamps in series (so each got 115 V) gave a current of (0.160 ± 0.005)A, so each bulb's power was (18 ± 1)W. This is more than a quarter of the power at 230 V, (the Ohm's law prediction) but less than half (the constant current prediction). So as expected.
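A quick check of the arithmetic in the experiment above, using the nominal measured values and ignoring the stated uncertainties:

```python
# Single lamp across the full supply
V_full, I_full = 230.0, 0.240
P_full = V_full * I_full          # ~ 55.2 W per lamp at 230 V

# Two lamps in series: each lamp sees half the voltage
V_half, I_series = 115.0, 0.160
P_half = V_half * I_series        # ~ 18.4 W per lamp at 115 V

# Ratio of powers: 0.25 would mean Ohmic (constant R),
# 0.5 would mean constant current; the lamp sits in between.
ratio = P_half / P_full           # ~ 0.33
```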

Of course, for several years now, lamps rated at several tens of watts, based on incandescent filament technology, have been obsolete. I'm not sure why Physics students are supposed to live in the past!

$\endgroup$
