If the bulb obeyed Ohm's law, its resistance would not change when the voltage changed, so halving the voltage would halve the current. Method 2 would then be correct, as would using $P=IV$ with both $I$ and $V$ at half their values at 220 V.
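For concreteness, here is a minimal sketch of that Ohm's-law prediction, assuming a hypothetical bulb rated 60 W at 220 V (the rating is my illustrative assumption, not from the question): constant resistance means halving the voltage quarters the power.

```python
# Ohm's-law prediction for a hypothetical bulb rated 60 W at 220 V.
V_rated = 220.0   # volts (assumed rating, for illustration)
P_rated = 60.0    # watts (assumed rating, for illustration)

R = V_rated**2 / P_rated       # constant resistance, if Ohm's law held
I_half = (V_rated / 2) / R     # current at half voltage: half the rated current
P_half = (V_rated / 2)**2 / R  # power at half voltage

print(R)                 # ~806.7 ohms
print(P_half / P_rated)  # 0.25: power drops to a quarter
```

The $P = V^2/R$ form makes the quartering obvious: halving $V$ with $R$ fixed divides $P$ by four.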
If your bulb is a filament lamp, its resistance increases with applied voltage, because the filament gets hot. [The increase can be of the order of 10-fold as the filament reaches white heat.] This implies that a graph of current against voltage rises less and less steeply as the voltage increases. So when the voltage is halved from the usual working voltage, the current decreases, but to perhaps 80% rather than 50% of its previous value. The lamp is seriously 'non-ohmic'.
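One common empirical way to capture this curvature (an illustrative assumption of mine, not a claim about any particular lamp) is a power law $I = kV^n$ with $n < 1$: $n = 1$ recovers Ohm's law, while smaller $n$ reflects the resistance rising as the filament heats.

```python
# Illustrative power-law model I = k * V**n for a filament lamp.
# n = 1 is ohmic; n < 1 models resistance rising with voltage.
# The value n = 0.32 below is chosen purely to reproduce the
# "80% (say)" figure in the text, not fitted to real data.
def current_ratio_when_halved(n):
    """Factor by which the current drops when the voltage is halved."""
    return 0.5 ** n

print(current_ratio_when_halved(1.0))   # 0.5  : ohmic prediction
print(current_ratio_when_halved(0.32))  # ~0.80: current falls only to ~80%
```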
You have assumed in your Method 1 that the current does not change at all when the voltage is halved. This is wrong, but your answer of 50 W may be closer to the actual power at 110 V than the 25 W based on assuming Ohm's law!
I've just done the experiment (using two old filament lamps, both rated at 230 V, 60 W). Putting 230 V across either lamp gave a current of (0.240 ± 0.005) A, so a power of (55 ± 2) W. Putting 230 V across the two lamps in series (so each got 115 V) gave a current of (0.160 ± 0.005) A, so each bulb's power was (18 ± 1) W. This is more than a quarter of the power at 230 V (the Ohm's-law prediction) but less than half (the constant-current prediction). So the result is as expected.
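The arithmetic above can be checked with a short script (values taken from the measurements quoted; the uncertainties are ignored for simplicity):

```python
# Measurements from the two-lamp experiment (230 V, 60 W rated filament lamps).
V_full, I_full = 230.0, 0.240   # single lamp across 230 V
V_half, I_half = 115.0, 0.160   # each lamp in the series pair

P_full = V_full * I_full        # ~55 W per lamp at full voltage
P_half = V_half * I_half        # ~18 W per lamp at half voltage

ohm_prediction = P_full / 4      # constant resistance: quarter power (~14 W)
const_I_prediction = P_full / 2  # constant current: half power (~28 W)

# The measured half-voltage power lies between the two predictions:
print(P_full, P_half)
print(ohm_prediction < P_half < const_I_prediction)  # True
```

The measured current ratio, $0.160/0.240 \approx 0.67$, also sits between the ohmic prediction (0.5) and the constant-current assumption (1.0).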
Of course, for several years now, lamps rated at several tens of watts, based on incandescent filament technology, have been obsolete. I'm not sure why Physics students are supposed to live in the past!