Eddy currents induced in a metal increase with the rate of change of the magnetic field, and the power they dissipate as heat scales with the square of both the peak magnetic field and the frequency.
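(For concreteness: the textbook estimate for a thin conducting sheet is P_eddy ≈ π² · B_p² · d² · f² / (6 · ρ) per unit volume, where B_p is the peak field, d the sheet thickness and ρ the resistivity. The exact prefactor depends on geometry, but the B_p² · f² scaling is the part my question is about.)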
Now imagine a system with one coil and a piece of metal placed next to it, so that the induced eddy currents heat the metal. If the coil is connected to a power grid at 50 Hz, the eddy currents will be smaller than at 100 Hz. What puzzles me is this: if I keep increasing the supply frequency, the eddy-current losses should skyrocket, while the power drawn, calculated as P = U*I, stays practically the same.
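To make the apparent contradiction concrete, here is a quick numerical sketch of my reasoning. It assumes, as I did above, that the peak field B_p stays fixed when the frequency changes, and all the numbers are made up purely for illustration:

```python
# Sketch of the scaling in my question: eddy-current loss ~ B_p^2 * f^2
# (classical thin-sheet estimate), while the naive supply power U*I is
# held fixed. All values are illustrative, not measured.
import math

U = 230.0      # supply voltage (V), assumed constant
I = 2.0        # supply current (A), assumed constant
B_p = 0.1      # peak field (T), assumed NOT to change with frequency
d = 1e-3       # sheet thickness (m)
rho = 1.7e-8   # resistivity, roughly copper (ohm*m)
volume = 1e-4  # volume of the heated metal (m^3)

def eddy_loss(f_hz: float) -> float:
    """Classical thin-sheet eddy loss per unit volume,
    P_v = pi^2 * B_p^2 * d^2 * f^2 / (6*rho), times the sample volume."""
    return math.pi**2 * B_p**2 * d**2 * f_hz**2 / (6 * rho) * volume

for f in (50.0, 100.0, 5000.0):
    print(f"f = {f:6.0f} Hz: eddy loss ~ {eddy_loss(f):8.1f} W, "
          f"naive supply power U*I = {U * I:.1f} W")
```

With these numbers the loss quadruples from 50 Hz to 100 Hz, and by 5 kHz the computed eddy loss exceeds the supposedly constant U*I input, which is exactly the contradiction I am asking about.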
If I increased the frequency enough, this logic would break 100 % efficiency. So what actually happens here? Where does the additional power come from when I greatly increase the frequency?