The permeable iron-alloy cores of induction motor rotors are formed from thin laminated sheets, designed to prevent eddy-current flow and thus "reduce losses", but isn't the whole point of an induction motor rotor (e.g., a squirrel cage) to have eddy currents?
The induced currents are what produce the Lorentz force on the moving charges in the rotor and thus make the motor spin. Larger currents produce a larger torque; equivalently, the same torque requires less current. If the core weren't laminated, the same stator magnetic field would induce a larger rotor current, which would result in a larger magnetic force. The stator would also draw more current.
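For concreteness, here is the rough chain I have in mind, using the standard per-phase induction-motor picture (slip $s$, rotor resistance $R_r$, rotor leakage reactance $X_r$, and rotor power-factor angle $\theta_r$ are the usual textbook symbols, not values for any particular machine):

$$\varepsilon_r \propto s\,\Phi_{stator}, \qquad I_r = \frac{\varepsilon_r}{\sqrt{R_r^2 + (sX_r)^2}}, \qquad T \propto \Phi_{stator}\, I_r \cos\theta_r$$

So, as far as I can tell, lowering the rotor circuit's resistance raises $I_r$ and hence the torque for a given stator flux.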
What makes induced currents in the rotor cage so desirable (they make the motor move), but eddy currents in the rotor core so undesirable? Is it because higher resistance reduces the power losses:
$$P = \frac{\varepsilon^2}{R} \propto \frac{1}{R}\cdot\left|\frac{d\Phi}{dt}\right|^2$$
Increasing the resistance of parts of the rotor decreases the power dissipated there, but it also reduces the overall torque for the same stator field. This seems like a trade-off. Why does this trade-off fall in favor of a high-resistance rotor core and a low-resistance rotor cage?
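If it helps to quantify the core side of that trade-off, the classical thin-lamination estimate (sinusoidal flux at frequency $f$ with peak density $B_{max}$, lamination thickness $t$, resistivity $\rho$; skin effect neglected) puts the eddy loss per unit volume at roughly

$$p_{eddy} \approx \frac{\pi^2 f^2 B_{max}^2\, t^2}{6\rho},$$

so laminating attacks the $t^2$ factor quite apart from whatever the material's resistivity happens to be.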
To summarise: we clearly want the cage to have low resistance. What justification for low resistance applies to the cage but not to the magnetic core? Even without the silicon alloying, plain iron is only about 5 times less conductive than the copper cage, and the core's much larger cross-sectional area might make its resistance lower still!
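As a back-of-envelope check of that last claim (the $\sim 20\times$ area ratio is an assumed illustrative number, not a measurement), with $\rho_{Cu} \approx 1.7\times10^{-8}\,\Omega\,\mathrm{m}$ and $\rho_{Fe} \approx 9.7\times10^{-8}\,\Omega\,\mathrm{m}$ for pure iron:

$$\frac{R_{core}}{R_{cage}} \sim \frac{\rho_{Fe}}{\rho_{Cu}}\cdot\frac{A_{cage}}{A_{core}} \approx \frac{9.7\times10^{-8}}{1.7\times10^{-8}}\cdot\frac{1}{20} \approx 0.3,$$

i.e. an unlaminated solid-iron current path could plausibly have lower resistance than the cage bars themselves.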