I've had this conceptual problem with Faraday's law and inductance for a while now.
Take the example of a simple current loop with increasing area in a constant field (as in this answer). Faraday's law then states that the increasing flux (due to the increasing area) induces an EMF and hence a current. Because of the minus sign in Faraday's law, or equivalently by Lenz's law, the direction of the current is such that the magnetic field it creates opposes the external field.
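Concretely, the calculation I have in mind is

$$\mathcal{E} = -\frac{\mathrm{d}\Phi}{\mathrm{d}t} = -\underline{\mathbf B} \cdot \frac{\mathrm{d}\underline{\mathbf A}}{\mathrm{d}t},$$

with $\underline{\mathbf B}$ taken to be the constant external field only.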
Now why do we never consider the magnetic field created by the induced current when calculating the change in flux? All workings I have seen calculate the flux as $\Phi = \underline{\mathbf B} \cdot \underline{\mathbf A}(t)$. Why is $\underline{\mathbf B}$ not adjusted by the induced magnetic field? Is it simply so small that we can neglect it unconditionally?
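To show what I mean, here is my own naive attempt to include the induced field, modelling its flux contribution as a self-inductance term $LI$ (all numbers below are made up, and I am not sure this is the right way to set it up):

```python
# Made-up numbers for a small loop being stretched in a uniform field.
B_ext = 0.1     # external field (T)
dAdt  = 0.01    # constant rate of area increase (m^2/s)
R     = 1.0     # loop resistance (ohm)
L     = 1e-6    # guessed order of magnitude for a small loop's self-inductance (H)

I_naive = B_ext * dAdt / R   # the usual textbook current, induced field ignored

# Adding the self-flux L*I to the total flux (signs fixed by Lenz's law),
# Faraday's law turns into the circuit equation  L dI/dt + R I = B_ext dA/dt.
dt, I = 1e-8, 0.0
for _ in range(5000):        # integrate to t = 50 us = 50 * (L/R)
    I += dt * (B_ext * dAdt - R * I) / L

print(f"current ignoring induced field : {I_naive:.6e} A")
print(f"current including self-flux    : {I:.6e} A")
```

With these numbers the two currents agree after a few microseconds, but I don't know whether that is a general justification or just an artefact of the values I picked.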
I have the same problem with self-inductance in AC circuits (though perhaps, if I understood the problem above, this one would resolve itself as well). Say we start from a current $I = 0\,\mathrm{A}$. Then the EMF in the circuit increases (but is still very low), which increases $I$, which in turn creates an increasing $\underline{\mathbf B}$ inside the coil. Wouldn't the induced counter-EMF then be much greater than the external EMF applied to the circuit? And if so, how does any current flow in the first place, if the slightest increase in EMF causes a counter-EMF that acts to stop the current?
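For what it's worth, when I integrate the corresponding circuit equation numerically (again with made-up component values), the counter-EMF always comes out just below the applied EMF instead of exceeding it:

```python
import numpy as np

# Made-up component values for an AC-driven coil.
L, R = 0.1, 10.0           # inductance (H), resistance (ohm)
omega = 2 * np.pi * 50.0   # 50 Hz drive
V0 = 1.0                   # drive amplitude (V)

dt = 1e-6
t = np.arange(0.0, 0.1, dt)
I = np.zeros_like(t)

# Explicit Euler on Kirchhoff's loop rule  L dI/dt = V(t) - R I,
# where -L dI/dt is the self-induced counter-EMF.
for k in range(len(t) - 1):
    V = V0 * np.sin(omega * t[k])
    I[k + 1] = I[k] + dt * (V - R * I[k]) / L

back_emf = L * np.gradient(I, dt)
print("peak applied EMF:", V0)
print("peak counter-EMF:", np.abs(back_emf).max())
```

The counter-EMF tracks the applied EMF closely but never exceeds it, which is the opposite of what my reasoning above predicts, and I don't see where that reasoning goes wrong.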
Is it just that I am looking at idealised situations, or that the magnitudes of the external and induced effects differ greatly? Or do I have a conceptual misunderstanding of how (self-)inductance works?