I've got a basic question about how to compute the average induced voltage. Here is the task:
A conductor loop with the area $A_1 = 50\,\text{cm}^2$ in a magnetic field with a flux density of $B_1 = 0.2\,\text{T}$ is shrunk to $A_2 = 5\,\text{cm}^2$, while at the same time the flux density is reduced to $B_2 = 0.1\,\text{T}$. This all happens within $\Delta t = 0.1\,\text{s}$. Compute the average induced voltage in the loop.
My teacher and I have come up with two different ways of computing it. I want to know which one is right.
Way 1:
$$U_{\text{ind}} = -\frac{\Delta \Phi}{\Delta t} = -\frac{\Phi_2-\Phi_1}{\Delta t} = -\frac{A_2B_2-A_1B_1}{\Delta t} = 0.0095\,\text{V}$$
Way 2:
$$U_{\text{ind}} = -\frac{d\Phi}{dt} = -\left(\frac{dA}{dt}B+\frac{dB}{dt}A\right) = -\left(\frac{\Delta A}{\Delta t}B+\frac{\Delta B}{\Delta t}A\right) = -\left(\frac{A_2-A_1}{\Delta t}B_1+\frac{B_2-B_1}{\Delta t}A_1\right) = 0.014\,\text{V}$$
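For reference, here is a quick numerical check of both calculations (a small Python sketch; the variable names are my own, and all areas are converted from cm² to m²):

```python
# Given quantities, in SI units
A1 = 50e-4   # initial loop area: 50 cm^2 = 50e-4 m^2
A2 = 5e-4    # final loop area:    5 cm^2 =  5e-4 m^2
B1 = 0.2     # initial flux density in T
B2 = 0.1     # final flux density in T
dt = 0.1     # elapsed time in s

# Way 1: difference quotient of the flux itself
U1 = -(A2 * B2 - A1 * B1) / dt

# Way 2: product rule with finite differences, evaluated at the initial values
U2 = -((A2 - A1) / dt * B1 + (B2 - B1) / dt * A1)

print(U1)  # Way 1
print(U2)  # Way 2
```

This reproduces the two results above (about 0.0095 V vs. 0.014 V), so the discrepancy is not an arithmetic slip but comes from the methods themselves.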
Obviously the two results are not equal. My teacher says the second one is correct. If he is right, why is the first way wrong?