
I'm studying from the Deep Learning book by Ian Goodfellow et al. On page 256 the text explains that, given a set of $k$ regression models, each model produces an error $\epsilon_i$ on every example, where the errors are drawn from a zero-mean multivariate normal distribution with variances $E[\epsilon_i^2] = v$ and covariances $E[\epsilon_i \epsilon_j] = c$. The error made by the average prediction of the ensemble is then $\frac{1}{k} \sum_{i=1}^k \epsilon_i$, and its expected squared error is:

$$ \begin{align*} E\left[\left(\frac{1}{k} \sum_{i=1}^k \epsilon_i\right)^2\right] &= \frac{1}{k^2} E\left[\sum_{i=1}^k \epsilon_i^2 + \sum_{i \neq j} \epsilon_i \epsilon_j\right] \\ &= \frac{1}{k} v + \frac{k-1}{k} c. \end{align*} $$
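
For what it's worth, I can follow the second equality: it is just linearity of expectation applied to the stated moments, with $k$ diagonal terms each equal to $v$ and $k(k-1)$ off-diagonal terms each equal to $c$:

$$ \frac{1}{k^2}\, E\left[\sum_{i=1}^k \epsilon_i^2 + \sum_{i \neq j} \epsilon_i \epsilon_j\right] = \frac{1}{k^2}\left(k\,v + k(k-1)\,c\right) = \frac{1}{k} v + \frac{k-1}{k} c. $$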

So, the question is: how do we obtain the first equality, i.e. the expansion of the square? I'm having trouble with it. Could someone explain?

Thank you so much!

  • Just write it out for small $k$. For $k=2$, say, all this claims is that $(a+b)^2 = a^2 + b^2 + ab + ba$. This has nothing to do with expectation, beyond the use of linearity to claim that $E\left(\frac{x}{k^2}\right) = \frac{1}{k^2} E(x)$. Of course the second equality does use properties of expectation.
    – lulu
    Commented Mar 29 at 12:06
  • Sorry, but I didn't understand. Commented Mar 29 at 16:55
  • What don't you understand? Just write it out for small $k$. In my example, I used $a,b$ instead of $\epsilon_1, \epsilon_2$, but of course you should use whatever notation you like. My point was just that there's nothing going on here but squaring: $(\epsilon_1+\epsilon_2+\cdots+\epsilon_k)^2=\sum \epsilon_i^2+\sum_{i\neq j} \epsilon_i \epsilon_j$, just as in my example with $a,b$.
    – lulu
    Commented Mar 29 at 17:02
  • If you like, you could write that second sum as $2\sum_{i<j} \epsilon_i \epsilon_j$, just as $ab+ba=2ab$. Maybe that looks more intuitive?
    – lulu
    Commented Mar 29 at 17:03
  • That's more intuitive! Thank you so much! Commented Mar 29 at 17:30
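
For future readers, here is lulu's $k=2$ suggestion written out in full; it uses only the moments $E[\epsilon_i^2] = v$ and $E[\epsilon_i \epsilon_j] = c$ stated in the question:

$$ E\left[\left(\tfrac{1}{2}(\epsilon_1 + \epsilon_2)\right)^2\right] = \tfrac{1}{4}\, E\left[\epsilon_1^2 + \epsilon_2^2 + 2\,\epsilon_1 \epsilon_2\right] = \tfrac{1}{4}(v + v + 2c) = \tfrac{1}{2} v + \tfrac{1}{2} c, $$

which agrees with $\frac{1}{k} v + \frac{k-1}{k} c$ at $k = 2$.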
