$\begingroup$

Suppose I have a number of samples drawn from a normal distribution $x_i \sim \mathcal{N}(\mu,C)$ with $i = 1 \dots n$. I can make observations $z_i = x_i + e_i$ for those samples which are perturbed by noise $e_i$ of known characteristic $e_i \sim \mathcal{N}(0,\Sigma_i)$. Note that the distribution of the observation noise is different for each $z_i$.

I would like to estimate $\mu$ and $C$ given the $z_i$.

Since $e_i$ has zero mean, $\mu$ should just be $\mathbb{E}[z_i]$. In the case $\Sigma_i = D$ for all $i$, we get $C = \operatorname{Var}(z_i) - D$, assuming the noise is independent of the samples. Would it be safe to say that $C = \operatorname{Var}(z_i) - \mathbb{E}[\Sigma_i]$ in the general case?
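A quick Monte Carlo check of these moment identities in the univariate case (just a sketch; the values of $\mu$, $C$ and the $\Sigma_i$ below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
mu, C = 2.0, 1.5                          # illustrative true parameters
sigma2 = rng.uniform(0.1, 1.0, size=n)    # known, heteroscedastic noise variances Sigma_i

x = rng.normal(mu, np.sqrt(C), size=n)    # latent samples x_i ~ N(mu, C)
z = x + rng.normal(0.0, np.sqrt(sigma2))  # observations z_i = x_i + e_i

mu_hat = z.mean()                         # E[z_i] recovers mu
C_hat = z.var() - sigma2.mean()           # Var(z_i) - E[Sigma_i] recovers C
print(mu_hat, C_hat)
```

This agrees with the identities for large $n$, though as noted in the comments, nothing prevents the plug-in $\hat{C}$ from coming out negative in small samples.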

$\endgroup$
  • $\begingroup$ (1) What does $\mathbb{E}[\Sigma_i]$ mean when it is stipulated that $\Sigma_i$ is a "known characteristic"? Are you suggesting there is some prior distribution involved? (2) Because there is no guarantee that $\text{Var}(z_i) - \mathbb{E}[\Sigma_i]$ is positive definite, you might want to worry a bit about the potential inadmissibility or unrealism of your estimator. $\endgroup$
    – whuber
    Commented Feb 19, 2013 at 21:28
  • $\begingroup$ (1) $\mathbb{E}[\Sigma_i] = \frac{1}{n}\sum_{i=1}^{n} \Sigma_i$. (2) The same would be true when $\Sigma_i = D$ for all $i$ as well. How do I get around this problem? And yes, I worry about the admissibility of the estimator, hence my question :) $\endgroup$
    – Jakob
    Commented Feb 19, 2013 at 21:51
  • $\begingroup$ If $\Sigma_i$ is known, and $\Sigma_i = L_i L_i'$, if you let $x_i^* = L_i^{-1} x_i$ (and similarly for $z_i$ and $e_i$), then depending on how you tend to look at things that may help a bit. $\endgroup$
    – Glen_b
    Commented Feb 19, 2013 at 22:06
  • $\begingroup$ Jakob, are the $x_i$ numbers or vectors? Your use of capital $\Sigma$ for the variance suggests you are considering them vectors but your edit makes sense only if they are numbers. $\endgroup$
    – whuber
    Commented Feb 20, 2013 at 16:43
  • $\begingroup$ @whuber Ah sorry. Currently I am happy with $x_i$ being univariate, but I think I might have to extend it to the multivariate case later. I did an update to the notation to make it clearer, but reverted it again since it was then out of line with the comments. $\endgroup$
    – Jakob
    Commented Feb 20, 2013 at 19:46

1 Answer

$\begingroup$

I think I've figured it out now. As stated in the comments, there is indeed a problem with the naive estimators. I found another answer on using a linear estimator for a similar problem. If I didn't make any mistakes, the estimator for $\mu$ in my case should be $$\hat{\mu} = \frac{ \sum_{i=1}^{n} z_i/(C+\Sigma_i)}{ \sum_{i=1}^{n} 1/(C+\Sigma_i)}$$ Since I don't actually know $C$, I guess I could use an estimate $\hat{C}$ based on the samples. Starting from the Wikipedia article on the weighted mean and then subtracting the noise contribution of the $\Sigma_i$, I can estimate the variance as $$\hat{C} = \left(\frac{\sum w_i}{(\sum w_i)^2 - \sum w_i^2} \sum w_i(z_i - \hat{\mu})^2 \right) - \frac{n}{\sum w_i}$$ where $w_i = 1/\Sigma_i$. (Note that with these weights the weighted average of the noise variances is $\sum w_i \Sigma_i / \sum w_i = n/\sum w_i$, which is the term being subtracted.)
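Here is a sketch of these plug-in estimators in Python for the univariate case. The data-generating values are illustrative, and since the weights for $\hat{\mu}$ need a value of $C$, I plug in the naive $\operatorname{Var}(z) - \bar{\Sigma}$ as an initial guess:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
mu_true, C_true = 2.0, 1.5                 # illustrative true parameters
sigma2 = rng.uniform(0.1, 1.0, size=n)     # known noise variances Sigma_i
z = rng.normal(mu_true, np.sqrt(C_true), size=n) + rng.normal(0.0, np.sqrt(sigma2))

# Weighted mean with weights 1/(C + Sigma_i); use a crude initial C in the weights
C0 = z.var() - sigma2.mean()
w_mu = 1.0 / (C0 + sigma2)
mu_hat = np.sum(w_mu * z) / np.sum(w_mu)

# Weighted variance with w_i = 1/Sigma_i, minus the weighted noise average n / sum(w_i)
w = 1.0 / sigma2
V1, V2 = w.sum(), np.sum(w**2)
C_hat = V1 / (V1**2 - V2) * np.sum(w * (z - mu_hat)**2) - n / V1
print(mu_hat, C_hat)
```

One could iterate this (re-estimate the weights from $\hat{C}$, then recompute $\hat{\mu}$), but a single pass already seems close for large $n$.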

$\endgroup$
