
Given a sample of size $n$ from a $\mathrm{Ber}(\theta)$ distribution, where $0\leq\theta\leq\frac{1}{2}$, we need to find the maximum likelihood estimator of $\theta$.

Now, the solution says $\hat{\theta}=\min\{\bar{x},\frac{1}{2}\}$.

But I beg to differ, and I'll try to explain why below. Please correct me if I am wrong!

The likelihood function goes as follows:

$L(x_1,x_2,\dots,x_n;\theta)=\theta^{\sum x_i}(1-\theta)^{n-\sum x_i}$

Take the log of both sides:

$\log L(x_1,x_2,\dots,x_n;\theta)=\sum x_i \log(\theta)+\left(n-\sum x_i\right)\log(1-\theta)$

We want to maximize this function over $\theta$.

Let $h(\theta)=\sum x_i \log(\theta)+\left(n-\sum x_i\right)\log(1-\theta)$

Differentiate w.r.t. $\theta$:

$h'(\theta)=\dfrac{\sum x_i}{\theta}-\dfrac{n-\sum x_i}{1-\theta}=\dfrac{\sum x_i}{\theta(1-\theta)}-\dfrac{n}{1-\theta}=\dfrac{1}{1-\theta}\left(\dfrac{\sum x_i}{\theta}-n\right)=\dfrac{n}{1-\theta}\left(\dfrac{\bar{x}}{\theta}-1\right)$
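As a sanity check on the algebra (not part of the derivation itself), here is a short Python sketch that compares this analytic derivative against a central finite difference of $h(\theta)$, using a hypothetical sample with $\sum x_i = 3$ out of $n = 10$:

```python
import math

def h(theta, s, n):
    # log-likelihood for a Bernoulli sample with sum(x_i) = s out of n trials
    return s * math.log(theta) + (n - s) * math.log(1 - theta)

def h_prime(theta, s, n):
    # analytic derivative: n/(1-theta) * (xbar/theta - 1)
    xbar = s / n
    return n / (1 - theta) * (xbar / theta - 1)

# central finite-difference check at a few interior points
s, n, eps = 3, 10, 1e-6
for theta in (0.1, 0.3, 0.45):
    numeric = (h(theta + eps, s, n) - h(theta - eps, s, n)) / (2 * eps)
    assert abs(numeric - h_prime(theta, s, n)) < 1e-3
```

The assertions pass, so the simplification of $h'(\theta)$ above is at least numerically consistent.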

So the function increases when $h'(\theta)>0$, i.e. when $\dfrac{\bar{x}}{\theta}-1>0$, that is, $\theta<\bar{x}$. So where the likelihood function is increasing, $\hat{\theta}=\max\left(\frac{1}{2},\bar{x}\right)$ maximizes it.

Whereas, when $h'(\theta)<0$, we have $\theta>\bar{x}$, and on such a decreasing stretch the function is maximized at the lowest value of $\theta$, which is $\bar{x}$; hence $\hat{\theta}=\bar{x}$.

I don't really know where I went wrong here; can anyone help?


1 Answer


You are tantalisingly close to the answer, but first you should notice that your proposed solution allows an MLE outside the range of your parameter constraint (you naughty boy!).

You have already figured out that the log-likelihood $\ell(\theta)$ is increasing for $0 \leqslant \theta < \bar{x}$ and then decreasing for $\bar{x} < \theta \leqslant 1$. This means that when you look at the function over the whole range $0 \leqslant \theta \leqslant 1$ you see that it is maximised at $\theta = \bar{x}$ and decreases on either side of this.

Now, let's impose your parameter constraint $0 \leqslant \theta \leqslant \tfrac{1}{2}$. If $\bar{x} \leqslant \tfrac{1}{2}$ then the global maximising value (in the unconstrained problem) is in the parameter range, and so you have MLE $\hat{\theta} = \bar{x}$. If $\bar{x} > \tfrac{1}{2}$ then the log-likelihood function is strictly increasing over the parameter range, but it never gets to the global maximising value of the unconstrained problem. In this case the maximum occurs at the right boundary, so you have MLE $\hat{\theta} = \tfrac{1}{2}$. Putting these together you have:

$$\hat{\theta} = \begin{cases} \bar{x} & \text{if }\bar{x} \leqslant \tfrac{1}{2} \\ \tfrac{1}{2} & \text{if }\bar{x} > \tfrac{1}{2} \end{cases} = \min (\bar{x}, \tfrac{1}{2}).$$
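You can confirm this case analysis numerically. Here is a minimal Python sketch (my own illustration, not part of the original solution) that maximises the log-likelihood over a fine grid of $\theta \in (0, \tfrac{1}{2}]$ for two hypothetical samples, one with $\bar{x} \leqslant \tfrac{1}{2}$ and one with $\bar{x} > \tfrac{1}{2}$:

```python
import math

def loglik(theta, s, n):
    # Bernoulli log-likelihood with sum(x_i) = s out of n trials
    return s * math.log(theta) + (n - s) * math.log(1 - theta)

def constrained_mle(s, n, grid=10_000):
    # brute-force maximisation over theta = k/(2*grid), k = 1..grid,
    # i.e. a fine grid covering the constrained range (0, 1/2]
    best = max((loglik(k / (2 * grid), s, n), k / (2 * grid))
               for k in range(1, grid + 1))
    return best[1]

# xbar = 0.3 <= 1/2: interior maximum at xbar
assert abs(constrained_mle(3, 10) - 0.3) < 1e-3
# xbar = 0.7 > 1/2: maximum pushed to the boundary 1/2
assert abs(constrained_mle(7, 10) - 0.5) < 1e-3
```

In both cases the grid search lands on $\min(\bar{x}, \tfrac{1}{2})$, matching the piecewise formula above.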
