Given a sample of size $n$ from a $\mathrm{Ber}(\theta)$ distribution, where $0\leq\theta\leq\frac{1}{2}$, we need to find the maximum likelihood estimator of $\theta$.
Now, the solution says $\hat{\theta}=\min\{\bar{x},\frac{1}{2}\}$.
But I beg to differ, and I'll try to explain why below; please correct me if I am wrong!
The likelihood function is as follows:
$L(x_1,x_2,\dots,x_n;\theta)=\theta^{\sum x_i}(1-\theta)^{n-\sum x_i}$
Taking $\log$ on both sides:
$\log L(x_1,x_2,\dots,x_n;\theta)=\sum x_i \log(\theta)+\left(n-\sum x_i\right)\log(1-\theta)$
We want to maximize this function over $\theta$, so let
$h(\theta)=\sum x_i \log(\theta)+\left(n-\sum x_i\right)\log(1-\theta)$
Differentiating w.r.t. $\theta$:
$h'(\theta)=\dfrac{\sum x_i}{\theta}-\dfrac{n-\sum x_i}{1-\theta}=\dfrac{\sum x_i}{\theta(1-\theta)}-\dfrac{n}{1-\theta}=\dfrac{1}{1-\theta}\left(\dfrac{\sum x_i}{\theta}-n\right)=\dfrac{n}{1-\theta}\left(\dfrac{\bar{x}}{\theta}-1\right)$
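As a sanity check on this algebra, I compared the raw derivative with the final factored form symbolically (a quick sketch using sympy; here `s` stands for $\sum x_i$):

```python
import sympy as sp

theta, n, s = sp.symbols('theta n s', positive=True)  # s = sum of the x_i

# h(theta) = s*log(theta) + (n - s)*log(1 - theta)
h = s * sp.log(theta) + (n - s) * sp.log(1 - theta)

# Raw derivative vs. the claimed factored form (n/(1-theta)) * (xbar/theta - 1),
# with xbar = s/n substituted in
hprime = sp.diff(h, theta)
claimed = (n / (1 - theta)) * (s / n / theta - 1)

print(sp.simplify(hprime - claimed))  # 0, so the factoring checks out
```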
So the function is increasing when $h'(\theta)>0$, i.e. when $\dfrac{\bar{x}}{\theta}-1>0$, or $\theta<\bar{x}$; so, in the case of the likelihood function increasing, $\hat{\theta}=\max\left(\frac{1}{2},\bar{x}\right)$ maximizes it.
Whereas, when $h'(\theta)<0$, we have $\theta>\bar{x}$, and for such a decreasing function the maximum is attained at the lowest value of $\theta$, which is $\bar{x}$; hence $\hat{\theta}=\bar{x}$.
I don't really know where I went wrong here; can anyone help?