
I was trying to resolve this exercise:

[Image: statement of the exercise from the book]

This exercise is from the book "Statistical Inference, Second Edition" by Casella and Berger. Checking the solutions manual, I could follow the solution, but I can't figure out why the operation highlighted in yellow is equal to one. My impression is that the highlighted value depends on the sample, but the solutions manual is unlikely to be wrong. Can someone explain that part? Here's the solution:

[Image: solution from the manual, with the step in question highlighted in yellow]

  • You don't really need all three of maximum-likelihood, likelihood and likelihood-ratio as tags. Since this is routine bookwork I'd consider replacing at least one of them with self-study (q.v.)
    – Glen_b
    Commented Mar 5, 2015 at 12:07
  • I didn't know about the self-study tag and its usefulness. I've just noticed it. Thanks for the information.
    – CreamStat
    Commented Mar 5, 2015 at 16:49

1 Answer


Write everything as an exponential. The key step is that the solution substitutes the MLEs $\hat\mu=-n/\sum_i\log(x_i)$, $\hat\theta=-m/\sum_i\log(y_i)$ and, under the restricted model, $\hat\theta_0=-(n+m)/\left[\sum_i\log(x_i)+\sum_i\log(y_i)\right]$; with these values the exponent collapses:

\begin{align*}
\left(\prod_{i=1}^n x_i\right)^{\hat\theta_0-\hat\mu}\left(\prod_{i=1}^m y_i\right)^{\hat\theta_0-\hat\theta}
&=\exp\left\{(\hat\theta_0-\hat\mu)\sum_i\log(x_i)+(\hat\theta_0-\hat\theta)\sum_i\log(y_i)\right\}\\
&=\exp\left\{\dfrac{-(n+m)\sum_i\log(x_i)}{\sum_i\log(x_i)+\sum_i\log(y_i)}+n+\dfrac{-(n+m)\sum_i\log(y_i)}{\sum_i\log(x_i)+\sum_i\log(y_i)}+m\right\}\\
&=\exp\left\{\dfrac{-(n+m)\left[\sum_i\log(x_i)+\sum_i\log(y_i)\right]}{\sum_i\log(x_i)+\sum_i\log(y_i)}+n+m\right\}\\
&=\exp\left\{-(n+m)+n+m\right\}\\
&=1
\end{align*}
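As a quick sanity check, here is a minimal numerical sketch. It assumes the MLE forms written above (which are what the solution's algebra implies); with those forms the exponent cancels exactly, so the product is 1 for any positive data, i.e. it does not depend on the sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary positive samples in (0, 1); the cancellation does not depend
# on the particular values, only on the MLE forms used in the solution.
n, m = 7, 5
x = rng.uniform(size=n)
y = rng.uniform(size=m)

sx, sy = np.log(x).sum(), np.log(y).sum()

# MLEs as implied by the solution's algebra (assumed forms):
mu_hat = -n / sx                   # unrestricted MLE from the x-sample
theta_hat = -m / sy                # unrestricted MLE from the y-sample
theta0_hat = -(n + m) / (sx + sy)  # MLE under the restriction mu = theta

value = np.prod(x) ** (theta0_hat - mu_hat) * np.prod(y) ** (theta0_hat - theta_hat)
print(value)  # 1.0 up to floating-point rounding
```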

