Search Results
Results tagged with maximum-likelihood
Search options: questions only · not deleted · user 288217
For questions that use the method of maximum likelihood for estimating the parameters of a statistical model with given data.
2 votes · 3 answers · 2k views
Maximum likelihood estimator for uniform distribution $U(-\theta, 0)$
Consider $X_1,X_2,...,X_n$ i.i.d $U(-\theta,0)$.
I want to find the maximum likelihood estimator of $\theta$.
I know that $f(x,\theta)=\frac{1}{\theta}$ for $-\theta < x < 0$ and that $L_n(\theta, x …
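Not part of the thread, but the standard closed-form answer can be sanity-checked numerically. The likelihood $L_n(\theta)=\theta^{-n}\mathbf{1}\{\min_i X_i > -\theta\}$ is decreasing in $\theta$, so it is maximized at the smallest admissible value, $\hat\theta = -\min_i X_i$. A minimal simulation sketch (true $\theta = 2$ and the sample size are assumed values):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0  # true parameter, assumed for the simulation
x = rng.uniform(-theta, 0, size=10_000)

# L_n(t) = t^{-n} * 1{min x_i > -t} decreases in t, so it is maximized
# at the smallest t satisfying the indicator: t = -min(x).
theta_mle = -x.min()
print(theta_mle)  # close to, and always slightly below, theta = 2.0
```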
0 votes · 1 answer · 30 views
Show that the MLE divided by the true value converges to 1
Let $S \sim \operatorname{Binomial}(N,\theta)$, where $S$ can be written as $S = B_1+\dots+B_N$ with the $B_i$ i.i.d. $\operatorname{Bernoulli}(\theta)$.
I have to show that $\frac{\hat{N}_{MLE}}{N}$ converges to $1$ in probability …
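A sketch of why this holds (the specific numbers below are assumptions, not from the question): with $\theta$ known, the likelihood ratio $L(N')/L(N'-1) = \frac{N'(1-\theta)}{N'-S}$ exceeds $1$ exactly when $N' \le S/\theta$, so $\hat N_{MLE} = \lfloor S/\theta \rfloor$, and $\hat N_{MLE}/N \approx S/(N\theta) \to 1$ by the law of large numbers:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.3    # success probability, assumed known
N = 50_000     # true (unknown) number of trials, assumed for the simulation

s = rng.binomial(N, theta)
# The likelihood L(N') increases while N' <= s/theta and decreases after,
# so the MLE is floor(s / theta).
n_mle = np.floor(s / theta)
print(n_mle / N)  # close to 1 by the law of large numbers
```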
2 votes · 2 answers · 2k views
Minimum mean squared error of uniform distribution
Let $X_1,X_2,\ldots,X_n$ be i.i.d $\operatorname{Uniform}(-\theta,0)$.
Now consider all of the estimates of the form $S_\rho=\rho \hat{\theta}_\text{MLE}$.
I have to find which of these estimates has …
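A sketch of the usual calculation (simulation settings below are assumptions): with $\hat\theta_\text{MLE} = -\min_i X_i$, the moments of the maximum of $n$ uniforms on $(0,\theta)$ give $E[\hat\theta] = n\theta/(n+1)$ and $E[\hat\theta^2] = n\theta^2/(n+2)$, so $\operatorname{MSE}(\rho)$ is a quadratic in $\rho$ minimized at $\rho^* = (n+2)/(n+1)$. A Monte Carlo check:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 2.0, 10, 100_000   # assumed simulation settings

x = rng.uniform(-theta, 0, size=(reps, n))
theta_mle = -x.min(axis=1)          # MLE for each simulated sample

# MSE(rho) = rho^2 E[theta_mle^2] - 2 rho theta E[theta_mle] + theta^2
# is quadratic in rho, minimized at rho* = theta E[theta_mle] / E[theta_mle^2].
rho_star_mc = theta * theta_mle.mean() / (theta_mle ** 2).mean()
print(rho_star_mc, (n + 2) / (n + 1))  # Monte Carlo vs. closed form
```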
0 votes · 1 answer · 230 views
Finding joint likelihood function for linear regression
Let $Y_i=\alpha_0+\beta_0 X_i + \epsilon_i$, where $\epsilon_i \sim N(0, \sigma_0^2)$ and $X_i \sim N(\mu_x,\tau_0^2)$ are independent.
The data $(X_i, Y_i)$ are generated from $Y_i=\alpha_0+\beta_0 …
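Because $X_i$ and $\epsilon_i$ are independent, the joint density factors as $f(x_i,y_i)=f_X(x_i)\,f_{Y\mid X}(y_i\mid x_i)$ with $Y_i \mid X_i \sim N(\alpha_0+\beta_0 x_i,\ \sigma_0^2)$. A sketch of the resulting log-likelihood (the function and parameter names are my own, not from the thread):

```python
import numpy as np

def joint_loglik(params, x, y):
    """Joint log-likelihood of (X_i, Y_i) under the stated model:
    X_i ~ N(mu_x, tau2) independent of eps_i ~ N(0, sigma2), and
    Y_i | X_i ~ N(alpha + beta * x_i, sigma2), so the joint density
    factors as f(x, y) = f_X(x) * f_{Y|X}(y | x)."""
    alpha, beta, sigma2, mu_x, tau2 = params
    ll_x = -0.5 * np.sum(np.log(2 * np.pi * tau2) + (x - mu_x) ** 2 / tau2)
    ll_y = -0.5 * np.sum(np.log(2 * np.pi * sigma2)
                         + (y - alpha - beta * x) ** 2 / sigma2)
    return ll_x + ll_y
```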
1 vote · 1 answer · 521 views
Show that the MLE converges to the true value (consistency)
Let $Y_i=\alpha_0+\beta_0 X_i + \epsilon_i$, where $\epsilon_i \sim N(0, \sigma_0^2)$ and $X_i \sim N(\mu_x,\tau_0^2)$ are independent.
The data $(X_i, Y_i)$ are generated from $Y_i=\alpha_0+\beta_0 …
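A quick empirical look at the consistency claim (all true parameter values below are assumptions): for this Gaussian model the MLE of the slope coincides with the least-squares estimator, and its error shrinks as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(4)
alpha0, beta0, sigma0, mu_x, tau0 = 1.0, 2.0, 1.0, 0.0, 1.0  # assumed truths

def beta_mle(n):
    # With Gaussian errors, the MLE of the slope equals the
    # least-squares slope: cov(x, y) / var(x).
    x = rng.normal(mu_x, tau0, size=n)
    y = alpha0 + beta0 * x + rng.normal(0.0, sigma0, size=n)
    return np.cov(x, y, bias=True)[0, 1] / x.var()

# Estimates approach beta0 = 2.0 as n grows.
print([round(beta_mle(n), 3) for n in (100, 10_000, 1_000_000)])
```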
2 votes · 2 answers · 879 views
Maximum likelihood estimator of $\operatorname{Poisson}(\lambda)$ with restricted $\lambda$
Consider $X_1, X_2, \ldots,X_n$ iid $\operatorname{Poisson}(\lambda)$ random variables, where $\lambda \in [a,b]$, $0<a<b$.
How do you find the maximum likelihood estimator of the restricted $\lambda …
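The usual route, sketched (the bounds $a$, $b$ and the true $\lambda$ below are chosen purely for illustration, with $\lambda$ deliberately outside $[a,b]$ to show the boundary case): the log-likelihood $n(\bar x \log\lambda - \lambda) + \text{const}$ is concave with unrestricted maximizer $\bar x$, so on $[a,b]$ the MLE is $\bar x$ clipped to the interval:

```python
import numpy as np

rng = np.random.default_rng(3)
a, b = 1.0, 3.0    # assumed bounds on lambda
lam_true = 4.0     # assumed; outside [a, b] to exercise the boundary case
x = rng.poisson(lam_true, size=5_000)

# The log-likelihood is concave in lambda with unrestricted maximizer xbar,
# so restricting lambda to [a, b] clips xbar into the interval.
lam_mle = np.clip(x.mean(), a, b)
print(lam_mle)  # xbar is near 4.0, so the restricted MLE sits at b = 3.0
```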