
All Questions

1 vote
0 answers
57 views

Joint density of two functions of a uniformly distributed random variable

I'd like to work out $\operatorname{Cov}(\cos(2U), \cos(3U))$ where $U$ is uniformly distributed on $[0, \pi]$. I believe this involves computing $\mathbb{E}[\cos(2U)\cos(3U)]$. If so, then I first ...
johnsmith • 345
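A quick aside on this one: by the product-to-sum identity, $\cos(2u)\cos(3u) = \tfrac{1}{2}(\cos u + \cos 5u)$, and each term integrates to zero over $[0,\pi]$, as do $\cos(2u)$ and $\cos(3u)$ individually, so the covariance is exactly $0$. A Monte Carlo sketch (my own check, not from the post):

```python
import numpy as np

# Monte Carlo check of Cov(cos(2U), cos(3U)) for U ~ Uniform[0, pi].
# Product-to-sum: cos(2u)cos(3u) = (cos(u) + cos(5u)) / 2, and every
# term integrates to 0 over [0, pi], so the covariance is exactly 0.
rng = np.random.default_rng(0)
u = rng.uniform(0.0, np.pi, size=1_000_000)
cov = np.cov(np.cos(2 * u), np.cos(3 * u))[0, 1]
print(cov)  # close to 0
```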
7 votes
2 answers
799 views

Is the expectation of a random vector multiplied by its transpose equal to the product of the expectation of the vector and that of the transpose

I'm taking a course in advanced statistics and we have to prove whether the following expression is true: $E[zz^T]=E[z]E[z^T]$. I am assuming it is not, since the formula of the covariance matrix is $...
ghost wizard
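The difference $E[zz^T] - E[z]E[z^T]$ is exactly the covariance matrix of $z$, so equality holds only for a degenerate (constant) vector. A numerical counterexample sketch (distributions are my own choice):

```python
import numpy as np

# For z ~ N(0, I), E[z z^T] = I but E[z] E[z]^T = 0:
# the gap between them is exactly Cov(z).
rng = np.random.default_rng(1)
z = rng.normal(size=(200_000, 2))                 # rows are independent draws of z
E_zzT = z.T @ z / len(z)                          # sample estimate of E[z z^T]
outer = np.outer(z.mean(axis=0), z.mean(axis=0))  # sample estimate of E[z] E[z]^T
print(E_zzT)   # ≈ identity matrix
print(outer)   # ≈ zero matrix
```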
8 votes
2 answers
182 views

Is there anything interesting to be taken from the fact that $E[(X-E[X])(Y-E[X])] = E[(X-E[X])(Y-E[Y])]$?

While playing around with the formula for covariance, I discovered something I wasn't expecting. Replacing the $E[Y]$ in the definition of covariance with an $E[X]$ appears to simplify back down to ...
amonaether
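There is a small identity behind this: since $X - E[X]$ has mean zero, $E[(X-E[X])(Y-c)] = \operatorname{Cov}(X,Y)$ for any constant $c$, and $c = E[X]$ is just one instance. A numerical illustration (my own sketch):

```python
import numpy as np

# Since X - E[X] has mean zero, subtracting ANY constant c from Y
# leaves E[(X - E[X])(Y - c)] equal to Cov(X, Y); c = E[X] is one case.
rng = np.random.default_rng(2)
x = rng.normal(size=500_000)
y = 2.0 + 0.5 * x + rng.normal(size=500_000)    # E[Y] = 2, Cov(X, Y) = 0.5
lhs = np.mean((x - x.mean()) * (y - x.mean()))  # E[X] subtracted in both places
rhs = np.mean((x - x.mean()) * (y - y.mean()))  # the usual covariance
print(lhs, rhs)  # both ≈ 0.5
```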
1 vote
1 answer
94 views

van der Vaart Asymptotic Statistics, page 38, why does $e_\theta'=\operatorname{Cov}_{\theta}t(X)$?

On Page 38 of van der Vaart's Asymptotic Statistics (near the bottom of the page), it says By differentiating $E_\theta t(X)$ under the expectation sign (which is justified by the lemma), we see that ...
ExcitedSnail • 2,966
1 vote
1 answer
49 views

Variance of $X + \alpha^\top Y$ where $X$ is a scalar random variable and $Y$ is a random vector [duplicate]

Consider a scalar random variable $X\in\mathbb{R}$, a vector random variable $Y\in\mathbb{R}^n$ and a constant (non-random) vector $\alpha\in\mathbb{R}^n$. I want to compute $$ \mathbb{V}[X + \alpha^\...
Physics_Student
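For reference, the standard expansion here is $\mathbb{V}[X + \alpha^\top Y] = \mathbb{V}[X] + \alpha^\top \operatorname{Cov}(Y)\,\alpha + 2\,\alpha^\top \operatorname{Cov}(Y, X)$. A numerical sanity check under illustrative distributions of my own choosing:

```python
import numpy as np

# Check:  Var(X + a^T Y) = Var(X) + a^T Cov(Y) a + 2 a^T Cov(Y, X)
rng = np.random.default_rng(3)
n = 500_000
y = rng.normal(size=(n, 3))
x = y @ np.array([1.0, -0.5, 0.2]) + rng.normal(size=n)  # X correlated with Y
a = np.array([0.3, 1.0, -2.0])                           # constant vector alpha

direct = np.var(x + y @ a)                               # simulate the sum directly
cov_y = np.cov(y, rowvar=False)                          # 3x3 matrix Cov(Y)
cov_yx = np.array([np.cov(y[:, j], x)[0, 1] for j in range(3)])  # vector Cov(Y, X)
formula = np.var(x) + a @ cov_y @ a + 2 * a @ cov_yx
print(direct, formula)  # the two values agree
```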
1 vote
1 answer
38 views

Prove covariance between sufficient statistic and logarithm of base measure in exponential family is equal to zero

Exponential family form is $$f_X(x) = h(x)\exp(\eta(\theta)\cdot T(x) - A(\theta))$$ I know $$\operatorname{Cov}(T(x), \log(h(x))) = 0.$$ But how can I prove it?
user388375
0 votes
1 answer
32 views

Joint Hypothesis Testing-Variance

I got the following question, and here is the provided answer. I am confused about where the $9$ coefficient in the answer comes from. Any thoughts?
user384212
3 votes
1 answer
127 views

Sign of Correlation between $X$ and $f(X)$ for strictly monotonic $f$

This question is a follow up to this question. Suppose $f$ is strictly increasing. Can we say $$\text{Cov}(X,f(X))\geq 0?$$ Ben's answer on the aforementioned linked post can be extended to show the ...
Golden_Ratio
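The answer is yes: with $X'$ an independent copy of $X$, one can write $2\operatorname{Cov}(X, f(X)) = E[(X - X')(f(X) - f(X'))]$, and the integrand is nonnegative whenever $f$ is increasing. A quick empirical sketch on a few increasing functions (distribution of $X$ is my own choice):

```python
import numpy as np

# For strictly increasing f, 2 Cov(X, f(X)) = E[(X - X')(f(X) - f(X'))]
# with X' an independent copy of X; the integrand is always >= 0.
rng = np.random.default_rng(4)
x = rng.standard_normal(200_000)
covs = [np.cov(x, f(x))[0, 1] for f in (np.exp, np.arctan, lambda t: t**3)]
print(covs)  # all strictly positive
```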
4 votes
1 answer
143 views

Sign of Correlation between $X$ and $\log X$

Suppose $\text{supp}(X)\subseteq \mathbb{R}_{\geq 1}.$ Can we say $$\text{Cov}(X,\log X)\geq 0?$$ On one hand, we can say by monotonicity of log and Jensen's inequality that $$X\geq E[X]\implies \log ...
Golden_Ratio
1 vote
1 answer
161 views

Calculate expectation of a function with two dependent random variables

Hi Cross Validated community, my question concerns the expectation of a product of two dependent random variables. Assume there are two random variables, one discrete: $G \in \{...
dev85 • 13
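The excerpt truncates the setup, but the usual tool for such products is the law of total expectation: $E[GX] = \sum_g P(G=g)\, g\, E[X \mid G=g]$. An illustrative sketch (the Bernoulli $G$ and conditional normals below are my assumptions, not from the post):

```python
import numpy as np

# Law of total expectation for a product of dependent variables:
#   E[G X] = sum_g P(G = g) * g * E[X | G = g]
# Illustrative setup: G ~ Bernoulli(0.3), X | G=1 ~ N(2, 1), X | G=0 ~ N(-1, 1).
rng = np.random.default_rng(9)
n = 1_000_000
g = rng.binomial(1, 0.3, size=n)                 # P(G = 1) = 0.3
x = rng.normal(loc=np.where(g == 1, 2.0, -1.0))  # X depends on G
mc = np.mean(g * x)                              # direct Monte Carlo estimate
exact = 0.3 * 1 * 2.0                            # only the g = 1 term contributes
print(mc, exact)  # ≈ 0.6
```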
2 votes
1 answer
62 views

Expectation given pairwise covariances

I have 4 variables A,B,C,D over {-1,1} (Rademacher variables) and know that ...
Maruney • 21
-1 votes
1 answer
53 views

$\operatorname{Cov}(X,f(X))\neq 0$ and $E(X f(X))\neq 0$

Take a random variable $X$. Is it true that (1) $\operatorname{Cov}(X,f(X))\neq 0$ for any function $f$? (2) $E(X f(X))\neq 0$ for any function $f$? I believe the answer to both questions is no. However, can you ...
Star • 889
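Both can indeed fail: take $X$ symmetric about $0$ and $f$ even, e.g. $X \sim N(0,1)$ with $f(x) = x^2$, so $\operatorname{Cov}(X, X^2) = E[X^3] = 0$ and $E[Xf(X)] = E[X^3] = 0$ as well. A numerical check of this counterexample:

```python
import numpy as np

# Counterexample: for symmetric X and even f, Cov(X, f(X)) = 0.
# With X ~ N(0, 1) and f(x) = x^2:  Cov(X, X^2) = E[X^3] = 0,
# and E[X f(X)] = E[X^3] = 0 too.
rng = np.random.default_rng(5)
x = rng.standard_normal(1_000_000)
print(np.cov(x, x**2)[0, 1], np.mean(x * x**2))  # both ≈ 0
```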
1 vote
0 answers
25 views

Mean and covariance matrix of AR(1) [closed]

Assume I have price data $p_t$, and I fitted the AR(1) model $p_t = \alpha + \beta p_{t-1} + e_t$, ...
A.F.R.S2022
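For the stationary case $|\beta| < 1$, the standard results are $E[p_t] = \alpha/(1-\beta)$, $\operatorname{Var}(p_t) = \sigma^2/(1-\beta^2)$, and $\operatorname{Cov}(p_t, p_{t-h}) = \beta^h \operatorname{Var}(p_t)$. A simulation sketch with illustrative parameter values of my own choosing:

```python
import numpy as np

# Stationary AR(1):  p_t = alpha + beta * p_{t-1} + e_t,  |beta| < 1.
#   E[p_t]   = alpha / (1 - beta)
#   Var(p_t) = sigma2 / (1 - beta**2)
#   Cov(p_t, p_{t-h}) = beta**h * Var(p_t)
rng = np.random.default_rng(6)
alpha, beta, sigma2 = 1.0, 0.8, 1.0
n = 1_000_000
e = rng.normal(scale=np.sqrt(sigma2), size=n)
p = np.empty(n)
p[0] = alpha / (1 - beta)            # start at the stationary mean
for t in range(1, n):
    p[t] = alpha + beta * p[t - 1] + e[t]
print(p.mean())                      # ≈ 1 / 0.2 = 5.0
print(p.var())                       # ≈ 1 / (1 - 0.64) ≈ 2.78
print(np.cov(p[1:], p[:-1])[0, 1])   # ≈ 0.8 * 2.78 ≈ 2.22 (lag-1 autocovariance)
```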
4 votes
1 answer
539 views

Some thoughts about independence and orthogonality; please comment if this is wrong

It seems that linear independence is totally different from the concept of independence of random variables. Orthogonality of non-zero vectors must imply linear independence. In statistics, the relation of ...
LJNG • 331
0 votes
0 answers
45 views

Covariance of some random variables

I am given $2n-1$ random variables, namely $X_1, X_2, \dots, X_n, X_{n+1}, \dots, X_{2n-1}$. I also have $E(X_i)=\mu$ and $\operatorname{Var}(X_i)=\sigma^2$ for $i=1,2,\dots,2n-1$. Suppose $Y=X_1+X_2+\dots+X_n$ and $W=X_n+X_{n+1}+\dots+X_{2n-1}$ and I am asked to calculate ...
python noob
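A sketch for the likely intended quantity, assuming additionally that the $X_i$ are independent (the excerpt only states the common mean and variance): $Y$ and $W$ share the single term $X_n$, so $\operatorname{Cov}(Y, W) = \operatorname{Var}(X_n) = \sigma^2$. Numerical check with illustrative parameters:

```python
import numpy as np

# Assuming independent X_i with common mean mu and variance sigma2:
# Y = X_1 + ... + X_n and W = X_n + ... + X_{2n-1} overlap only in X_n,
# so Cov(Y, W) = Var(X_n) = sigma2.
rng = np.random.default_rng(7)
n, reps, sigma2 = 5, 1_000_000, 4.0
X = rng.normal(loc=2.0, scale=np.sqrt(sigma2), size=(reps, 2 * n - 1))
Y = X[:, :n].sum(axis=1)          # X_1 + ... + X_n
W = X[:, n - 1:].sum(axis=1)      # X_n + ... + X_{2n-1}
print(np.cov(Y, W)[0, 1])         # ≈ sigma2 = 4.0
```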
