All Questions

1 vote
0 answers
34 views

I don't think this is conditional dependence, so what is it?

I am looking for the name of the following phenomenon. There are three random variables, $X,Y,Z$. We have $P(X,Y) \neq P(X)P(Y)$ and $P(Y,Z) \neq P(Y)P(Z)$. In other words, $X$ and $Y$ are dependent, ...
Wapiti • 111
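A hedged illustration, not from the question itself: the described pattern can hold even when $X$ and $Z$ are independent, since pairwise dependence is not transitive. For example, take $X, Z \overset{\text{iid}}{\sim} \mathrm{Bernoulli}(1/2)$ and $Y = X + Z$: then $P(X,Y) \neq P(X)P(Y)$ and $P(Y,Z) \neq P(Y)P(Z)$, yet $P(X,Z) = P(X)P(Z)$.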
5 votes
1 answer
104 views

$E[(X+Y)^{a}] > E[X^{a}]$?

Assume I have two strictly positive i.i.d. random variables, $X$ and $Y$. Under what conditions is the following inequality true? $$E[(X+Y)^{a}] > E[X^{a}], \hspace{2mm} a \in (0,1)$$ Should have ...
econ_ugrad
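A sketch under the stated assumptions (both variables strictly positive, $a \in (0,1)$, and $E[X^{a}] < \infty$): since $X + Y > X$ almost surely and $t \mapsto t^{a}$ is strictly increasing on $(0,\infty)$, $$(X+Y)^{a} > X^{a} \ \text{a.s.} \quad\Longrightarrow\quad E[(X+Y)^{a}] > E[X^{a}],$$ so no condition beyond integrability appears to be needed; the i.i.d. assumption plays no role here.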
0 votes
0 answers
47 views

Expectation of $u^\top(u+Ax)$, when $A$ and $u$ are nonlinear functions of $x$

Let $x\in\mathbb R^d$, and $s=\operatorname{softmax}(x)$. Let $y$ be a fixed one-hot vector such that $$u = s-y \\ v =(\operatorname{diag}(s) - ss^\top)x$$ I am interested in the inequality $u^\top (u ...
Phoenix • 101
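A hedged sketch, not from the question: a few NumPy lines to make the setup concrete, assuming (as the title suggests) that $A = \operatorname{diag}(s) - ss^\top$ is the softmax Jacobian, so that $v = Ax$; the dimension, seed, and one-hot index are arbitrary placeholders.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)   # arbitrary seed, illustrative only
d = 5                            # arbitrary dimension
x = rng.normal(size=d)
s = softmax(x)

y = np.zeros(d)                  # a fixed one-hot vector (index 0 is arbitrary)
y[0] = 1.0

u = s - y
A = np.diag(s) - np.outer(s, s)  # assumed: the softmax Jacobian, so v = A @ x
v = A @ x

print(u @ (u + v))               # the scalar u^T (u + A x) from the title
```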
2 votes
1 answer
132 views

Expected value of the product of three random variables

For two dependent random variables we have $$\operatorname{Cov}[X, Y] = E[XY] - E[X]E[Y],$$ so that $E[XY] = E[X]E[Y] + \operatorname{Cov}[X, Y]$. In the case of three arbitrarily correlated random variables $(X, Y, Z)$, is it possible ...
Stefano Lombardi
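A hedged note on a standard identity that may be the target here (writing $\mu_X = E[X]$, etc.): expanding the third central moment and rearranging gives $$E[XYZ] = \mu_X\mu_Y\mu_Z + \mu_X\operatorname{Cov}[Y, Z] + \mu_Y\operatorname{Cov}[X, Z] + \mu_Z\operatorname{Cov}[X, Y] + E[(X-\mu_X)(Y-\mu_Y)(Z-\mu_Z)],$$ so the pairwise covariances alone do not pin down $E[XYZ]$; a genuinely trivariate term remains.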
0 votes
0 answers
43 views

How to calculate the expectation of the ratio of non-independent random variables?

How can I calculate this expectation: $$ E \left [ \frac{\sum_{t=1}^T{Z_tX_t}}{\sum_{t=1}^T{Z_t^2}} \right ] $$ where $Z_t \sim N(0,1)$ and $X_t \sim N(0,1)$ are independent? Any tricks? Is it ...
PaulG • 1,297
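One standard trick, assuming the $X_t$ are independent of all the $Z_t$ and $T \geq 2$ (so the ratio is integrable): condition on $Z_1, \dots, Z_T$ and use $E[X_t] = 0$: $$ E\left[\frac{\sum_t Z_t X_t}{\sum_t Z_t^2}\right] = E\left[\frac{\sum_t Z_t\,E[X_t]}{\sum_t Z_t^2}\right] = 0. $$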
1 vote
1 answer
80 views

Properties of an expectation for a non-negative random variable

Say I have a non-negative discrete random variable $X$ (values of $X$ can be mapped to the integers $\{0, \ldots, 2^n - 1\}$ for $n \in \mathbb{Z}$) and an associated distribution $P(X)$. Given a non-negative scalar ...
Manas Sajjan
1 vote
2 answers
107 views

The training error of best hypothesis

Let $\mathcal{X}$ and $\mathcal{Y}$ denote the domain set and label set respectively. Also let $\mathcal{D}$ be a distribution over $\mathcal{X}$ and $f:\mathcal{X} \to \mathcal{Y}$ be the true ...
S.H.W • 67
1 vote
0 answers
57 views

Joint density of two functions of a uniformly distributed random variable

I'd like to work out $\operatorname{Cov}(\cos(2U), \cos(3U))$ where $U$ is uniformly distributed on $[0, \pi]$. I believe this involves computing $\mathbb{E}[\cos(2U)\cos(3U)]$. If so, then I first ...
johnsmith • 345
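A worked sketch for this particular pair: with the product-to-sum identity $\cos(2u)\cos(3u) = \tfrac12\big(\cos u + \cos 5u\big)$ and density $1/\pi$ on $[0, \pi]$, $$ \mathbb{E}[\cos(2U)\cos(3U)] = \frac{1}{\pi}\int_0^\pi \tfrac12\big(\cos u + \cos 5u\big)\,du = 0, $$ and likewise $\mathbb{E}[\cos(2U)] = \mathbb{E}[\cos(3U)] = 0$, so $\operatorname{Cov}(\cos(2U), \cos(3U)) = 0$ here, with no need for the full joint density.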
0 votes
0 answers
47 views

Order of centroids across two independent random variables

Say I have a function $q_{D, \Theta}: \mathcal{X} \to \mathcal{Y}$ that depends on independent random variables $D$ and $\Theta$. I want to consider "centroids" of $q$ with respect to ...
ngmir • 339
3 votes
1 answer
119 views

Expectation of product of sample averages

I have a bunch of iid random variables $X_i\sim q$ and I have defined other random variables $A_i = a(X_i)$ and $B_i = b(X_i)$. Then I bumped into the following expression $$ \begin{align} \mathbb{E}\...
Euler_Salter • 2,236
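If the truncated expression is the product of the two sample averages (an assumption; the excerpt cuts off mid-formula), splitting the double sum into diagonal and off-diagonal terms gives $$ \mathbb{E}\Big[\Big(\tfrac{1}{n}\sum_i A_i\Big)\Big(\tfrac{1}{n}\sum_j B_j\Big)\Big] = \tfrac{1}{n}\,\mathbb{E}[a(X_1)b(X_1)] + \tfrac{n-1}{n}\,\mathbb{E}[a(X_1)]\,\mathbb{E}[b(X_1)], $$ using that $A_i$ and $B_j$ are independent for $i \neq j$.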
1 vote
1 answer
105 views

Expectation of the reciprocal of a standard normal random variable [duplicate]

If $X_1, \dots, X_n \overset{\text{iid}}{\sim} \mathcal{N}(\mu, 1)$, then we know that the sample mean satisfies $\bar{X} \sim \mathcal{N}(\mu, 1/n)$. How would we show that $$\mathbf{E}\left(\frac{1}{\bar{X}}\right) = \infty$$ and ...
delta_99
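One common route: the density $f_{\bar X}$ of $\mathcal{N}(\mu, 1/n)$ is continuous and strictly positive at $0$, so $f_{\bar X} \geq c > 0$ on some $[-\varepsilon, \varepsilon]$, and $$ \mathbf{E}\left|\frac{1}{\bar X}\right| \;\geq\; \int_{-\varepsilon}^{\varepsilon} \frac{f_{\bar X}(x)}{|x|}\,dx \;\geq\; c\int_{-\varepsilon}^{\varepsilon}\frac{dx}{|x|} \;=\; \infty, $$ so the integral defining the expectation diverges.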
1 vote
1 answer
164 views

Show that for a random variable $X$ taking values in $\mathbb{N} = \{1, 2, \ldots\}$, $E(X) = \sum_{n = 1}^\infty P(X \geq n)$ [duplicate]

Prove that for a random variable taking values in the natural numbers $\{1, 2, \ldots\}$, the expected value $E(X)$ equals $\sum_{n = 1}^\infty P(X \geq n)$. Is this the mathematically correct way to prove it? And ...
Ste0l • 45
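A sketch of the standard argument, swapping the order of summation (valid by Tonelli since all terms are nonnegative): $$ \sum_{n=1}^\infty P(X \geq n) = \sum_{n=1}^\infty \sum_{k=n}^\infty P(X = k) = \sum_{k=1}^\infty \sum_{n=1}^{k} P(X = k) = \sum_{k=1}^\infty k\,P(X = k) = E(X). $$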
0 votes
1 answer
113 views

"Almost surely" used in an expectation

Let $(\mathsf{X}, \mathcal{X})$ be a measurable space, $\pi(dx)$ be a probability measure on it, and $K:\mathsf{X}\times\mathcal{X}\to[0, 1]$ be a Markov kernel. I have the following property $$ \int K(x, A) \...
Physics_Student
1 vote
2 answers
228 views

Understand the linearity of the expected value operator

Let $Z = X + Y$. The linearity of $\mathsf E$ implies that $\mathsf E [Z] = \mathsf E [X] + \mathsf E [Y]$. The left-hand side should be $\int (x+y)p_{x+y}(x+y)d(x+y)$. The right-hand side should be $\...
Dave Ray
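A sketch of the usual resolution: both sides should be computed against the joint density $p_{X,Y}$ rather than a density "of $x+y$": $$ \mathsf E[X+Y] = \iint (x+y)\,p_{X,Y}(x,y)\,dx\,dy = \iint x\,p_{X,Y}(x,y)\,dx\,dy + \iint y\,p_{X,Y}(x,y)\,dx\,dy = \mathsf E[X] + \mathsf E[Y], $$ where integrating out the other variable recovers each marginal; no independence is required.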
3 votes
1 answer
134 views

Why do we weight the surprisals by the probabilities when computing the entropy?

Let's say that a random variable $X$ takes the values $(x_1, x_2, \dots, x_n)$ over the sample space $\Omega$. As far as my understanding goes, the entropy of a given variable is meant to give an ...
Mehdi Charife
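One compact way to see it: the entropy is by definition the expected surprisal, and an expectation over a discrete distribution weights each value by its probability: $$ H(X) = \mathbb{E}\big[-\log p(X)\big] = \sum_{i=1}^{n} p(x_i)\,\log\frac{1}{p(x_i)}. $$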
