All Questions (116 questions)
0 votes · 1 answer · 34 views
Tailsum Formula and Indicator Functions
In my probability theory class we proved that $$\mathbb{E}[X]=\int_0^\infty \mathbb{P}(X>t)\, dt,$$ where $X\geq0$ is a non-negative random variable and $\mathbb{E}[X]:= \int_\Omega X(\omega) d\...
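A quick numerical sanity check of the tail-sum formula (my sketch, not part of the question): for $X \sim \mathrm{Exp}(\lambda)$ we have $\mathbb{P}(X>t)=e^{-\lambda t}$ and $\mathbb{E}[X]=1/\lambda$, so the tail integral can be approximated directly and compared with the known mean.

```python
import math

# Numerically check E[X] = ∫_0^∞ P(X > t) dt for X ~ Exp(rate),
# where P(X > t) = exp(-rate * t) and E[X] = 1 / rate.
# (Illustrative sketch; the truncation point and grid size are my choices.)

def tail_integral(rate, upper=50.0, n=200_000):
    """Trapezoidal approximation of ∫_0^upper exp(-rate * t) dt."""
    h = upper / n
    total = 0.5 * (1.0 + math.exp(-rate * upper))
    for i in range(1, n):
        total += math.exp(-rate * i * h)
    return total * h

rate = 2.0
print(tail_integral(rate), 1.0 / rate)  # both ≈ 0.5
```

The truncation at `upper=50` is harmless here because the exponential tail beyond that point contributes less than $e^{-100}$.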
1 vote · 0 answers · 39 views
Is every probability mass function $f_X$ of a random variable $X$ the Radon-Nikodym derivative of $X_*P$ with respect to the counting measure?
Let $X$ be a discrete random variable (meaning $\text{im}(X)$ is countable) from a probability triple $(A,\mathcal{A},P)$ to a measurable space $(\mathbb{R},\mathcal{B})$ where $\mathcal{B}$ is the ...
0 votes · 0 answers · 43 views
Is it possible to define $L^p$ spaces using a non-sigma-finite measure space and a Banach space?
Most often (at least in probability), one defines the $L^p$ space as
Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space and let $p\geq 1$ be a real number. Then
$$
L^p(\Omega, \mathcal{F},...
0 votes · 1 answer · 156 views
Defining the expectation of a measurable function with respect to a (non-probability) measure
The typical definition of expectation requires a probability space and a random variable
Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space, $(\mathsf{X}, \mathcal{X})$ be a measurable ...
2 votes · 1 answer · 125 views
Reconciling uniform integrability definitions and their relation to tightness
Take a family $F$ of measurable functions on $\mathbb{R}$; then $F$ is uniformly integrable if:
Definition 1 (Royden): For each $\epsilon >0$, there is a $\delta>0$ such that for each $f\in F$, ...
0 votes · 1 answer · 37 views
A question on the Monge formula (optimal transport)
I have started reading optimal transport from the book "Optimal Transport for Applied Mathematicians" and I have a question regarding change of variables in Monge's formulation.
We can ...
3 votes · 2 answers · 129 views
Does the collection of bounded continuous functions characterize probability law?
Let $X,Y :(\Omega,\mathcal{A},\mathbb{P}) \to \mathbb{R}$ be two r.v.'s defined on a probability space $\Omega$, and $C_b(\mathbb{R})$ the collection of all real bounded continuous functions. I'm ...
0 votes · 1 answer · 92 views
Integrating with respect to a probability measure defined by $P=\lambda\cdot f$
Let $(\mathbb{R},\mathcal{B})$ be the Borel measurable space and $P=\lambda\cdot f$, where $f:\mathbb{R}\to\mathbb{R}$, $f(x)=1_{(0,1)}(x)$, $x\in \mathbb{R}$, is the uniform density. Let $X,Y$ be
two real random variables defined ...
2 votes · 1 answer · 383 views
Functions that are integrable with respect to all probability measures [duplicate]
I am interested in knowing which measurable functions are integrable with respect to all probability measures, i.e., all $f$ for which:
$$ \int | f | ~{\rm d}\mathbb{P} < \infty,$$
where $\mathbb{...
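A numerical illustration of the boundary of this question (my sketch, not the asker's): a bounded measurable $f$ satisfies $\int |f|\,d\mathbb{P} \leq \sup|f| < \infty$ for every probability measure $\mathbb{P}$, while an unbounded $f$ such as $f(x)=x$ fails for some measure, e.g. the standard Cauchy law, whose truncated absolute first moment $\int_{-T}^{T} |x|\,\frac{dx}{\pi(1+x^2)} = \log(1+T^2)/\pi$ grows without bound.

```python
import math

# Truncated absolute mean of the standard Cauchy distribution,
# ∫_{-T}^{T} |x| / (π (1 + x²)) dx, which equals log(1 + T²)/π
# and diverges as T → ∞.  (Grid size is my choice.)

def truncated_cauchy_abs_mean(T, n=100_000):
    """Trapezoidal approximation of ∫_{-T}^{T} |x| cauchy_pdf(x) dx."""
    h = 2 * T / n
    def g(x):
        return abs(x) / (math.pi * (1 + x * x))
    total = 0.5 * (g(-T) + g(T))
    for i in range(1, n):
        total += g(-T + i * h)
    return total * h

for T in (10, 100, 1000):
    print(T, truncated_cauchy_abs_mean(T))  # grows like log(1 + T²)/π
```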
-2 votes · 2 answers · 272 views
Examples and clarification about the integration with respect to a measure
I'm having a hard time understanding what it really means when they say "integration with respect to a measure". I mean an expression like
$$\int f\ \text{d}\mu$$
These are my doubts:
What is $...
0 votes · 1 answer · 52 views
$\left|\int^{T \theta}_0 \frac{\sin x}{x} dx\right|$ is bounded above by $\sup_y \int^y_0 \frac{\sin x}{x}dx$
Basically, we need to prove $\left|\int^{T \theta}_0 \frac{\sin x}{x} dx\right|$ is bounded above by $\sup_y \int^y_0 \frac{\sin x}{x}dx < \infty$. This is to utilize the dominated convergence theorem ...
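A numerical sketch of the boundedness claim (assumptions mine, not from the post): the sine integral $\mathrm{Si}(y)=\int_0^y \frac{\sin x}{x}\,dx$ increases to its global maximum at $y=\pi$ ($\mathrm{Si}(\pi)\approx 1.8519$) and then oscillates toward the limit $\pi/2$, so $\sup_y \mathrm{Si}(y)=\mathrm{Si}(\pi)<\infty$ dominates every partial integral.

```python
import math

# Trapezoidal approximation of Si(y) = ∫_0^y sin(x)/x dx,
# using sin(x)/x → 1 as x → 0 to handle the endpoint.

def si(y, n=100_000):
    """Approximate the sine integral Si(y)."""
    h = y / n
    def f(x):
        return math.sin(x) / x if x != 0.0 else 1.0
    total = 0.5 * (f(0.0) + f(y))
    for i in range(1, n):
        total += f(i * h)
    return total * h

for y in (math.pi, 10.0, 50.0, 200.0):
    print(y, si(y))  # every value stays below Si(pi) ≈ 1.8519
```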
1 vote · 0 answers · 19 views
On the sum of i.i.d. $(g_n)_{n=1}^\infty$
Let $(\Omega, \mathcal{A}, P)$ be a probability space.
Let $(g_n)_{n=1}^\infty$ be an i.i.d. sequence of random variables $g_n:\Omega \to \mathbb{R}$, each following the standard normal distribution.
It is ...
0 votes · 0 answers · 90 views
What exactly is $dy$ in $\mu(dy)$ when we do integration in measure theory? Question about Markov kernel
I'm reading this book on Markov chains: http://members.unine.ch/michel.benaim/perso/MarkovbookFinal120421.pdf
and on page 13, we consider a separable metric space $M$ and its Borel sigma algebra $\...
2 votes · 1 answer · 42 views
Example of a converging sequence of random variables such that $E[X_n] \rightarrow E[X]$ but not $E[|X_n|] \rightarrow E[|X|]$
I have to find a sequence of random variables $(X_n)_n$ which converges in probability to $X$, such that $E[X_n] \rightarrow E[X]$ but $E[|X_n|]\not\rightarrow E[|X|]$.
I'm honestly quite lost on ...
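A standard counterexample of this kind (my sketch, not from the post) takes $X_n = +n$ with probability $\frac{1}{2n}$, $-n$ with probability $\frac{1}{2n}$, and $0$ otherwise. Then $\mathbb{P}(X_n \neq 0)=\frac1n \to 0$, so $X_n \to 0 =: X$ in probability; $E[X_n]=0=E[X]$ for all $n$; yet $E[|X_n|]=1 \not\to 0 = E[|X|]$.

```python
# Compute the exact moments of the discrete counterexample
#   X_n = +n w.p. 1/(2n),  -n w.p. 1/(2n),  0 otherwise.

def moments(n):
    support = [(n, 1 / (2 * n)), (-n, 1 / (2 * n)), (0, 1 - 1 / n)]
    mean = sum(x * p for x, p in support)
    abs_mean = sum(abs(x) * p for x, p in support)
    prob_nonzero = sum(p for x, p in support if x != 0)
    return mean, abs_mean, prob_nonzero

for n in (1, 10, 100, 1000):
    print(n, moments(n))  # mean = 0, abs mean = 1, P(X_n != 0) = 1/n
```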
0 votes · 0 answers · 89 views
Change of coordinates in Shannon entropy
Let's say I have a probability distribution $f(x)$ for the variable $x$. With a change of variables to $y$, I can find the probability distribution for $y$: $g(y) = f(x)\left|\frac{dx}{dy}\right|$.
The ...
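A small numerical illustration of the non-invariance this question touches on (my example, not the poster's): for $Y=aX$ the transformed density is $g(y)=f(y/a)/|a|$, and the differential entropy shifts by $\log|a|$. With $X \sim U(0,1)$ (so $h(X)=0$) and $a=2$, $Y \sim U(0,2)$ and $h(Y)=\log 2$.

```python
import math

# Midpoint-rule approximation of differential entropy
# h = -∫_a^b p(y) log p(y) dy for a density supported on [a, b].

def diff_entropy(pdf, a, b, n=100_000):
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        y = a + (i + 0.5) * h
        p = pdf(y)
        if p > 0:
            total -= p * math.log(p) * h
    return total

# X ~ U(0,1): f(x) = 1 on (0,1), so h(X) = 0.
print(diff_entropy(lambda x: 1.0, 0, 1))  # ≈ 0
# Y = 2X ~ U(0,2): g(y) = f(y/2) * |dx/dy| = 1/2, so h(Y) = log 2.
print(diff_entropy(lambda y: 0.5, 0, 2))  # ≈ 0.6931
```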