Questions tagged [information-geometry]

Information geometry is the study of probability and statistics using the techniques of differential geometry. The area covers statistical models, the Fisher metric, $\alpha$-connections, canonical divergences, etc.

0 votes
0 answers
8 views

Parametric and non-parametric probability distributions

Non-parametric Statistical Models on Finite and Infinite Measure Spaces. Consider the following sets of probability densities. In Amari it says: Consider a family $S$ of probability distributions on $X$...
Andyale • 181
0 votes
0 answers
25 views

Non-parametric and parametric statistical manifolds: Equivalence of score functions in tangent spaces [closed]

Below is the framework to give the manifold structure to the space $M_{\mu}=\{f \in L^{1}(\mu): f>0\ \mu\text{-a.e.},\ \int f \, d\mu=1\}$. Statistical Model and its Topology: The Statistical Model and ...
Andyale • 181
4 votes
1 answer
88 views

Statistical Model and its Topology

Consider the following setup: let $(\Omega, \mathcal{B}, \mu)$ be a probability space and let us denote by $\mathcal{M}_{\mu}$ the set of all densities of all probability measures equivalent to $\mu$: $$ ...
Andyale • 181
1 vote
1 answer
39 views

Question about the Fisher information metric

Suppose we have two one-dimensional Gaussian probability distribution functions $f(x)$ and $g(x)$ with parameters $P=(\mu_1,\sigma_1)$ and $Q=(\mu_2,\sigma_2)$. We know that we can give the ...
Andyale • 181
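A minimal numerical sketch related to the question above, assuming the standard identification of the univariate Gaussian family with the hyperbolic upper half-plane (Fisher metric $ds^2 = (d\mu^2 + 2\,d\sigma^2)/\sigma^2$), under which the Fisher-Rao geodesic distance has a closed form; the function name and example values are illustrative only:

```python
import numpy as np

def fisher_rao_gaussian_1d(mu1, sigma1, mu2, sigma2):
    """Closed-form Fisher-Rao geodesic distance between N(mu1, sigma1^2) and
    N(mu2, sigma2^2), using the identification of the Gaussian family with the
    hyperbolic upper half-plane (metric ds^2 = (dmu^2 + 2 dsigma^2) / sigma^2)."""
    num = (mu1 - mu2) ** 2 + 2.0 * (sigma1 - sigma2) ** 2
    return np.sqrt(2.0) * np.arccosh(1.0 + num / (4.0 * sigma1 * sigma2))

# Sanity check: for equal means the distance reduces to sqrt(2)*|ln(sigma2/sigma1)|.
print(fisher_rao_gaussian_1d(0.0, 1.0, 0.0, 2.0), np.sqrt(2.0) * np.log(2.0))
```

For equal variances and nearby means the formula reduces to $|\mu_1-\mu_2|/\sigma$ to first order, matching the Fisher metric directly.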
0 votes
0 answers
28 views

Reference: Hellinger distance as a geodesic distance

I consider a statistical manifold equipped with the Fisher Information Metric. I want to show that for the exponential family (with no additional constraint), the Hellinger distance coincides with the ...
Ramufasa
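For context on the question above, a standard computation relates the Hellinger distance (one common convention, $d_H^2(p,q)=\tfrac12\int(\sqrt p-\sqrt q)^2\,d\mu$, is assumed here) to the Fisher-Rao geodesic distance via the square-root embedding $p \mapsto 2\sqrt p$ onto a sphere of radius $2$ in $L^2(\mu)$:

$$ d_{FR}(p,q) = 2\arccos\!\int\!\sqrt{p\,q}\,d\mu, \qquad d_H^2(p,q) = 1-\int\!\sqrt{p\,q}\,d\mu = 1-\cos\!\Big(\tfrac{d_{FR}(p,q)}{2}\Big). $$

On this reading the Hellinger distance is the chordal counterpart of the Fisher-Rao (great-circle) distance, and the two agree only to second order near $p=q$.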
0 votes
0 answers
35 views

Maximizing the expectation in CE importance sampling

Suppose the following maximization: $$v_t = \arg\max_{v} E_{v_{t-1}} 1\{S(x) \geq \gamma\} \frac{f(x;u)}{f(x;v_{t-1})}\ln f(x;v) = \arg\max_{v} E_{v_{t-1}} 1\{S(x) \geq \gamma\} W(u;v_{t-1}) \ln f(x;v),$$ ...
entropy • 147
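As a hedged illustration of the update in the question above: for an exponential-family proposal the maximizer has a closed form as a weighted maximum-likelihood estimate. The sketch below assumes a univariate Gaussian proposal $f(x;v)$ with $v=(\text{mean},\text{std})$; the function name and the toy rare-event setup are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def ce_update(v_prev, u, S, gamma, n=10_000):
    """One cross-entropy importance-sampling update for a Gaussian proposal:
    the arg max of the weighted log-likelihood is the weighted MLE.
    Assumes at least one sample satisfies S(x) >= gamma."""
    mean_prev, std_prev = v_prev
    mean_u, std_u = u
    x = rng.normal(mean_prev, std_prev, size=n)          # sample from f(.; v_{t-1})
    hit = (S(x) >= gamma).astype(float)                   # indicator 1{S(x) >= gamma}
    # log of the likelihood ratio W = f(x; u) / f(x; v_{t-1})
    logw = (-(x - mean_u) ** 2 / (2 * std_u ** 2) - np.log(std_u)
            + (x - mean_prev) ** 2 / (2 * std_prev ** 2) + np.log(std_prev))
    w = hit * np.exp(logw)
    mean_new = np.sum(w * x) / np.sum(w)                  # weighted MLE for the mean
    std_new = np.sqrt(np.sum(w * (x - mean_new) ** 2) / np.sum(w))
    return mean_new, std_new

# Usage: tilt the proposal toward the rare event {x >= 3} under N(0, 1).
print(ce_update((0.0, 1.0), (0.0, 1.0), S=lambda x: x, gamma=3.0))
```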
1 vote
0 answers
47 views

Smoothness of the Fréchet Function on Riemannian Manifolds

Suppose $M$ is a compact Riemannian manifold and let $d$ be the induced distance function on $M$. Let $\mu$ be a probability measure on $M$ with continuous density. The Fréchet ...
Yueqi • 11
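For reference, the Fréchet function of $\mu$ referred to above is presumably the standard one (the factor $\tfrac12$ varies by author):

$$ F(p) = \frac{1}{2}\int_M d(p,q)^2 \, d\mu(q), \qquad p \in M, $$

and the question concerns the smoothness of $F$ when $M$ is compact and $\mu$ has a continuous density; the usual obstruction is that $d(p,\cdot)^2$ fails to be smooth on the cut locus of $p$.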
0 votes
1 answer
44 views

Links between sufficient statistics and Chentsov's characterization of the Fisher metric

I've been self-studying "Information Geometry" by Ay et al., fascinated by the connection between Geometry, Probability and even Statistics. The proofs are clear to me; nonetheless, even in ...
ehceb • 59
2 votes
0 answers
35 views

Strict linear independence of probability measures

Let $X$ be a measurable space and $P_1, \dotsc, P_K$ be probability measures on it. If $\mu$ is a reference $\sigma$-finite measure with respect to which all $P_k$ are absolutely continuous, this ...
Paweł Czyż • 3,320
0 votes
0 answers
24 views

Representation theory for symmetries of probability distribution functions

I would like to parameterize all the possible modifications to a probability density function. Is there a representation theory for this? Something along the lines of, these are all the linear ...
Alex • 161
0 votes
0 answers
48 views

q-Gaussian distribution PDF: normalization, mean, and covariance

I have been trying to understand the $q$-exponential family of distributions. This paper gives the definitions of the $q$-logarithm, $q$-exponential and $q$-Gaussian as follows: $$\log_{q}(u) = \frac{1}{1-...
user1168149
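As a hedged sketch of the definitions the question above refers to (standard Tsallis conventions are assumed; the cited paper may differ in constants), together with a numerical normalization check on a grid:

```python
import numpy as np

# Standard Tsallis conventions (assumed): q-logarithm, q-exponential, and an
# unnormalised q-Gaussian density exp_q(-beta * x^2).
def log_q(u, q):
    return np.log(u) if q == 1 else (u ** (1 - q) - 1) / (1 - q)

def exp_q(u, q):
    if q == 1:
        return np.exp(u)
    return np.maximum(1 + (1 - q) * u, 0.0) ** (1 / (1 - q))

# Numerical normalisation on a grid (for 1 < q < 3 the q-Gaussian is normalisable).
x = np.linspace(-30.0, 30.0, 6001)
dx = x[1] - x[0]
pdf = exp_q(-x ** 2, q=1.5)                  # beta = 1
pdf /= pdf.sum() * dx                        # crude grid normalisation
print(pdf.sum() * dx, (x * pdf).sum() * dx)  # ~1.0 and ~0.0 (mean, by symmetry)
```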
1 vote
0 answers
28 views

KL Divergence on the Sphere

Given two empirical measures $\mu$ and $\nu$, restricted to the $(n-1)$-dimensional unit sphere $\mathcal{S}^{n-1}$, is there a closed form expression for the (discrete / sample) KL divergence between ...
Tolga Birdal
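One sample-based approach to the question above (not a closed form; the Voronoi-style binning, bin count, and additive smoothing below are all assumptions for illustration) is to histogram both point clouds on a shared partition of the sphere and take the discrete KL divergence:

```python
import numpy as np

def discrete_kl_on_sphere(X, Y, n_bins=32, eps=1e-6, seed=0):
    """Discrete KL divergence between two samples on the unit sphere,
    via a shared nearest-anchor (Voronoi-style) partition with smoothing."""
    rng = np.random.default_rng(seed)
    anchors = rng.normal(size=(n_bins, X.shape[1]))
    anchors /= np.linalg.norm(anchors, axis=1, keepdims=True)

    def hist(Z):
        labels = np.argmax(Z @ anchors.T, axis=1)        # nearest anchor bin
        p = np.bincount(labels, minlength=n_bins) + eps  # smoothed counts
        return p / p.sum()

    p, q = hist(X), hist(Y)
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3)); X /= np.linalg.norm(X, axis=1, keepdims=True)
Y = rng.normal(size=(1000, 3)) + 0.3; Y /= np.linalg.norm(Y, axis=1, keepdims=True)
print(discrete_kl_on_sphere(X, Y))
```

The result depends on the partition and the smoothing constant, so it is best read as an estimator rather than a canonical quantity.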
0 votes
0 answers
64 views

Second-order derivative of a contrast function gives a Riemannian metric

Let $M$ be a $C^{\infty}$ manifold. We can define a contrast function $\rho:M\times M\to \mathbb{R}$ that satisfies $\rho(x,y)\geq 0$ for all $x,y\in M$, with equality iff $x=y$. In Eguchi's 1991 paper ...
Lily • 1
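For reference, Eguchi's construction alluded to above assigns to a contrast function $\rho$ a Riemannian metric and a pair of dual affine connections; writing $\partial_i$ and $\partial'_j$ for derivatives in the first and second argument respectively,

$$ g_{ij}(x) = -\,\partial_i\,\partial'_j\,\rho(x,y)\big|_{y=x} = \partial_i\,\partial_j\,\rho(x,y)\big|_{y=x}, \qquad \Gamma_{ij,k}(x) = -\,\partial_i\,\partial_j\,\partial'_k\,\rho(x,y)\big|_{y=x}. $$

The second equality for $g_{ij}$ uses that the first derivatives of $\rho$ vanish on the diagonal, since $\rho\ge 0$ with equality exactly at $x=y$.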
0 votes
0 answers
49 views

Computational Formula for Geodesic Distance on a Statistical Manifold

I am working on a problem where I need to compute the geodesic distance between two points on a statistical manifold. Suppose we have two datasets $X$ and $Y$; I map them to PDFs using the kernel ...
Andyale • 181
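A rough numerical sketch in the spirit of the question above (one possible choice, not the only one): fit a Gaussian KDE to each dataset, then use the square-root embedding of densities, under which the geodesic distance on the full density sphere has the closed form $2\arccos\int\sqrt{p\,q}$. This is the non-parametric Fisher-Rao distance, not the geodesic of a particular parametric submanifold; the grid, padding, and default KDE bandwidth below are assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

def fisher_rao_sphere_distance(X, Y, grid_size=4000):
    """Fisher-Rao distance on the sphere of square-root densities, with each
    density estimated by a Gaussian KDE and integrated on a uniform grid."""
    p_kde, q_kde = gaussian_kde(X), gaussian_kde(Y)
    lo, hi = min(X.min(), Y.min()) - 3.0, max(X.max(), Y.max()) + 3.0
    t = np.linspace(lo, hi, grid_size)
    dt = t[1] - t[0]
    p, q = p_kde(t), q_kde(t)
    bc = np.sum(np.sqrt(p * q)) * dt          # Bhattacharyya coefficient
    return 2.0 * np.arccos(np.clip(bc, -1.0, 1.0))

rng = np.random.default_rng(0)
X, Y = rng.normal(0.0, 1.0, 500), rng.normal(1.0, 2.0, 500)
print(fisher_rao_sphere_distance(X, Y))
```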
0 votes
0 answers
108 views

Intuition for Wasserstein and information geometry geodesics

Two important geometries on the space of multivariate Gaussian distributions are given by the Wasserstein distance and by the Fisher metric (i.e., information geometry). Although there'...
dherrera • 160
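One concrete way to see the contrast raised above, restricted for simplicity to univariate Gaussians parametrised by $(\mu,\sigma)$ in the upper half-plane:

$$ ds^2_{W_2} = d\mu^2 + d\sigma^2, \qquad ds^2_{\mathrm{Fisher}} = \frac{d\mu^2 + 2\,d\sigma^2}{\sigma^2}. $$

The Wasserstein geometry is flat, so its geodesics are straight segments in $(\mu,\sigma)$, while the Fisher geometry is hyperbolic, so its geodesics bend away from $\sigma=0$ (vertical lines and half-ellipses); this is why Fisher-Rao interpolation between two distant Gaussians of equal variance passes through intermediate distributions with larger variance.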
