I am learning about the transforms that relate the different functions associated with probability distributions.
Here is my summary of the most common ones:
Probability Density Function to Cumulative Distribution Function: $$ F(x) = \int_{-\infty}^{x} f(t) dt \quad \text{(CDF from PDF)} $$ $$ f(x) = \frac{d}{dx}F(x) \quad \text{(PDF from CDF)} $$
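As a quick sanity check of this pair, here is a minimal SymPy sketch; the exponential distribution with rate 1 is my own choice of example, not something from the summary above:

```python
import sympy as sp

x, t = sp.symbols('x t', positive=True)

# Hypothetical example: exponential distribution with rate 1
f = sp.exp(-t)                       # PDF f(t) = e^{-t} for t >= 0
F = sp.integrate(f, (t, 0, x))       # CDF from PDF: integrate over the support up to x
f_back = sp.diff(F, x)               # PDF from CDF by differentiation

print(sp.simplify(F))                # 1 - exp(-x)
print(sp.simplify(f_back))           # exp(-x), recovering the original PDF
```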
Probability Density Function to Moment Generating Function (Laplace and Inverse Laplace): $$ M(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f(x) dx \quad \text{(MGF from PDF)} $$ $$ f(x) = \frac{1}{2\pi i} \int_{c-i\infty}^{c+i\infty} e^{-sx} M(s) ds \quad \text{(PDF from MGF via Inverse Laplace; note the sign flip, since the MGF uses } e^{sx} \text{ rather than } e^{-sx}\text{)} $$
$$ F(s) = \int_{0}^{\infty} e^{-st} f(t) dt \quad \text{(Laplace Transform)} $$ $$ f(t) = \frac{1}{2\pi i} \int_{c-i\infty}^{c+i\infty} e^{st} F(s) ds \quad \text{(Inverse Laplace Transform)} $$
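The MGF-as-Laplace-transform connection can be checked in SymPy for a nonnegative random variable, where $M(t) = \mathcal{L}\{f\}(s)\big|_{s=-t}$. A sketch, using the exponential distribution with rate $\lambda$ as a hypothetical example:

```python
import sympy as sp

x, s, t, lam = sp.symbols('x s t lam', positive=True)

# Hypothetical example: exponential PDF with rate lam
f = lam * sp.exp(-lam * x)

# Laplace transform of the PDF; for a nonnegative X, M(t) = L{f}(s) at s = -t
L = sp.laplace_transform(f, x, s, noconds=True)
M = L.subs(s, -t)

print(sp.simplify(L))   # lam/(lam + s)
print(sp.simplify(M))   # lam/(lam - t), the known MGF of the exponential (for t < lam)
```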
Probability Density Function and Characteristic Function (Fourier and Inverse Fourier): $$ \phi(t) = E[e^{itX}] = \int_{-\infty}^{\infty} e^{itx} f(x) dx \quad \text{(Characteristic Function from PDF)} $$ $$ f(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-itx} \phi(t) dt \quad \text{(PDF from Characteristic Function via Inverse Fourier)} $$
$$ F(\omega) = \int_{-\infty}^{\infty} e^{-i\omega t} f(t) dt \quad \text{(Fourier Transform)} $$ $$ f(t) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{i\omega t} F(\omega) d\omega \quad \text{(Inverse Fourier Transform)} $$
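The inverse Fourier formula can be verified symbolically as well. A sketch, using the standard normal as my own choice of example; its characteristic function $e^{-t^2/2}$ is a standard fact:

```python
import sympy as sp

x = sp.symbols('x', real=True)
t = sp.symbols('t', real=True)

# Hypothetical example: characteristic function of the standard normal
phi = sp.exp(-t**2 / 2)

# PDF from the characteristic function via the inverse Fourier formula above
f = sp.integrate(sp.exp(-sp.I * t * x) * phi, (t, -sp.oo, sp.oo)) / (2 * sp.pi)
print(sp.simplify(f))   # exp(-x**2/2)/sqrt(2*pi), the standard normal PDF
```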
This leads me to my question: suppose we have a Probability Mass Function for a discrete random variable. Here is how we define the Probability Generating Function (PGF):
$$ G_X(s) = E(s^X) = \sum_{x=0}^{\infty} s^x P(X = x) $$
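As a concrete instance of this definition, here is a minimal sketch; Bernoulli($p$) is my own choice of example:

```python
import sympy as sp

s, p = sp.symbols('s p', positive=True)

# Hypothetical example: Bernoulli(p), with P(X=0) = 1-p and P(X=1) = p
pmf = {0: 1 - p, 1: p}

# PGF as the defining sum E[s^X] = sum_x s^x P(X = x)
G = sum(s**x * prob for x, prob in pmf.items())
print(sp.expand(G))   # p*s - p + 1, i.e. (1 - p) + p*s
```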
Now, suppose we only have the Probability Generating Function: is it possible to recover the original Probability Mass Function? I have seen this formula used before:
$$ P(X=k) = \frac{1}{k!} \frac{d^k G_X(s)}{ds^k} \Bigg|_{s=0} $$
But I am not sure if it is correct...
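One way to test the formula before deriving it is to apply it to a PGF whose PMF is already known. A SymPy sketch; Binomial($2, p$), whose PGF is $(1-p+ps)^2$, is my own choice of example:

```python
import sympy as sp

s, p = sp.symbols('s p', positive=True)

# Hypothetical example: Binomial(2, p), whose PGF is (1 - p + p*s)^2
G = (1 - p + p * s) ** 2

# Apply the candidate formula P(X = k) = (1/k!) * d^k G / ds^k |_{s=0}
pmf = [sp.expand(sp.diff(G, s, k).subs(s, 0) / sp.factorial(k)) for k in range(3)]

# Compare against the known Binomial(2, p) probabilities C(2, k) p^k (1-p)^(2-k)
for k in range(3):
    assert sp.simplify(pmf[k] - sp.binomial(2, k) * p**k * (1 - p)**(2 - k)) == 0
print(pmf)
```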
UPDATE:
Note 1: Based on Henry's comments, I tried to follow the logic:
The PGF $ G_X(s) $ of a discrete random variable $ X $ is defined as: $$ G_X(s) = \mathbb{E}[s^X] = \sum_{k=0}^{\infty} P(X = k) s^k $$
The claim to verify is: $$ P(X = k) = \frac{1}{k!} \frac{d^k G_X(s)}{ds^k} \Bigg|_{s=0} $$
$G_X(s) = \sum_{n=0}^{\infty} P(X = n) s^n$ is a power series in which the coefficient of $ s^n $ is $ P(X = n) $ (using $ n $ as the summation index so it stays distinct from the fixed $ k $ below).
The $ k $-th derivative of $ G_X(s) $ with respect to $ s $ is: $$ \frac{d^k G_X(s)}{ds^k} = \frac{d^k}{ds^k} \left( \sum_{n=0}^{\infty} P(X = n) s^n \right) = \sum_{n=k}^{\infty} \frac{n!}{(n-k)!} P(X = n) s^{n-k} $$
Evaluating the $ k $-th derivative at $ s = 0 $: the terms with $ n < k $ were already annihilated by differentiation, and the terms with $ n > k $ still carry a positive power of $ s $, so they vanish. Only the $ n = k $ term survives: $$\frac{d^k G_X(s)}{ds^k} \Bigg|_{s=0} = k! P(X = k)$$
In the end, dividing by $ k! $ isolates $ P(X = k) $: $$ P(X = k) = \frac{1}{k!} \frac{d^k G_X(s)}{ds^k} \Bigg|_{s=0} $$
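This "read off the Taylor coefficients" view is easy to see in SymPy. A sketch; $G(s) = 1/(2 - s)$, the PGF of a Geometric($1/2$) variable on $\{0, 1, 2, \dots\}$, is my own choice of example:

```python
import sympy as sp

s = sp.symbols('s')

# Hypothetical example: G(s) = 1/(2 - s) = sum_k (1/2)^(k+1) s^k, a Geometric(1/2) PGF
G = 1 / (2 - s)

# Its Maclaurin series: the coefficient of s^k is exactly P(X = k)
print(sp.series(G, s, 0, 4))   # 1/2 + s/4 + s**2/8 + s**3/16 + O(s**4)

# The derivative formula extracts the same coefficients
coeffs = [sp.diff(G, s, k).subs(s, 0) / sp.factorial(k) for k in range(4)]
print(coeffs)                  # [1/2, 1/4, 1/8, 1/16]
```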
Note 2: I worked through an example to verify this logic.
Suppose $ X $ follows a Poisson distribution with parameter $ \lambda $. The PMF and PGF for a Poisson random variable $ X $ are:
$$ P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!} $$ $$ G_X(s) = e^{\lambda(s-1)} $$
Take the $ k $-th derivative: $$ \frac{d^k}{ds^k} G_X(s) = \frac{d^k}{ds^k} e^{\lambda(s-1)} = \lambda^k e^{\lambda(s-1)} $$
Evaluate at $ s = 0 $: $$ \frac{d^k G_X(s)}{ds^k} \Bigg|_{s=0} = \lambda^k e^{-\lambda} $$
Dividing by $ k! $ isolates $ P(X = k) $: $$ P(X = k) = \frac{1}{k!} \lambda^k e^{-\lambda} $$
This gives us back the PMF of a Poisson distribution: $$ P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!} $$
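The same check can be automated in SymPy; this sketch just replays the calculation above for the first few values of $k$:

```python
import sympy as sp

s, lam = sp.symbols('s lam', positive=True)

# The Poisson PGF from the example above, G(s) = e^{lam (s - 1)}
G = sp.exp(lam * (s - 1))

# Verify P(X = k) = (1/k!) d^k G/ds^k |_{s=0} = lam^k e^{-lam} / k! for k = 0..4
for k in range(5):
    recovered = sp.diff(G, s, k).subs(s, 0) / sp.factorial(k)
    expected = lam**k * sp.exp(-lam) / sp.factorial(k)
    assert sp.simplify(recovered - expected) == 0
print("recovered the Poisson PMF for k = 0..4")
```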