
I am learning about different types of functions based on Probability Distributions.

Here is my summary of the most common ones (short sympy sketches checking each pair appear after the list):

  1. Probability Density Function (PDF) and Cumulative Distribution Function (CDF): $$ F(x) = \int_{-\infty}^{x} f(t)\, dt \quad \text{(CDF from PDF)} $$ $$ f(x) = \frac{d}{dx}F(x) \quad \text{(PDF from CDF)} $$

  2. Probability Density Function and Moment Generating Function (Laplace and inverse Laplace): $$ M(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx \quad \text{(MGF from PDF)} $$ $$ f(x) = \frac{1}{2\pi i} \int_{c-i\infty}^{c+i\infty} e^{-sx} M(s)\, ds \quad \text{(PDF from MGF via inverse Laplace)} $$ (Note the $e^{-sx}$ kernel: $M(s)$ is the bilateral Laplace transform of $f$ evaluated at $-s$, so the sign in the inversion kernel flips as well.)

    $$ F(s) = \int_{0}^{\infty} e^{-st} f(t)\, dt \quad \text{(Laplace Transform)} $$ $$ f(t) = \frac{1}{2\pi i} \int_{c-i\infty}^{c+i\infty} e^{st} F(s)\, ds \quad \text{(Inverse Laplace Transform)} $$

  3. Probability Density Function and Characteristic Function (Fourier and inverse Fourier): $$ \phi(t) = E[e^{itX}] = \int_{-\infty}^{\infty} e^{itx} f(x)\, dx \quad \text{(Characteristic Function from PDF)} $$ $$ f(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-itx} \phi(t)\, dt \quad \text{(PDF from Characteristic Function via Inverse Fourier)} $$

    $$ F(\omega) = \int_{-\infty}^{\infty} e^{-i\omega t} f(t)\, dt \quad \text{(Fourier Transform)} $$ $$ f(t) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{i\omega t} F(\omega)\, d\omega \quad \text{(Inverse Fourier Transform)} $$
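To double-check pair 1, here is a small sympy sketch I put together (the exponential distribution is just my pick for a concrete example):

```python
import sympy as sp

x, t = sp.symbols('x t', positive=True)
lam = sp.symbols('lambda', positive=True)

# Exponential PDF on [0, oo): f(t) = lambda * exp(-lambda*t)
f = lam * sp.exp(-lam * t)

# CDF from PDF: F(x) = integral of f(t) from 0 to x
F = sp.integrate(f, (t, 0, x))
print(sp.simplify(F))              # 1 - exp(-lambda*x)

# PDF from CDF: f(x) = F'(x)
print(sp.simplify(sp.diff(F, x)))  # lambda*exp(-lambda*x)
```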
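A similar sketch for pair 2, on the same exponential density. Since the MGF is the Laplace transform evaluated at $-s$ for a non-negative random variable, checking sympy's Laplace pair is enough here:

```python
import sympy as sp

t, s = sp.symbols('t s', positive=True)
lam = sp.symbols('lambda', positive=True)

f = lam * sp.exp(-lam * t)  # exponential PDF, support [0, oo)

# Laplace transform: F(s) = integral_0^oo e^{-s*t} f(t) dt
F = sp.laplace_transform(f, t, s, noconds=True)
print(F)  # lambda/(lambda + s); note M(t) = E[e^{tX}] = F(-t) = lambda/(lambda - t)

# Inverse Laplace recovers the density (times a Heaviside step on [0, oo))
print(sp.inverse_laplace_transform(F, s, t))  # lambda*exp(-lambda*t)*Heaviside(t)
```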
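Finally, a sketch for pair 3 with the standard normal. sympy's built-in `fourier_transform` uses a $2\pi$-in-the-exponent convention, so I integrate directly to match the characteristic-function convention above:

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)

# Standard normal PDF
f = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)

# Characteristic function: phi(t) = integral of e^{i*t*x} f(x) dx
phi = sp.simplify(sp.integrate(sp.exp(sp.I * t * x) * f, (x, -sp.oo, sp.oo)))
print(phi)  # exp(-t**2/2)

# Inverse Fourier recovers the PDF
f_back = sp.integrate(sp.exp(-sp.I * t * x) * phi, (t, -sp.oo, sp.oo)) / (2 * sp.pi)
print(sp.simplify(f_back - f))  # 0
```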

This leads me to my question. Suppose we have a Probability Mass Function (PMF) for a discrete random variable; the Probability Generating Function (PGF) is defined as:

$$ G_X(s) = E(s^X) = \sum_{x=0}^{\infty} s^x P(X = x) $$
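For a concrete example (a small sympy sketch of my own; the fair six-sided die is just an illustrative choice), the PGF is simply the probability-weighted power series in $s$:

```python
import sympy as sp

s = sp.symbols('s')

# Fair six-sided die: P(X = x) = 1/6 for x = 1, ..., 6
G = sum(sp.Rational(1, 6) * s**x for x in range(1, 7))
print(sp.expand(G))  # s/6 + s**2/6 + ... + s**6/6

# Sanity checks: G(1) = 1 (total probability) and G'(1) = E[X] = 7/2
print(G.subs(s, 1))              # 1
print(sp.diff(G, s).subs(s, 1))  # 7/2
```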

Now, suppose we only have the Probability Generating Function: is it possible to recover the original Probability Mass Function? I have seen this formula used before:

$$ P(X=k) = \frac{1}{k!} \frac{d^k G_X(s)}{ds^k} \Bigg|_{s=0} $$

But I am not sure if it is correct...

UPDATE:

Note 1: Based on Henry's comments, I tried to follow the logic:

  • The PGF $ G_X(s) $ of a discrete random variable $ X $ is defined as: $$ G_X(s) = \mathbb{E}[s^X] = \sum_{k=0}^{\infty} P(X = k) s^k $$

  • Let's assume that this formula is correct: $$ P(X = k) = \frac{1}{k!} \frac{d^k G_X(s)}{ds^k} \Bigg|_{s=0} $$

  • $G_X(s) = \sum_{k=0}^{\infty} P(X = k) s^k$ is a power series where the coefficient of $ s^k $ is $ P(X = k) $.

  • The $ k $-th derivative of $ G_X(s) $ with respect to $ s $ (using $n$ as the summation index, to avoid clashing with the fixed $k$) is: $$ \frac{d^k G_X(s)}{ds^k} = \frac{d^k}{ds^k} \left( \sum_{n=0}^{\infty} P(X = n) s^n \right) = \sum_{n=k}^{\infty} \frac{n!}{(n-k)!} P(X = n)\, s^{n-k} $$

  • Evaluating at $ s = 0 $: every term with $ n > k $ still carries a positive power of $s$ and vanishes, while the terms with $ n < k $ were already differentiated away, so only the $ n = k $ term survives: $$\frac{d^k G_X(s)}{ds^k} \Bigg|_{s=0} = k! P(X = k)$$

  • In the end, dividing by $ k! $ isolates $ P(X = k) $: $$ P(X = k) = \frac{1}{k!} \frac{d^k G_X(s)}{ds^k} \Bigg|_{s=0} $$

  • Note 2: I tried to make an example to verify this logic (a sympy check of the example appears after this list).

  • Suppose $ X $ follows a Poisson distribution with parameter $ \lambda $. The PMF and PGF for a Poisson random variable $ X $ are:

$$ P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!} $$ $$ G_X(s) = e^{\lambda(s-1)} $$

  • Take the $ k $-th derivative: $$ \frac{d^k}{ds^k} G_X(s) = \frac{d^k}{ds^k} e^{\lambda(s-1)} = \lambda^k e^{\lambda(s-1)} $$

  • Evaluate at $ s = 0 $: $$ \frac{d^k G_X(s)}{ds^k} \Bigg|_{s=0} = \lambda^k e^{-\lambda} $$

  • Divide by $ k! $ to isolate the probability: $$ P(X = k) = \frac{1}{k!} \lambda^k e^{-\lambda} $$

  • This gives us back the PMF of a Poisson distribution: $$ P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!} $$
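Here is the sympy check of the Poisson example promised above (done for one fixed $k$; any non-negative integer works the same way):

```python
import sympy as sp

s = sp.symbols('s')
lam = sp.symbols('lambda', positive=True)
k = 3  # any fixed non-negative integer

G = sp.exp(lam * (s - 1))  # Poisson PGF

# P(X = k) = G^{(k)}(0) / k!
pk = sp.diff(G, s, k).subs(s, 0) / sp.factorial(k)
print(sp.simplify(pk))  # lambda**3*exp(-lambda)/6

# Matches the Poisson PMF lambda**k * exp(-lambda) / k!
print(sp.simplify(pk - lam**k * sp.exp(-lam) / sp.factorial(k)))  # 0
```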

  • Your last expression looks like a natural consequence of your penultimate expression, at least when the support of $X$ is (a subset of) the non-negative integers. All the other terms of $G_X(s)$ either are differentiated away, or become $0$ when evaluating at $s=0$.
    – Henry
    Commented Jul 7 at 2:20
  • @Henry: thank you so much for the hint! I am going to try some manipulations. Can you please show a few steps of the differentiation yourself? I am trying to better understand the relationship between these last two formulas. Thank you so much!
    – konofoso
    Commented Jul 7 at 2:31
  • @Henry: please see the update. Thanks!
    – konofoso
    Commented Jul 7 at 4:18

1 Answer


Let ${p}_{x} = \mathbb {P} \left[ X = x \right]$. Then

$$\mathcal {G}_{X} \left( s \right) = \sum_{x \ge 0} {p}_{x} \, {s}^{x} = {p}_{0} + {p}_{1} \, s + {p}_{2} \, {s}^{2} + {p}_{3} \, {s}^{3} + {p}_{4} \, {s}^{4} + {p}_{5} \, {s}^{5} + \cdots$$

Differentiation yields

$$\begin{align} \mathcal {G}_{X}^{\left( 1 \right)} \left( s \right) & = {p}_{1} + 2 {p}_{2} \, s + 3 {p}_{3} \, {s}^{2} + 4 {p}_{4} \, {s}^{3} + 5 {p}_{5} \, {s}^{4} + \cdots \\ \mathcal {G}_{X}^{\left( 2 \right)} \left( s \right) & = 2 {p}_{2} + 6 {p}_{3} \, s + 12 {p}_{4} \, {s}^{2} + 20 {p}_{5} \, {s}^{3} + \cdots \\ \mathcal {G}_{X}^{\left( 3 \right)} \left( s \right) & = 6 {p}_{3} + 24 {p}_{4} \, s + 60 {p}_{5} \, {s}^{2} + \cdots \\ \mathcal {G}_{X}^{\left( 4 \right)} \left( s \right) & = 24 {p}_{4} + 120 {p}_{5} \, s + \cdots \\ \mathcal {G}_{X}^{\left( 5 \right)} \left( s \right) & = 120 {p}_{5} + \cdots \end{align}$$

So

$$\mathcal {G}_{X}^{\left( x \right)} \left( 0 \right) = x! \cdot {p}_{x}.$$

Accordingly,

$$\boxed {{p}_{x} = \frac {\mathcal {G}_{X}^{\left( x \right)} \left( 0 \right)}{x!}.}$$

(Compare this formula with the Maclaurin series.)
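As a quick sympy check of the boxed formula, here is a sketch using a geometric distribution (my own choice of second example, counting failures before the first success, so $p_k = p(1-p)^k$):

```python
import sympy as sp

s = sp.symbols('s')
p = sp.symbols('p', positive=True)

# Geometric distribution: P(X = k) = p*(1 - p)**k,
# with PGF G(s) = p / (1 - (1 - p)*s)
G = p / (1 - (1 - p) * s)

# p_x = G^{(x)}(0) / x! should reproduce the PMF
for x in range(4):
    px = sp.diff(G, s, x).subs(s, 0) / sp.factorial(x)
    assert sp.simplify(px - p * (1 - p)**x) == 0

print("G^(x)(0)/x! = p*(1-p)**x for x = 0, 1, 2, 3")
```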

  • @Simon: thank you so much for the answer!
    – konofoso
    Commented Jul 7 at 4:42
  • What do you think of the analysis I added in Note 1 and Note 2 to justify the formula? Does it look correct to you? Thank you so much for everything!
    – konofoso
    Commented Jul 7 at 4:42
  • @konofoso, both seem to be correct.
    – Simon
    Commented Jul 7 at 5:44
