
For questions relating to moment-generating functions (m.g.f.), which are a way to find moments such as the mean $(\mu)$ and the variance $(\sigma^2)$. Finding an m.g.f. for a discrete random variable involves a summation; for continuous random variables, integration is used.

A moment generating function (MGF) is a single expected value function whose derivatives produce each of the required moments.

Definition: Let $X$ be a random variable with probability mass function (discrete case) or probability density function (continuous case) $f(x)$ and support $S$. Then:

$$M_X(t) = E(e^{tX}) = \sum\limits_{x\in S} e^{tx} f(x) \quad \text{(discrete case)}$$

or

$$M_X(t) = E(e^{tX}) = \int_{x\in S} e^{tx} f(x) \, \mathrm{d}x \quad \text{(continuous case)}$$

is the MGF of $X$, as long as the summation (or integral) exists and is finite for all $t$ in some interval around $0$.

That is, $M(t)$ is the MGF of $X$ if there is a positive number $h$ such that the above sum (or integral) exists and is finite for $-h < t < h$.
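As a concrete illustration (the Bernoulli distribution is an example chosen here, not taken from the text above): if $X \sim \mathrm{Bernoulli}(p)$, so that $f(0) = 1-p$, $f(1) = p$, and $S = \{0, 1\}$, then

$$M_X(t) = \sum\limits_{x\in S} e^{tx} f(x) = (1-p)e^{0} + p\,e^{t} = 1 - p + p\,e^{t},$$

which exists and is finite for every real $t$, so in particular on any interval $-h < t < h$.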

Note: There are two main reasons why MGFs are so important.

  • the MGF of $X$ gives us all moments of $X$, via its derivatives at $t = 0$ (see the example below).
  • the MGF (if it exists) uniquely determines the distribution. That is, if two random variables have the same MGF, then they must have the same distribution.

Thus, if you find the MGF of a random variable, you have indeed determined its distribution.
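To see the first point in action, continuing the Bernoulli sketch above: in general the $n$-th moment is $E(X^n) = M_X^{(n)}(0)$, the $n$-th derivative of the MGF evaluated at $0$. Differentiating $M_X(t) = 1 - p + p\,e^{t}$ gives

$$M_X'(t) = p\,e^{t}, \qquad M_X''(t) = p\,e^{t},$$

so $E(X) = M_X'(0) = p$ and $E(X^2) = M_X''(0) = p$, hence $\sigma^2 = E(X^2) - [E(X)]^2 = p(1-p)$, which agrees with the familiar Bernoulli variance.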