29
$\begingroup$

I am trying to understand the link between the moment-generating function and characteristic function. The moment-generating function is defined as: $$ M_X(t) = E(\exp(tX)) = 1 + \frac{t E(X)}{1} + \frac{t^2 E(X^2)}{2!} + \dots + \frac{t^n E(X^n)}{n!} + \dots $$

Using the series expansion $\exp(tX) = \sum_{n=0}^{\infty} \frac{(tX)^n}{n!}$, I can find all the moments of the distribution of the random variable $X$.

The characteristic function is defined as: $$ \varphi_X(t) = E(\exp(itX)) = 1 + \frac{it E(X)}{1} - \frac{t^2 E(X^2)}{2!} + \ldots + \frac{(it)^n E(X^n)}{n!} + \ldots $$

I don't fully understand what additional information the imaginary number $i$ gives me. I see that $i^2 = -1$, so the characteristic function no longer has only $+$ signs, but why do we want to subtract moments in the characteristic function? What is the mathematical idea behind it?
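To see the two functions side by side numerically, here is a quick Monte Carlo sketch in Python (NumPy), taking $X \sim N(0,1)$ as an illustrative assumption of mine; for the standard normal the closed forms are $M_X(t) = e^{t^2/2}$ and $\varphi_X(t) = e^{-t^2/2}$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)  # samples of X ~ N(0, 1) (illustrative choice)

t = 0.5
mgf_est = np.mean(np.exp(t * x))      # E[exp(tX)],  closed form exp(+t**2 / 2)
cf_est = np.mean(np.exp(1j * t * x))  # E[exp(itX)], closed form exp(-t**2 / 2)

print(mgf_est, np.exp(t**2 / 2))
print(cf_est, np.exp(-t**2 / 2))
```

The same samples give both estimates; only the factor $i$ in the exponent differs, which is what produces the alternating signs in the series.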

$\endgroup$
6
  • 9
    $\begingroup$ One important point is that the moment-generating function is not always finite! (See this question, for example.) If you want to build a general theory, say, about convergence in distribution, you'd like to be able to have it work with as many objects as possible. The characteristic function is, of course, finite for any random variable since $|\exp(itX)| \leq 1$. $\endgroup$
    – cardinal
    Commented Nov 22, 2012 at 15:59
  • 1
    $\begingroup$ The similarities in the Taylor expansions still allow one to read off the moments, when they exist, but note that not all distributions have moments, so the interest in these functions goes far beyond this! :) $\endgroup$
    – cardinal
    Commented Nov 22, 2012 at 15:59
  • 8
    $\begingroup$ Another point to note is that the MGF is the Laplace transformation of a random variable and the CF is the Fourier transform. There are fundamental relationships between these integral transforms, see here. $\endgroup$ Commented Nov 22, 2012 at 17:52
  • $\begingroup$ I thought the CF is the inverse Fourier transform (and not the Fourier transform) of a probability distribution? $\endgroup$
    – Giuseppe
    Commented Nov 23, 2012 at 10:04
  • 1
    $\begingroup$ The distinction is only a matter of sign in the exponent, and possibly a multiplicative constant. $\endgroup$
    – Glen_b
    Commented Mar 13, 2013 at 14:28

2 Answers

18
$\begingroup$

As mentioned in the comments, the characteristic function always exists, because it only requires integrating a function of modulus $1$. The moment-generating function, however, need not exist, since in particular it requires the existence of moments of all orders.

When we know that $e^{tX}$ is integrable for all real $t$, we can define $g(z) := E[e^{zX}]$ for every complex number $z$. Then we notice that $M_X(t) = g(t)$ and $\varphi_X(t) = g(it)$.
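A small sketch of this in Python, again assuming $X \sim N(0,1)$ (my choice, not part of the answer): here $g(z) = E[e^{zX}] = e^{z^2/2}$ extends to all complex $z$, and evaluating it on the real and imaginary axes recovers the MGF and the CF respectively.

```python
import numpy as np

def g(z):
    """E[exp(z*X)] for X ~ N(0,1): closed form exp(z**2 / 2), valid for complex z."""
    return np.exp(z**2 / 2)

t = 0.7
print(g(t))       # M_X(t)   = exp(+t**2 / 2), real axis
print(g(1j * t))  # phi_X(t) = exp(-t**2 / 2), imaginary axis: z = it
```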

$\endgroup$
1
  • 1
    $\begingroup$ +1. But you don't need such a strong assumption. It suffices that $e^{tX}$ be integrable in a neighborhood of $0.$ Some simple estimates and basic theorems of Complex Analysis take care of the rest. $\endgroup$
    – whuber
    Commented Feb 27, 2023 at 15:39
0
$\begingroup$

From Proakis, Digital Communications, 5th ed., the straightforward relationship is

$$ \varphi_X(\omega) = M_X(j\omega) $$

and

$$ M_X(t) = \varphi_X(-jt) $$
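As a quick sanity check of these identities (my own example, not from Proakis), take $X \sim \text{Exponential}(1)$, where $M_X(t) = 1/(1-t)$ for $t < 1$ and $\varphi_X(\omega) = 1/(1 - j\omega)$:

```python
def M(t):
    # MGF of X ~ Exponential(1): 1 / (1 - t); also accepts complex arguments
    return 1 / (1 - t)

def phi(w):
    # Characteristic function of X ~ Exponential(1)
    return 1 / (1 - 1j * w)

w = 0.3
print(M(1j * w), phi(w))       # phi(w) = M(jw): both 1 / (1 - 0.3j)
print(phi(-1j * 0.4), M(0.4))  # M(t) = phi(-jt): both 1 / 0.6
```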

$\endgroup$
1
  • 1
    $\begingroup$ Hi Rubem. Welcome to CV. Just wondering how your answer adds anything new, if you could elaborate that perhaps. $\endgroup$ Commented Feb 27, 2023 at 15:19
