55
$\begingroup$

Suppose $X$ is a random variable that follows the standard normal distribution. How is $KX$ (where $K$ is a constant) defined, and why does it follow a normal distribution with mean $0$ and variance $K^2$? Thank you.

$\endgroup$
1
  • 6
    $\begingroup$ I see you are a new member of MSE. A hearty welcome! Please remember to accept (if they are acceptable to you) the answers to your previous questions; this encourages MSE members to keep helping you. $\endgroup$ Commented Jan 11, 2013 at 6:08

5 Answers

50
$\begingroup$

For a random variable $X$ with finite first and second moments (i.e., its expectation and variance exist), it holds that $\forall c \in \mathbb{R}: E[c \cdot X] = c \cdot E[X]$ and $\mathrm{Var}[c \cdot X] = c^2 \cdot \mathrm{Var}[X]$.

However, the fact that $c \cdot X$ follows the same family of distributions as $X$ is not trivial and has to be shown separately. (It is not true, e.g., for the Beta distribution, which is also in the exponential family.) You can see it if you look at the characteristic function of the product $c \cdot X$: $\exp\{i\mu c t - \frac{1}{2} \sigma^2 c^2 t^2\}$, which is the characteristic function of a normal distribution with $\mu' = c \cdot \mu$ and $\sigma' = |c| \cdot \sigma$.
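To complement the characteristic-function argument, here is a minimal numerical sketch (not part of the original answer; the parameter values and sample size are arbitrary choices) comparing the empirical characteristic function of $c \cdot X$ with $\exp\{i\mu c t - \frac{1}{2}\sigma^2 c^2 t^2\}$:

```python
import numpy as np

# Compare the empirical characteristic function of c*X with the claimed
# closed form exp(i*mu*c*t - 0.5*sigma^2*c^2*t^2).
rng = np.random.default_rng(42)
mu, sigma, c = 1.0, 2.0, 3.0
x = rng.normal(mu, sigma, size=200_000)

t = np.linspace(-1.0, 1.0, 5)
empirical = np.exp(1j * np.outer(t, c * x)).mean(axis=1)   # estimate of E[exp(i*t*cX)]
theoretical = np.exp(1j * mu * c * t - 0.5 * sigma**2 * c**2 * t**2)
print(np.max(np.abs(empirical - theoretical)))  # small (Monte Carlo error only)
```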

$\endgroup$
3
  • 24
    $\begingroup$ The first paragraph is a perfectly correct answer (except for a missing $E$ in $E[c\cdot X]=c\cdot E[X]$) but I am not sure I understand the issue in the second paragraph. If $X$ is a continuous random variable with probability density function $f_X(x)$, then, for $c\neq 0$, so is $c\cdot X$ a continuous random variable with probability density function $$f_{c\cdot X}(a)=\frac{1}{|c|}f_X\left(\frac{a}{c}\right)$$ which, if not belonging to the same nuclear family of distributions, is at least a kissing cousin of the family, and of course, $c\cdot X$ is normal if $X$ is normal. $\endgroup$ Commented Jan 11, 2013 at 14:38
  • $\begingroup$ Thanks for pointing out the missing "$E$". $\endgroup$ Commented Jan 11, 2013 at 15:34
  • $\begingroup$ Is there a standard source I can cite for the displayed equation in Dilip's comment? $\endgroup$
    – ben
    Commented Jul 24, 2021 at 17:16
23
$\begingroup$

Another way of characterizing a random variable's distribution is by its distribution function: if two random variables have the same distribution function, then they have the same distribution (they are equal in distribution).

In our case, let $X \sim N(\mu,\sigma^2)$, set $Y = cX$ with $c > 0$, and call $F$ the distribution function of $X$ and $G$ the distribution function of $Y$. Then:

$G(y) = P[Y \le y] = P[cX \le y] = P\Big[X \le \frac yc\Big] = F\Big(\frac yc\Big)$

Now we differentiate and we get:

$g(y) = f(\frac yc) \frac1c$

where $g$ is the density function for $Y$ and $f$ is the density function for $X$. Then we just try to express this as a normal density:

$g(y) = f\Big(\frac yc\Big) \frac1c = \frac{1}{\sqrt{2\pi}\,\sigma} e^{\frac{-(y/c-\mu)^2}{2\sigma^2}} \cdot \frac1c = \frac{1}{\sqrt{2\pi}(c\sigma)} e^{\frac{-(y-c\mu)^2}{2(c\sigma)^2}}$

But this last expression is the density of a $N(c\mu,(c\sigma)^2)$ random variable.

This is a more detailed formulation of what Dilip Sarwate pointed out in the comments above.

The case $c < 0$

$G(y) = P[Y \le y] = P[cX \le y] = P\Big[X \ge \frac yc\Big] = 1 - F\Big(\frac yc\Big)$

differentiating:

$g(y) = -f(\frac yc) \frac1c = f(\frac yc) \frac{1}{|c|} = \frac{1}{\sqrt{2\pi}(|c|\sigma)} e^{\frac{-(y-c\mu)^2}{2(c\sigma)^2}}$

Note that this does not pose difficulties since $\sqrt{(c \sigma)^2} = |c| \sigma$.
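For readers who want to check the algebra, here is a small sketch (an addition to this answer, with illustrative parameter values) verifying that $g(y) = f(y/c)\,\frac{1}{|c|}$ matches the $N(c\mu,(c\sigma)^2)$ density, using a negative $c$:

```python
import numpy as np
from scipy import stats

# Verify that g(y) = f(y/c) / |c| equals the N(c*mu, (c*sigma)^2) density
# for a negative scaling constant c.
mu, sigma, c = 2.0, 1.5, -0.7
y = np.linspace(-6.0, 4.0, 9)

lhs = stats.norm.pdf(y, loc=c * mu, scale=abs(c) * sigma)  # N(c*mu, (c*sigma)^2)
rhs = stats.norm.pdf(y / c, loc=mu, scale=sigma) / abs(c)  # f(y/c) / |c|
print(np.allclose(lhs, rhs))  # True
```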

$\endgroup$
3
  • 6
    $\begingroup$ derive -> differentiate $\endgroup$ Commented Dec 21, 2019 at 1:07
  • 2
    $\begingroup$ But what would happen if c can be anything, not only positive? $\endgroup$
    – Karina
    Commented Dec 8, 2020 at 17:49
  • $\begingroup$ @SAT I updated my answer to cover $c < 0$ case. The thing to realize is $\sqrt{(c \sigma)^2} = |c| \sigma$, thanks for your comment $\endgroup$ Commented Jan 22, 2021 at 8:06
10
$\begingroup$

I'll try to present it in a way that is relatively intuitive but still maintains some mathematical rigor.

Let $Y = kX$, where $X \sim N(\mu,\sigma^2)$.

Now, for a positive integer $k$, we can see $Y = X + X + \cdots + X$ ($k$ times). Thus, we can get the expected value of $Y$ and the variance of $Y$ using linearity.

$E(Y) = E(X + X + \cdots + X) = E(X) + E(X) + \cdots + E(X) = kE(X) = k\mu$ (using linearity of expectation).

So, we can get $Var(Y) = Var(kX) = E((kX)^2) - (E(kX))^2$ (by the definition of variance).

So, $Var(Y) = E(k^2X^2) - (E(kX))^2 = k^2E(X^2) - (k \cdot E(X))^2$ (using the result proved above for $E(kX)$).

Rewriting, $Var(Y)= k^2E(X^2)-k^2(E(X))^2=k^2(E(X^2)-(E(X))^2)=k^2Var(X)$

So, we now have $E(Y)$ and $Var(Y)$. We also know that a sum of jointly normal variables is normally distributed; the $k$ summands here are all the same variable $X$, hence trivially jointly normal, so $Y$ must be normal.

So, basically, you now have both the mean and the variance of a normally distributed variable, which tells you the distribution.

$Y \sim N(k\mu, k^2\sigma^2)$, where $\mu = E(X)$ and $\sigma^2 = Var(X)$.

In your case, $\mu=0$. Hope it helps!
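As a supplement (not from the original answer), the variance computation above can be replayed symbolically; the symbols `m1` and `m2` below are hypothetical names standing in for $E(X)$ and $E(X^2)$:

```python
import sympy as sp

# Replay Var(kX) = E((kX)^2) - (E(kX))^2 symbolically,
# writing m1 for E(X) and m2 for E(X^2).
k, m1, m2 = sp.symbols('k m1 m2')

E_Y = k * m1             # E(kX) = k*E(X), by linearity
E_Y2 = k**2 * m2         # E((kX)^2) = k^2 * E(X^2)
var_Y = E_Y2 - E_Y**2    # definition of variance

print(sp.factor(var_Y))  # k**2*(m2 - m1**2), i.e. k^2 * Var(X)
```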

$\endgroup$
5
  • 1
    $\begingroup$ You are saying: "We also know that the sum of independent normally distributed variables is normally distributed" but surely X isn't independent of X (you're working with the sum of X's) which I think is exactly the kernel of this Q. $\endgroup$
    – Lola
    Commented Jun 9, 2017 at 16:08
  • 1
    $\begingroup$ While there are many proofs for the statement that the sum of 2 normally distributed random variables is a normal distribution (look up wikipedia for other proofs), the most intuitive one is using MGF's, ie moment generating functions. Here's a proof - onlinecourses.science.psu.edu/stat414/node/172 $\endgroup$ Commented Aug 15, 2018 at 3:22
  • $\begingroup$ Good point @Lola: Correction, sum of ANY two normal variables is normal. They do not have to be independent. Also, Linearity holds for dependent random variables as well. $\endgroup$ Commented Nov 1, 2019 at 6:36
  • $\begingroup$ @spandanmadan No that is false. Can find several counterexamples on wiki or stackexchange or just by googling. $\endgroup$
    – ScapeProf
    Commented Jun 2, 2022 at 8:11
  • 1
    $\begingroup$ $Y$ ~ $N(k\mu,k^2\sigma)$ should actually be $Y$ ~ $N(k\mu,k^2\sigma^2)$. I tried to edit, but I don't have enough reputation. $\endgroup$ Commented Mar 10, 2023 at 5:27
3
$\begingroup$

Use the definition of the expectation of a function of a random variable and the variance of a function of a random variable. If $g(X) = KX$, what are its mean and variance? This result is very general and is not unique to the normal distribution.
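Spelling out the hint (a worked step added here, not part of the original answer), the law of the unconscious statistician gives, for a continuous $X$ with density $f_X$:

$$E[KX] = \int_{-\infty}^{\infty} Kx\, f_X(x)\, dx = K \int_{-\infty}^{\infty} x\, f_X(x)\, dx = K\, E[X],$$

$$\mathrm{Var}[KX] = E\big[(KX - K E[X])^2\big] = K^2\, E\big[(X - E[X])^2\big] = K^2\, \mathrm{Var}[X].$$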

$\endgroup$
0
$\begingroup$

For a transformation $y = ax$, the pdf transforms as $f_Y(y) = f_X(x)\,|J|$, where $J$ is the Jacobian of the inverse map $x = x(y)$.

Take $f_X$ to be the standard normal density for simplicity:

$$f_X(x) = \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}x^2}$$

We have $|J| = \left|\frac{dx}{dy}\right| = \left|\frac{d}{dy}\left(\frac{y}{a}\right)\right| = \left|\frac{1}{a}\right|$. Also substituting $x = \frac{y}{a}$ in the pdf itself, we get

$$f_Y(y) = \left|\frac{1}{a}\right| \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}\left(\frac{y}{a}\right)^2},$$

which is the density of a $N(0, a^2)$ random variable.
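A quick simulation sketch of this transformation (an addition to this answer; the value of $a$ and the sample size are illustrative) can confirm that $aX$ is distributed as $N(0, a^2)$:

```python
import numpy as np
from scipy import stats

# Draw standard normal samples, scale by a, and test the result against
# N(0, a^2) with a Kolmogorov-Smirnov test.
rng = np.random.default_rng(7)
a = -2.5
y = a * rng.standard_normal(100_000)

stat, pvalue = stats.kstest(y, 'norm', args=(0, abs(a)))
print(f"KS statistic = {stat:.4f}, p-value = {pvalue:.3f}")  # large p-value expected
```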

$\endgroup$
