
I am trying to sharpen my intuition on some random-walk style results.

Suppose we are looking at a random walk on $\mathbb{Z}$ starting at $0$. At the $k$th step, we move $k$ units to the left or $k$ units to the right, each equally likely. What is the expected absolute distance from the origin after $n$ steps?

Equivalently,

What is the expected absolute value of the sum $$ \sum_{k = 1}^n \hat k \tag{1}$$ where $\hat k$ is a random variable that takes the value $k$ or $-k$, each with probability $1/2$?

I am familiar with the idea that the sum $\sum_{k = 1}^n \widehat{1}$, by which I mean a sum of $n$ random variables each equal to $1$ or $-1$ with probability $1/2$, has expected magnitude about $\sqrt n$ (up to a constant factor of $\sqrt{2/\pi}$). This sort of square root cancellation comes up in my work frequently.

More recently, randomly signed sums of nonconstant terms have become prevalent in my work, and I'm trying to get an intuitive grasp on how they behave. With respect to $(1)$ above: if the terms were all positive, we would get something like $n^2$. If we had mere square root cancellation, we would get something like $n^{3/2}$. But I suspect (without much foundation; a gut instinct, I suppose) that we should have more than square-root cancellation, although I do not quite know what to expect.
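For concreteness, a minimal simulation sketch of $(1)$ (numpy; the trial counts are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_abs_sum(n, trials=20_000):
    """Monte Carlo estimate of E|S_n| with S_n = sum_{k=1}^n (+-k), fair independent signs."""
    signs = rng.choice([-1, 1], size=(trials, n))
    sums = signs @ np.arange(1, n + 1)
    return np.abs(sums).mean()

for n in [10, 100, 1000]:
    est = mean_abs_sum(n)
    # if E|S_n| ~ c * n^(3/2), this ratio should stabilize near the constant c
    print(n, est, est / n**1.5)
```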

8 Comments
  • Actually, numerical simulation suggests mere square root cancellation perhaps. That does not please my intuition. So I suspect this exercise really will sharpen my intuition.
    – davidlowryduda
    Commented Sep 24, 2015 at 1:18
  • When you say "distance from the origin", do you mean the absolute value of the distance? Because the signed distance would have expected value of $0$ (by symmetry). Also, the way you define $\hat{k}$, it similarly has expectation $0$ by symmetry and hence their sum over $n$ steps has expectation $0$. So I suppose this isn't what you want. Can you clarify?
    – Mick A
    Commented Sep 24, 2015 at 4:57
  • @MickA Yes, I mean the standard distance, like the normal metric or distance function which only takes nonnegative values.
    – davidlowryduda
    Commented Sep 24, 2015 at 5:09
  • This might help you: math.stackexchange.com/questions/103142/…
    – Mick A
    Commented Sep 24, 2015 at 5:35
  • Considering $E(S_n^2)$, which (much like the ordinary random walk) is a lot easier to find and which turns out to be $O(n^3)$, thus making the r.m.s. distance $O(n^{3/2})$, it seems kind of natural to expect $E(|S_n|)\sim n^{3/2}$, too.
    Commented Sep 24, 2015 at 11:25
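Making the last comment's heuristic explicit (the only assumption is that $S_n$ is asymptotically normal, which the Lindeberg CLT justifies here since the largest single variance $n^2$ is negligible against the total):

$$E(S_n^2) = \sum_{k=1}^n k^2 = \frac{n(n+1)(2n+1)}{6} \sim \frac{n^3}{3},$$

and for a centered normal variable $X$ one has $E|X| = \sqrt{2\operatorname{Var}(X)/\pi}$, so

$$E|S_n| \approx \sqrt{\frac{2}{\pi}\cdot\frac{n^3}{3}} = \sqrt{\frac{2}{3\pi}}\, n^{3/2}.$$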

1 Answer


In arbitrary dimension $d$:

Let $\vec{R}$ be the end-to-end vector of a random walk of fixed step length $|\vec{r}_i| = l$, so that $\displaystyle \vec{R} = \sum_{i=1}^N \vec{r}_i$, where $\vec{r}_i$ is the vector of the $i$-th step. The root-mean-square (RMS) end-to-end distance is $\textrm{RMS}=\sqrt { \langle R^2 \rangle }$.

Since the steps are mutually independent, the covariance of two steps $\vec{r}_i$ and $\vec{r}_j$ is zero if $i\neq j$, and $\textrm{Cov}(\vec{r}_i, \ \vec{r}_j)= \textrm{Var}(\vec{r}_i)$ if $i=j$. The variance of $\vec{r}_i$ can be expressed as $ \textrm{Var}(\vec{r}_i)= \langle \vec{r}_i \cdot \vec{r}_i \rangle - \langle \vec{r}_i \rangle^2$. Due to symmetry $\langle \vec{r}_i \rangle=\vec{0}$, and therefore the variance of $\vec{r}_i$ is simply $ \textrm{Var}(\vec{r}_i)= \langle \vec{r}_i \cdot \vec{r}_i \rangle = |\vec{r}_i|^2 = l^2$. Altogether, $\textrm{Cov}(\vec{r}_i, \ \vec{r}_j)=\delta_{ij}l^2$. The covariance can also be written as $\textrm{Cov}(\vec{r}_i, \ \vec{r}_j) = \langle \vec{r}_i \cdot \vec{r}_j \rangle - \langle \vec{r}_i \rangle \cdot \langle \vec{r}_j \rangle$. Combining the two expressions for the covariance and again using $\langle \vec{r}_i \rangle=\vec{0}$ gives $\langle \vec{r}_i \cdot \vec{r}_j \rangle =\delta_{ij}l^2$. This determines the RMS:

$$\textrm{RMS}=\sqrt { \langle R^2 \rangle } = \sqrt { \langle \vec{R} \cdot \vec{R} \rangle } =\sqrt { \Big\langle \sum_{i=1}^N \vec{r}_i \cdot \sum_{j=1}^N \vec{r}_j \Big\rangle } =\sqrt { \sum_{i=1}^N \sum_{j=1}^N \langle \vec{r}_i \cdot \vec{r}_j \rangle }=\sqrt { \sum_{i=1}^N \sum_{j=1}^N \delta_{ij} l^2 }=\sqrt { \sum_{i=1}^N l^2}=\sqrt { N l^2}=l\sqrt { N }$$
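A quick Monte Carlo check of $\textrm{RMS}=l\sqrt{N}$, using an axis-aligned lattice walk as a stand-in for the fixed-step-length walk (a minimal sketch; numpy and the parameter values are my choices, not part of the derivation):

```python
import numpy as np

rng = np.random.default_rng(1)

def rms_end_to_end(N, d, l=1.0, trials=20_000):
    """Monte Carlo RMS end-to-end distance for N axis-aligned steps of length l in d dimensions."""
    axes = rng.integers(0, d, size=(trials, N))      # which axis each step moves along
    signs = rng.choice([-1.0, 1.0], size=(trials, N))
    # net displacement along each axis, per trial
    R = np.stack([(signs * (axes == i)).sum(axis=1) * l for i in range(d)], axis=1)
    return np.sqrt((R ** 2).sum(axis=1).mean())

for N in [100, 400, 1600]:
    print(N, rms_end_to_end(N, d=3), np.sqrt(N))  # the two columns should agree for l = 1
```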

Let $Z_i$ denote the $i$-th coordinate of the end-to-end vector $\vec{R}$ after $N$ steps, and let $X_i$ and $Y_i$ denote the number of steps taken in the $i$-th dimension in the positive and negative direction respectively. The set of random variables $\{X_i, Y_i\}_{i=1}^d$ then follows a multinomial distribution with $N$ trials and probabilities $\displaystyle p_i=\frac{1}{2d}$. For sufficiently large values of $N$, the $\{X_i, Y_i\}_{i=1}^d$ are approximately iid (independent and identically distributed) Poisson random variables with parameter $\displaystyle \lambda = \frac{N}{2d}$. For $\lambda > 20$, i.e. $N>40d$, $\textrm{Po}(\lambda)$ is well approximated by $\textrm{N}(\lambda, \lambda)$. Since $Z_i = l(X_i - Y_i)$, this gives $\displaystyle Z_i \sim \textrm{N}\big(l(\lambda - \lambda),\ l^2(\lambda+\lambda)\big)=\textrm{N}(0, 2l^2\lambda)=\textrm{N}\left(0, \frac{l^2N}{d}\right)$.
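As a cross-check, the variance of $Z_i$ can be computed exactly from the multinomial moments, with no Poisson step. Writing $p = \frac{1}{2d}$ and using $\textrm{Cov}(X_i, Y_i) = -Np^2$:

$$\textrm{Var}(X_i - Y_i) = \textrm{Var}(X_i) + \textrm{Var}(Y_i) - 2\,\textrm{Cov}(X_i, Y_i) = 2Np(1-p) + 2Np^2 = 2Np = \frac{N}{d},$$

so $\textrm{Var}(Z_i) = l^2\,\textrm{Var}(X_i - Y_i) = \frac{l^2 N}{d}$ exactly, matching the normal approximation above.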

$\displaystyle \langle R \rangle = \langle \sqrt{R^2} \rangle = \left\langle \sqrt{ \sum_{i=1}^d Z_i^2} \right\rangle$. The square root of the sum of squares of $k$ independent $\textrm{N}(0, 1)$-distributed random variables follows the chi distribution, $\chi_k$. Therefore $\displaystyle \sqrt{ \sum_{i=1}^d \frac{dZ_i^2}{l^2N}}$ is approximately $\chi_d$-distributed for large values of $N$. The expected value of a $\chi_k$-distributed random variable is $\displaystyle \sqrt{2} \frac{ \Gamma \left(\frac{k+1}{2}\right) }{\Gamma \left( \frac{k}{2}\right)}$.

Hence $\displaystyle \langle R \rangle =\left\langle\sqrt{ \sum_{i=1}^d Z_i^2}\right\rangle =\left\langle l \sqrt{\frac{N}{d}} \sqrt{ \sum_{i=1}^d \frac{dZ_i^2}{l^2N} }\right\rangle = l \sqrt{\frac{2N}{d} }\frac{ \Gamma \left(\frac{d+1}{2}\right) }{\Gamma \left( \frac{d}{2}\right)}$.
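To sanity-check this closed form numerically (a sketch; scipy's `gamma` supplies the $\Gamma$ factors, and the parameters are illustrative, reusing the axis-aligned walk from the earlier snippet):

```python
import numpy as np
from scipy.special import gamma

rng = np.random.default_rng(2)

def mean_distance_formula(N, d, l=1.0):
    """The closed form l * sqrt(2N/d) * Gamma((d+1)/2) / Gamma(d/2)."""
    return l * np.sqrt(2 * N / d) * gamma((d + 1) / 2) / gamma(d / 2)

def mean_distance_mc(N, d, l=1.0, trials=20_000):
    """Direct simulation of <R> for the axis-aligned walk."""
    axes = rng.integers(0, d, size=(trials, N))
    signs = rng.choice([-1.0, 1.0], size=(trials, N))
    R = np.stack([(signs * (axes == i)).sum(axis=1) * l for i in range(d)], axis=1)
    return np.sqrt((R ** 2).sum(axis=1)).mean()

for d in [1, 2, 3]:
    print(d, mean_distance_formula(1000, d), mean_distance_mc(1000, d))
```

For $d = 1$ the formula reduces to $\sqrt{2N/\pi}$, the familiar mean absolute distance of the simple $\pm 1$ walk mentioned in the question.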
