
Consider an $n \times n$ chessboard where a path starts at the bottom-left corner $(1, 1)$ and ends at the top-right corner $(n, n)$. How many distinct paths use exactly $k$ steps? The permissible moves are up, down, left, and right, and paths are allowed to self-intersect. I am aware that a dynamic programming solution exists with time complexity $O(k n^2)$, but I would like to know whether there is an analytical solution.

Let $a(i, j, s)$ denote the number of ways to travel from $(1, 1)$ to $(i, j)$ in exactly $s$ steps, allowing for self-intersections.

The recurrence relation for this problem is given by: $$ a(i, j, s) = a(i - 1, j, s - 1) + a(i + 1, j, s - 1) + a(i, j - 1, s - 1) + a(i, j + 1, s - 1) $$ assuming no boundary violations.

The initial condition is set as follows: $$ a(1, 1, 0) = 1, \quad a(i, j, 0) = 0 \quad \text{for} \quad (i, j) \neq (1, 1) $$

The objective is to compute $a(n, n, k)$. Is there an analytical solution or an algorithmic approach that is faster than $O(n^2 k)$? Any references covering this topic would be appreciated.
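
For reference, here is a minimal sketch of the $O(k n^2)$ dynamic programming approach described above (the function name `count_paths_dp` and the 0-based storage are just illustrative choices):

```python
def count_paths_dp(n, k):
    """a(n, n, k): number of k-step walks from (1,1) to (n,n), O(k * n^2) DP."""
    # a[i][j] holds a(i+1, j+1, s) for the current step count s.
    a = [[0] * n for _ in range(n)]
    a[0][0] = 1                                   # a(1,1,0) = 1
    for _ in range(k):
        b = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                # a(i, j, s) = sum of a(neighbour, s-1) over in-board neighbours
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    pi, pj = i + di, j + dj
                    if 0 <= pi < n and 0 <= pj < n:
                        b[i][j] += a[pi][pj]
        a = b
    return a[n - 1][n - 1]
```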

  • For fixed $n$ you can write down a large ($n^2 \times n^2$) adjacency matrix; a particular entry of its $k^{\text{th}}$ power solves this problem. This can be computed using binary exponentiation, so you get something like $O(\log k)$ as a function of $k$, but as a tradeoff I suspect the complexity as a function of $n$ is worse (I don't know how much worse). Commented Sep 10, 2023 at 6:45

2 Answers


Yes, you can compute this using $O(k+k^2/n^2)$ integer additions and multiplications. In the regime where $n$ grows faster than $k^{1/4}$, my method is asymptotically superior to the $O(kn^2)$ dynamic programming method.

First, let us ignore the boundaries, and let $N_k(x,y)$ be the number of paths from $(0,0)$ to $(x,y)$ in the infinite 2D square grid, using exactly $k$ steps of up, left, down, and right. You can prove that $$ N_k(x,y)=\binom{k}{\frac12(k+x+y)}\binom{k}{\frac12(k+x-y)} $$ if and only if $k+x+y$ is even. If $k+x+y$ is odd, there are zero paths. You can pre-compute the entire list of binomial coefficients $\{\binom{k}{j}\}_{j=0}^k$ with $O(k)$ multiplications, and then $N_k(x,y)$ can be computed with two lookups and a multiplication.
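
As a sketch of that pre-computation (the helper names `binomial_row` and `unrestricted_walks` are my own, just for illustration):

```python
def binomial_row(k):
    """Pre-compute [C(k,0), ..., C(k,k)] with O(k) multiplications."""
    row = [1]
    for j in range(k):
        row.append(row[-1] * (k - j) // (j + 1))
    return row

def unrestricted_walks(k, x, y, row):
    """N_k(x, y): k-step walks on the infinite grid from (0,0) to (x,y),
    where `row` is binomial_row(k)."""
    if (k + x + y) % 2 != 0 or abs(x) + abs(y) > k:
        return 0
    return row[(k + x + y) // 2] * row[(k + x - y) // 2]
```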

There is then a tricky formula, based on the reflection principle for these lattice walks, which allows you to convert the above formula to the case of walks within a finite $n\times n$ grid. First, some definitions.

  • Let $(x_1,y_1)$ and $(x_2,y_2)$ be two points in the $n\times n$ grid, meaning $x_1,x_2,y_1,y_2\in \{1,\dots,n\}$.

  • Let $C_\ell$ and $C_r$ be the columns of the infinite grid which form the left and right borders of the $n\times n$ square. Explicitly, $C_\ell=\{(0,y)\mid y\in \mathbb Z\}$, while $C_r=\{(n+1,y)\mid y\in \mathbb Z\}$.

  • Let $R_t$ and $R_b$ be the rows of the infinite grid which form the top and bottom borders of the $n\times n$ square. Explicitly, $R_b=\{(x,0)\mid x\in \mathbb Z\}$, while $R_t=\{(x,n+1)\mid x\in \mathbb Z\}$.

  • Let $S$ be the set of squares obtained by reflecting $(x_2,y_2)$ a finite number of times, where each reflection is through one of the borders of the grid: $C_\ell$, $C_r$, $R_t$, or $R_b$.

  • For each $(x,y)\in S$, let $d(x,y)$ be the minimum number of reflections it took to reach $(x,y)$.

  • Let $N^\text{boundary}_{n,k}((x_1,y_1),(x_2,y_2))$ be the number of walks with $k$ steps from $(x_1,y_1)$ to $(x_2,y_2)$ within the $n\times n$ grid.

You can then prove that $$ N^\text{boundary}_{n,k}((x_1,y_1),(x_2,y_2))=\sum_{(x,y)\in S} (-1)^{d(x,y)}\cdot N_k(x-x_1,y-y_1) $$ This is kind of like the principle of inclusion-exclusion. When $(x,y)=(x_2,y_2)$, so $d(x,y)=0$, we are adding up all of the paths to our destination, ignoring the boundaries. We then subtract the number of unrestricted paths to all reflections of $(x_2,y_2)$ through a single boundary, because these are in bijection with "bad" paths which actually hit the boundary wall, via the reflection principle. We then add back the double reflections, subtract the triple reflections, etc.

Note that, while $S$ is technically an infinite set, there are only $O(k^2/n^2)$ points in $S$ for which $N_k(x-x_1,y-y_1)$ is nonzero, because every reachable point has $|x-x_1|+|y-y_1|\le k$, and there is only one reflected point in each $n\times n$ reflected copy of the grid. Since we can compute each $N_k(x,y)$ with a constant number of multiplications (after an $O(k)$ pre-processing step), this means we can compute that final sum using $O(k^2/n^2)$ multiplications.
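
Putting this together, here is a minimal sketch of the whole computation, reusing the hypothetical helpers from the previous snippet. It relies on the observation (an assumption of this sketch, not spelled out above) that the images of $(x_2,y_2)$ under repeated reflections through the four walls are exactly the points $(\pm x_2 + 2m(n+1),\ \pm y_2 + 2\ell(n+1))$ for integers $m,\ell$, with $(-1)^{d}$ equal to the product of the two sign choices:

```python
def walks_in_box(n, k, start, end):
    """k-step walks from `start` to `end` that never leave the n x n grid,
    via the reflection formula above."""
    x1, y1 = start
    x2, y2 = end
    row = binomial_row(k)                 # O(k) pre-processing
    period = 2 * (n + 1)                  # reflections through 0 and n+1 repeat with this period
    m_max = k // period + 1               # farther images are unreachable in k steps
    total = 0
    for mx in range(-m_max, m_max + 1):
        for x_img, sx in ((x2 + mx * period, 1), (-x2 + mx * period, -1)):
            for my in range(-m_max, m_max + 1):
                for y_img, sy in ((y2 + my * period, 1), (-y2 + my * period, -1)):
                    total += sx * sy * unrestricted_walks(k, x_img - x1, y_img - y1, row)
    return total

# walks_in_box(n, k, (1, 1), (n, n)) then gives a(n, n, k).
```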


Note that while there are $O(k+k^2/n^2)$ arithmetic operations, the actual time complexity of this method is $\tilde O(k^2+k^3/n^2)$, ignoring log factors. This is because a single multiplication costs $\tilde O(k)$: the numbers involved have $O(k)$ digits (the largest element in the $k^\text{th}$ row of Pascal's triangle is $\sim 2^k/\sqrt k$). Similarly, the dynamic programming time complexity is actually $O(n^2k^2)$ once you incorporate the cost of adding the $O(k)$-digit integers in each DP cell.


There are $n\times n$ different positions on the board. Each step allows us to walk from one position to one of the neighboring positions. One way to represent this is as an $n^2\times n^2$ matrix where the $n^2$ unit vectors correspond to each position.

Let the unit vectors be denoted $e_{(x,y)}$ for position $(x,y)$ on the board. The transition matrix $X$ has elements $$ X_{(x,y),(x',y')} = \begin{cases} 1\quad\text{if you can step from $(x,y)$ to $(x',y')$}\\ 0\quad\text{otherwise} \end{cases} $$ where you might want to enumerate the positions $(x,y)$ on the board by indexes $1,\ldots,n^2$ if you prefer.

E.g., if $n=2$, and we list the positions on the board in the order $(1,1)$, $(1,2)$, $(2,1)$, $(2,2)$, the unit vector corresponding to position $(1,1)$ would be $e_{(1,1)} = [1,0,0,0]$, and for the end-point $(2,2)$ it is $e_{(2,2)}=[0,0,0,1]$. The transition matrix then becomes $$ X = \left[\begin{matrix} 0&1&1&0\\ 1&0&0&1\\ 1&0&0&1\\ 0&1&1&0\\ \end{matrix}\right] $$ with 1 corresponding to all neighboring positions.

Now, if we start at position $(1,1)$, represented by state $e_{(1,1)}$, taking one step corresponds to multiplying by $X$. The result is a vector where the coefficient for $e_{(x,y)}$ is the number of paths ending at position $(x,y)$. For each new step, we multiply by $X$. So, if we take $s$ steps, it corresponds to multiplying by $X^s$. Thus, $$ e_{(1,1)}\cdot X^s = \sum_{(x,y)} a(x,y,s)\, e_{(x,y)} $$ where $a(x,y,s)$ is the number of ways to get from $(1,1)$ to $(x,y)$ in exactly $s$ steps.

Multiplying two $n^2\times n^2$ matrices is an $O(n^6)$ operation (although there are faster methods), so this might not look like an improvement. However, computing $X^s$ requires $O(\ln s)$ multiplications using the square-and-multiply approach, so in total this takes time $O(n^6\cdot\ln s)$.
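
A minimal sketch of this matrix-power approach, using exact Python integers and plain square-and-multiply (the function names are just illustrative choices):

```python
def transition_matrix(n):
    """Adjacency matrix X of the n x n grid; position (x, y) gets index (x-1)*n + (y-1)."""
    N = n * n
    X = [[0] * N for _ in range(N)]
    for x in range(1, n + 1):
        for y in range(1, n + 1):
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 1 <= nx <= n and 1 <= ny <= n:
                    X[(x - 1) * n + (y - 1)][(nx - 1) * n + (ny - 1)] = 1
    return X

def mat_mult(A, B):
    """Exact integer matrix product."""
    N = len(A)
    return [[sum(A[i][t] * B[t][j] for t in range(N)) for j in range(N)] for i in range(N)]

def mat_pow(X, s):
    """X^s by square-and-multiply: O(log s) matrix multiplications."""
    N = len(X)
    R = [[int(i == j) for j in range(N)] for i in range(N)]   # identity matrix
    while s:
        if s & 1:
            R = mat_mult(R, X)
        X = mat_mult(X, X)
        s >>= 1
    return R

def count_paths_power(n, k):
    """a(n, n, k) as the ((1,1), (n,n)) entry of X^k."""
    return mat_pow(transition_matrix(n), k)[0][n * n - 1]
```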

Since $X$ is symmetric, it can be diagonalised as $X=UDU^T$, i.e. $D$ diagonal and $UU^T=I$, which makes $X^s=UD^sU^T$. This allows us to express $$ a(n,n,s) = e_{(1,1)}\, X^s\, e_{(n,n)}^T = e_{(1,1)}\, U\, D^s\, U^T\, e_{(n,n)}^T $$ Computing the diagonalisation takes $O(n^6)$ time if I remember correctly, and computing $D^s$ only involves raising the diagonal values (i.e. the eigenvalues of $X$) to the $s$-th power, which takes $O(n^2\cdot\ln s)$ time using square-and-multiply.

You may note that this is actually a closed-form formula, although one that requires carrying out the diagonalisation of $X$ first. This is helpful if $n$ is fixed while $s$ increases, less so if $n$ grows.
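
A floating-point sketch of the diagonalisation route (only illustrative: for large $s$ the counts exceed double precision, so exact arithmetic on the eigenvalues would be needed in practice; `transition_matrix` is the helper from the previous snippet):

```python
import numpy as np

def count_paths_eig(n, s):
    """Approximate a(n, n, s) via X = U D U^T and X^s = U D^s U^T."""
    X = np.array(transition_matrix(n), dtype=float)
    w, U = np.linalg.eigh(X)                      # X symmetric => real eigenvalues, orthogonal U
    e_start = np.zeros(n * n); e_start[0] = 1.0   # e_(1,1)
    e_end = np.zeros(n * n); e_end[-1] = 1.0      # e_(n,n)
    return (e_start @ U) @ np.diag(w ** s) @ (U.T @ e_end)
```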

  • I find it interesting that all three algorithms here are sometimes optimal. Let's say multiplying two $n^2\times n^2$ matrices takes $O(n^5)$ time. In the regime where $n$ grows with $k$ at a rate slower than $k^{1/5}$, your method is fastest. In the regime where $n$ grows faster than $k^{1/4}$, my method is fastest. In the intermediate region, OP's dynamic programming method wins. Commented Sep 11, 2023 at 18:42
