There are $n\times n$ different positions on the board. Each step moves us from one position to one of its neighboring positions. One way to represent this is as an $n^2\times n^2$ matrix, where each of the $n^2$ unit vectors corresponds to one position.
Let the unit vectors be denoted $e_{(x,y)}$ for position $(x,y)$ on the board. The transition matrix $X$ has elements
$$
X_{(x,y),(x',y')} =
\begin{cases}
1\quad\text{if you can step from $(x,y)$ to $(x',y')$}\\
0\quad\text{otherwise}
\end{cases}
$$
where, if you prefer, you can enumerate the positions $(x,y)$ on the board by indices $1,\ldots,n^2$.
E.g., if $n=2$, and we list the positions on the board in the order $(1,1)$, $(1,2)$, $(2,1)$, $(2,2)$, the unit vector corresponding to position $(1,1)$ would be $e_{(1,1)} = [1,0,0,0]$, and for the end-point $(2,2)$ it is $e_{(2,2)}=[0,0,0,1]$. The transition matrix then becomes
$$
X =
\left[\begin{matrix}
0&1&1&0\\
1&0&0&1\\
1&0&0&1\\
0&1&1&0\\
\end{matrix}\right]
$$
with 1 corresponding to all neighboring positions.
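As a sketch, this is how the transition matrix could be built in Python with NumPy (the function name `transition_matrix` and the row-major indexing $(x,y)\mapsto(x-1)n+(y-1)$ are my own choices, not from the text):

```python
import numpy as np

def transition_matrix(n):
    """Build the n^2 x n^2 transition matrix X for an n x n board,
    with position (x, y) indexed row-major as (x-1)*n + (y-1)."""
    X = np.zeros((n * n, n * n), dtype=np.int64)
    for x in range(n):          # 0-based coordinates internally
        for y in range(n):
            i = x * n + y
            # neighbours: one step up/down/left/right, staying on the board
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < n and 0 <= ny < n:
                    X[i, nx * n + ny] = 1
    return X

print(transition_matrix(2))  # reproduces the 4 x 4 matrix above
```

For $n=2$ this yields exactly the matrix shown above, since the chosen indexing lists the positions in the same order $(1,1)$, $(1,2)$, $(2,1)$, $(2,2)$.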
Now, if we start at position $(1,1)$, represented by state $e_{(1,1)}$, taking one step corresponds to multiplying by $X$. The result is a vector where the coefficient for $e_{(x,y)}$ is the number of paths ending at position $(x,y)$. For each new step, we multiply by $X$. So, if we take $s$ steps, it corresponds to multiplying by $X^s$. Thus,
$$
e_{(1,1)}\cdot X^s = \sum_{(x,y)} a(x,y,s)\, e_{(x,y)}
$$
where $a(x,y,s)$ is the number of ways to get from $(1,1)$ to $(x,y)$ in exactly $s$ steps.
Multiplying two $n^2\times n^2$ matrices is an $O(n^6)$ operation (although faster methods exist), so this might not look like an improvement. However, computing $X^s$ requires only $O(\ln s)$ multiplications using the square-and-multiply approach, so in total this takes $O(n^6\cdot\ln s)$ time.
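The square-and-multiply step can be sketched as follows, here hard-coding the $n=2$ matrix from above (the helper `matrix_power` is illustrative; NumPy's `np.linalg.matrix_power` does the same thing):

```python
import numpy as np

# Transition matrix for the 2 x 2 board, positions ordered
# (1,1), (1,2), (2,1), (2,2) as in the text.
X = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=np.int64)

def matrix_power(X, s):
    """Compute X^s with square-and-multiply: O(log s) multiplications."""
    result = np.eye(X.shape[0], dtype=X.dtype)
    base = X.copy()
    while s > 0:
        if s & 1:            # current binary digit of s is 1
            result = result @ base
        base = base @ base   # square
        s >>= 1
    return result

# e_{(1,1)} X^2: number of 2-step paths from (1,1) to each position
e = np.array([1, 0, 0, 0])
print(e @ matrix_power(X, 2))  # → [2 0 0 2]
```

The output says there are two 2-step paths from $(1,1)$ back to $(1,1)$ and two to $(2,2)$, and none to the other positions, as a quick check by hand confirms.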
Since $X$ is symmetric, it can be diagonalised as $X=UDU^T$, i.e. with $D$ diagonal and $UU^T=I$, which makes $X^s=UD^sU^T$. This allows us to express
$$
a(n,n,s) = e_{(1,1)}\, X^s\, e_{(n,n)}^T
= e_{(1,1)}\, U\, D^s\, U^T\, e_{(n,n)}^T
$$
Computing the diagonalisation takes $O(n^6)$ time (if I remember correctly), and computing $D^s$ only requires raising the $n^2$ diagonal values (i.e. the eigenvalues of $X$) to the $s$th power, which takes $O(n^2\cdot\ln s)$ time using square-and-multiply on each value.
You may note that this is actually a closed-form formula, although one that requires carrying out the diagonalisation of $X$ first. This is helpful if $n$ is fixed while $s$ increases, less so if $n$ grows.
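A minimal sketch of the diagonalisation approach for the $n=2$ case, using NumPy's `eigh` (which handles symmetric matrices and returns an orthogonal $U$); the function name `a` mirrors the $a(n,n,s)$ notation above:

```python
import numpy as np

# Transition matrix for the 2 x 2 board, ordered (1,1), (1,2), (2,1), (2,2).
X = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)

# Since X is symmetric, eigh returns real eigenvalues d and an
# orthogonal U with X = U @ np.diag(d) @ U.T.
d, U = np.linalg.eigh(X)

def a(s):
    """a(n, n, s) for n = 2: s-step paths from (1,1) to (2,2),
    computed as e_{(1,1)} U D^s U^T e_{(2,2)}^T."""
    e_start = np.array([1.0, 0.0, 0.0, 0.0])  # e_{(1,1)}
    e_end = np.array([0.0, 0.0, 0.0, 1.0])    # e_{(2,2)}
    return e_start @ U @ np.diag(d ** s) @ U.T @ e_end

print(round(a(2)))  # → 2, matching e_{(1,1)} X^2 computed directly
```

The decomposition is computed once; each subsequent $s$ only costs the cheap $D^s$ step, which is the point of the closed-form view.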