$\begingroup$

I have a question about the time complexity of finding the nth Fibonacci number using matrices.

I know that you can find $F_n$ from:

$ \begin{pmatrix} 1 & 1\\ 1 & 0\\ \end{pmatrix}^n = \begin{pmatrix} F_{n+1} & F_n\\ F_n & F_{n-1}\\ \end{pmatrix} $

From the above, $F_n$ is the entry in row 1, column 2 of the matrix. I have read in an online source that raising the Fibonacci Q-matrix to the power of $n$ takes $O(n)$ time, but I am not sure why this is the case. Is matrix multiplication an operation that can be done in constant time on computers? That is the only reason I can see for the above operation taking $O(n)$ time, since you would have to do $n-1$ multiplications to raise the Q-matrix to the power of $n$. Any insights are appreciated.
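To make the counting concrete, here is a minimal Python sketch (the function names are my own, not from any source) of the repeated-multiplication approach described above: each $2 \times 2$ product costs a constant number of scalar operations, and $n-1$ such products gives the $O(n)$ bound in question.

```python
def mat_mult(A, B):
    # One 2x2 matrix product: 8 multiplications and 4 additions,
    # i.e. a constant amount of work per call.
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0],
             A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0],
             A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def fib_linear(n):
    # Raise the Q-matrix to the power n by n-1 successive
    # multiplications: O(n) constant-time steps overall.
    Q = [[1, 1], [1, 0]]
    M = Q
    for _ in range(n - 1):
        M = mat_mult(M, Q)
    return M[0][1]  # F_n sits in row 1, column 2
```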

$\endgroup$
  • $\begingroup$ (I think you have a typo in the equation: the right-hand side should not be raised to a power?) Yes, how much time do you think it takes to compute the product of two $2 \times 2$ matrices? $\endgroup$
    – angryavian
    Commented Apr 4, 2019 at 23:34
  • $\begingroup$ Yes, my bad, I will fix it. For computing the product of two $2 \times 2$ matrices, there are 8 multiplications and 4 additions involved. Multiplication and addition take constant time, but since there are 12 operations involved, would this not mean that the time taken is O(n)? $\endgroup$
    – ceno980
    Commented Apr 4, 2019 at 23:37
  • $\begingroup$ Multiplication of two matrices of bounded size (here $2\times2$) cannot take more than constant time, just because there is no $n$. $\endgroup$
    – user65203
    Commented Apr 4, 2019 at 23:48
  • $\begingroup$ To give you a taste of how matrix powers are efficiently computed, contemplate $A^2=AA, A^4=A^2A^2,A^8=A^4A^4,A^{16}=A^8A^8$ , then for instance $A^{20}=A^{16}A^4$ is obtained in $5$ multiplies. $\endgroup$
    – user65203
    Commented Apr 4, 2019 at 23:51
  • $\begingroup$ I see, by evaluating $A^2$ which takes one multiplication, you can then evaluate $A^4$ in one multiplication as well because you know $A^2$ and so on. $\endgroup$
    – ceno980
    Commented Apr 5, 2019 at 0:01

1 Answer

$\begingroup$

Multiplying two $2\times 2$ matrices can be done with eight multiplications and four additions, so yes, that is constant time. (In fact it can be done in fewer multiplications at the cost of more additions, but the point remains.)

Multiplying a $2\times 2$ matrix by $\begin{pmatrix}1 & 1\\1 & 0\\\end{pmatrix}$ can be done with just two additions, which is even faster.

But computing $F_n$ iteratively from $F_n=F_{n-1}+F_{n-2}$ takes just one addition per new term, so why use matrices at all?

The answer is that computing $M^n$ for a matrix $M$ can actually be done using $O(\log n)$ multiplications and additions, using exponentiation by squaring. Check it out!
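As an illustration (the names are my own, not from the answer), here is a minimal Python sketch of exponentiation by squaring applied to the Q-matrix: the loop runs once per binary digit of $n$, so it performs $O(\log n)$ matrix multiplications.

```python
def mat_mult(A, B):
    # Constant-time 2x2 matrix product.
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0],
             A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0],
             A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def fib_fast(n):
    # Exponentiation by squaring: O(log n) matrix multiplications.
    result = [[1, 0], [0, 1]]       # identity matrix
    Q = [[1, 1], [1, 0]]
    while n > 0:
        if n & 1:                   # this binary digit of n is 1
            result = mat_mult(result, Q)
        Q = mat_mult(Q, Q)          # Q, Q^2, Q^4, Q^8, ...
        n >>= 1
    return result[0][1]             # F_n from row 1, column 2
```

Note, as the comments below point out, that this counts matrix multiplications, not bit operations: the entries themselves grow, so the scalar arithmetic is not truly constant-time for large $n$.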

$\endgroup$
  • $\begingroup$ Multiplication and addition are only constant-time operations if the number of digits is fixed, no? But the number of digits of $F_n$ grows linearly. $\endgroup$ Commented Apr 4, 2019 at 23:55
  • $\begingroup$ @OscarLanzi Well, it takes $O(n)$ time just to read all the digits of $F_n$. So, $O(n \log n)$ maybe? $\endgroup$ Commented Apr 5, 2019 at 3:04
