158
$\begingroup$

Show that the determinant of a matrix $A$ is equal to the product of its eigenvalues $\lambda_i$.

So I'm having a tough time figuring this one out. I know that I have to work with the characteristic polynomial of the matrix $\det(A-\lambda I)$. But, when considering an $n \times n$ matrix, I do not know how to work out the proof. Should I just use the determinant formula for any $n \times n$ matrix? I'm guessing not, because that is quite complicated. Any insights would be great.

$\endgroup$
6
  • 4
    $\begingroup$ This is only true if there are $n$ distinct eigenvalues. In that case, you will have a diagonalisation of the matrix, so it is immediate from the multiplicative property of $\det$. $\endgroup$ Commented Jan 22, 2015 at 5:45
  • 1
    $\begingroup$ @user1537366 are you saying that this is not necessarily true in cases where the eigenvalues have multiplicity > 1? $\endgroup$
    – makansij
    Commented Nov 1, 2017 at 4:29
  • 1
    $\begingroup$ @user1537366, is it the product of all eigenvalues, or only a product of the set of distinct eigenvalues? Thank you. $\endgroup$
    – makansij
    Commented Aug 26, 2018 at 22:47
  • 1
    $\begingroup$ The statement in the question was correct. The product of all eigenvalues (repeated ones counted multiple times) is equal to the determinant of the matrix. $\endgroup$
    – inavda
    Commented Mar 23, 2019 at 20:40
  • 2
    $\begingroup$ @inavda I meant $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$. $\endgroup$ Commented Jun 1, 2019 at 18:51

8 Answers

244
$\begingroup$

Suppose that $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$. Then the $\lambda$s are also the roots of the characteristic polynomial, i.e.

$$\begin{array}{rcl} \det (A-\lambda I)=p(\lambda)&=&(-1)^n (\lambda - \lambda_1 )(\lambda - \lambda_2)\cdots (\lambda - \lambda_n) \\ &=&(-1) (\lambda - \lambda_1 )(-1)(\lambda - \lambda_2)\cdots (-1)(\lambda - \lambda_n) \\ &=&(\lambda_1 - \lambda )(\lambda_2 - \lambda)\cdots (\lambda_n - \lambda) \end{array}$$

The first equality follows from the factorization of a polynomial given its roots; the leading (highest degree) coefficient $(-1)^n$ can be obtained by expanding the determinant along the diagonal.

Now, by setting $\lambda$ to zero (which we may do, since the identity holds for every value of $\lambda$) we get on the left side $\det(A)$, and on the right side $\lambda_1 \lambda_2\cdots\lambda_n$; that is, we indeed obtain the desired result

$$ \det(A) = \lambda_1 \lambda_2\cdots\lambda_n$$

So the determinant of the matrix is equal to the product of its eigenvalues.
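
As a quick sanity check, here is a minimal numerical sketch of the result, assuming NumPy and an arbitrary random example matrix of my own choosing (not part of the original argument):

```python
# Minimal numerical sketch: det(A) vs. the product of the eigenvalues.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))        # arbitrary real 4x4 example

eigenvalues = np.linalg.eigvals(A)     # roots of det(A - lambda*I)
product = np.prod(eigenvalues)         # complex in general; for a real A the
                                       # imaginary parts cancel up to round-off

assert np.isclose(np.linalg.det(A), product.real)
print(np.linalg.det(A), product)
```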

$\endgroup$
11
  • 16
    $\begingroup$ Interesting, but don't you also have to show that the leading coefficient of the polynomial is 1? Or is that obvious? $\endgroup$
    – DanielV
    Commented Sep 28, 2013 at 9:05
  • 21
    $\begingroup$ $\det(A-\lambda I)=(\lambda_1 - \lambda)^{m_1}(\lambda_2 - \lambda)^{m_2}\cdots (\lambda_k - \lambda)^{m_k}$, so $\det(A) = \lambda_1^{m_1} \lambda_2^{m_2}\cdots\lambda_k^{m_k}$, where $\lambda_1, \dots, \lambda_k$ are the distinct eigenvalues and $m_i$ is the multiplicity of $\lambda_i$. $\endgroup$
    – mohamez
    Commented Jan 14, 2014 at 7:39
  • 4
    $\begingroup$ @omar "The leading (highest degree) coefficient $(-1)^{n}$ can be obtained by expanding the determinant along the diagonal." A quick Google search suggests that expanding a determinant along the diagonal isn't actually possible, however! Regardless, where does this $(-1)^{n}$ come from? $\endgroup$
    – Muno
    Commented Jul 14, 2016 at 1:11
  • 5
    $\begingroup$ @Muno: I believe that "expanding the determinant along the diagonal" refers to the permutation method of computing the determinant. This gives a polynomial with $n!$ terms, but only one of them contributes the $n$th power of $\lambda$: the one whose factors all come from the diagonal. $\endgroup$ Commented Jan 29, 2018 at 20:18
  • 3
    $\begingroup$ I was confused about how to justify the coefficient $(-1)^n$ and understood it like this: we know $p(\lambda)$ factors as $(\lambda-\lambda_1) \cdots (\lambda-\lambda_n)$. But if we multiply this out, the coefficient of $\lambda^n$ is $1$. This is not what we get when we compute $\det(A-\lambda I)$, because there we get $(-1)^n \lambda^n$. So we need to add the factor $(-1)^n$ manually so that our factorization is correct. $\endgroup$
    – mdcq
    Commented Apr 10, 2018 at 12:29
15
$\begingroup$

I am a beginning Linear Algebra learner and this is just my humble opinion.

One idea presented above is that

Suppose that $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$.

Then the $\lambda$s are also the roots of the characteristic polynomial, i.e.

$$\det(A-\lambda I)=(\lambda_1-\lambda)(\lambda_2-\lambda)\cdots(\lambda_n-\lambda).$$

Now, by setting $\lambda$ to zero (which we may do, since the identity holds for every value of $\lambda$) we get on the left side $\det(A)$, and on the right side $\lambda_1\lambda_2\cdots\lambda_n$; that is, we indeed obtain the desired result

$$\det(A)=\lambda_1\lambda_2\cdots\lambda_n.$$

I don't think that this works in general, but only in the case $\det(A) = 0$.

Because, when we write down the characteristic equation, we use the relation $\det(A - \lambda I) = 0$. Following the same logic, the only case where $\det(A - \lambda I) = \det(A) = 0$ is $\lambda = 0$. The relation $\det(A - \lambda I) = 0$ must be obeyed even in the special case $\lambda = 0$, which implies $\det(A) = 0$.

UPDATED POST

Here I propose a way to prove the theorem in the $2 \times 2$ case. Let $A$ be a $2 \times 2$ matrix.

$$ A = \begin{pmatrix} a_{11} & a_{12}\\ a_{21} & a_{22}\\\end{pmatrix}$$

The idea is to use a certain property of determinants,

$$ \begin{vmatrix} a_{11} + b_{11} & a_{12} \\ a_{21} + b_{21} & a_{22}\\\end{vmatrix} = \begin{vmatrix} a_{11} & a_{12}\\ a_{21} & a_{22}\\\end{vmatrix} + \begin{vmatrix} b_{11} & a_{12}\\b_{21} & a_{22}\\\end{vmatrix}$$

Let $\lambda_1$ and $\lambda_2$ be the two eigenvalues of the matrix $A$. (The eigenvalues can be distinct or repeated, real or complex; it doesn't matter.)

The two eigenvalues $\lambda_1$ and $\lambda_2$ must satisfy the following condition :

$$\det (A - \lambda I) = 0, $$ where $\lambda$ is an eigenvalue of $A$.

Therefore, $$\begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = 0 $$

Using the property of determinants stated above, I will decompose this determinant into parts.

$$\begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} - \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda\\\end{vmatrix}= \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22}\\\end{vmatrix} - \begin{vmatrix} a_{11} & a_{12} \\ 0 & \lambda \\\end{vmatrix}-\begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda\\\end{vmatrix}$$

The final determinant can be further reduced.

$$ \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} \\\end{vmatrix} - \begin{vmatrix} \lambda & 0\\ 0 & \lambda\\\end{vmatrix} $$

Substituting the final determinant, we will have

$$ \begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22}\\\end{vmatrix} - \begin{vmatrix} a_{11} & a_{12} \\ 0 & \lambda \\\end{vmatrix} - \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} \\\end{vmatrix} + \begin{vmatrix} \lambda & 0\\ 0 & \lambda\\\end{vmatrix} = 0 $$

In a polynomial $$ a_{n}\lambda^n + a_{n-1}\lambda^{n-1} + \cdots + a_{1}\lambda + a_{0} = 0, $$ the product of the roots is $(-1)^n a_0 / a_n$. Here $n = 2$ and, from the decomposition, the coefficient of $\lambda^2$ is $a_2 = 1$, so the product of the roots is simply the constant term $a_{0}$.

From the decomposed determinant, the only term which doesn't involve $\lambda$ would be the first term

$$ \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\\end{vmatrix} = \det (A) $$

Therefore, the product of the roots, i.e. the product of the eigenvalues of $A$, equals the determinant of $A$.

I am having difficulty generalizing this proof idea to the $n \times n$ case, though, as it is complex and time-consuming for me.
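
As a check on the $2 \times 2$ argument above, here is a small symbolic sketch, assuming SymPy and generic symbols $a_{11}, \dots, a_{22}$ (my own illustration, not part of the original answer):

```python
# Symbolic sketch of the 2x2 case: the constant term of det(A - lambda*I)
# (the lambda-free part of the decomposition above) is exactly det(A).
import sympy as sp

a11, a12, a21, a22, lam = sp.symbols('a11 a12 a21 a22 lamda')
A = sp.Matrix([[a11, a12], [a21, a22]])

p = (A - lam * sp.eye(2)).det().expand()   # characteristic polynomial in lam
constant_term = p.subs(lam, 0)

assert sp.simplify(constant_term - A.det()) == 0
print(p)
```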

$\endgroup$
3
  • 9
    $\begingroup$ In the previously given proof, the fact that $p(\lambda) = 0$ was never used. Hence it is valid for all $A$, not just those with $\det(A) = 0$. $\endgroup$
    – Tyg13
    Commented Apr 1, 2017 at 23:57
  • $\begingroup$ The first part seems accurate as long as the characteristic polynomial can be factored into linear terms as you did. This always happens over $\mathbb{C}$ (due to the Fundamental Theorem of Algebra), but not always over $\mathbb{R}$. What happens if $p(\lambda)$ cannot be factored into linear terms? Does the formula still hold? $\endgroup$
    – Marra
    Commented Oct 4, 2018 at 23:54
  • $\begingroup$ Tyg has already clarified, but I want to add one more thought. If $Av = \lambda v$, then $(A-\lambda I)v=0$. When a matrix sends a nonzero vector $v$ to $0$, it is singular. So here $(A-\lambda I)$ is singular, which means $\det(A-\lambda I)$ is $0$. This holds true for ALL $A$ which have $\lambda$ as an eigenvalue. Though onimoni's brilliant deduction did not use the fact that the determinant is $0$, (s)he could have used it, and whatever results/theorems came out of it would hold for all $A$ (e.g., given the above situation, prove that at least one of the eigenvalues must be $0$). $\endgroup$
    – Allohvk
    Commented Sep 6, 2021 at 16:05
10
$\begingroup$

From the eigendecomposition

$A = S \Lambda S^{-1}$, where $\Lambda$ is the diagonal matrix formed from the eigenvalues of $A$,

$\implies \det(A) = \det(S)\,\det(\Lambda)\,\det(S^{-1})$

$\implies \det(A) = \det(\Lambda)$

and $\det(\Lambda)$ is nothing but $\lambda_1\lambda_2\cdots\lambda_n$.
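
A numerical sketch of this argument, assuming NumPy and a diagonalizable example matrix of my own (the diagonalizability assumption is exactly what the comments below discuss):

```python
# Sketch: verify A = S @ Lam @ inv(S) and det(A) = det(Lam) = prod(eigenvalues)
# for a symmetric (hence diagonalizable) example matrix.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigvals, S = np.linalg.eig(A)          # columns of S are eigenvectors
Lam = np.diag(eigvals)

assert np.allclose(A, S @ Lam @ np.linalg.inv(S))
assert np.isclose(np.linalg.det(A), np.prod(eigvals))
```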

$\endgroup$
4
  • 1
    $\begingroup$ Is this shown by saying $\det(S) = 1/\det(S^{-1})$? $\endgroup$
    – makansij
    Commented Nov 1, 2017 at 4:27
  • $\begingroup$ Indeed. The determinant satisfies $\det(AB) = \det(A)\det(B)$ for any two $n\times n$ matrices $A, B$. Since $S S^{-1}=I$, we get $\det(S)\det(S^{-1})= \det(I) = 1$. $\endgroup$
    – eduard
    Commented Nov 11, 2017 at 14:06
  • 8
    $\begingroup$ but this decomposition only exists if $A$ is diagonalizable $\endgroup$
    – user519338
    Commented Apr 1, 2018 at 21:01
  • 3
    $\begingroup$ As noted above, this does not work if $A$ is not diagonalizable. Instead, you could use a square matrix $T$ over the algebraic closure of the given field s.t. $T^{-1}AT$ has Jordan form. $\endgroup$
    – qwertz
    Commented Jul 9, 2018 at 14:06
7
$\begingroup$

Instead of assuming that the matrix is diagonalisable, as done in some of the previous answers, we can use the Jordan form. That is, every matrix (over $\mathbb{C}$, or any field containing its eigenvalues) has an associated Jordan form, reached through a similarity transformation: $$J=M^{-1}AM$$ for a certain invertible $M$. Since $J$ is triangular, its determinant is simply the product of its diagonal entries, which also happen to be the eigenvalues of $A$. That is, $$\det(J)=\prod_i{\lambda_i(A)},$$ where $\lambda_i(A)$ denotes the $i$th eigenvalue of $A$.

Taking determinants, we have $$\det(J)=\det(M^{-1})\det(A)\det(M).$$ The determinants of $M$ and $M^{-1}$ cancel to give $1$, and so $$\det(J)=\det(A).$$ Combining this with the previous displayed equation gives the result: $$\det(A)=\prod_i{\lambda_i(A)}=\lambda_1\lambda_2\cdots\lambda_n.$$

TL;DR:

$\det(A)=\det(M^{-1})\det(J)\det(M)$

$\implies \det(A) = \det(J)$.

And since $J$ is triangular and has the eigenvalues of $A$ along its diagonal, $\det(A)=\det(J)=\lambda_1\lambda_2\cdots\lambda_n$.
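
A small symbolic sketch of this Jordan-form route, assuming SymPy and a deliberately non-diagonalizable example matrix (my own choice of example):

```python
# Sketch: the Jordan form of a non-diagonalizable matrix still gives
# det(A) = det(J) = product of the diagonal (eigenvalue) entries of J.
import sympy as sp

A = sp.Matrix([[2, 1],
               [0, 2]])                # eigenvalue 2 with multiplicity 2

M, J = A.jordan_form()                 # A = M * J * M**(-1)
assert sp.simplify(A - M * J * M.inv()) == sp.zeros(2, 2)
assert A.det() == J.det() == J[0, 0] * J[1, 1]
```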

$\endgroup$
5
$\begingroup$

The approach I would use is to decompose the matrix into three matrices based on the eigenvalues.

Then you know that $\det(AB) = \det(A)\det(B)$, and that $\det(A^{-1}) = \dfrac{1}{\det(A)}$.

You can probably fill in the rest of the details from the article, depending on how rigorous your proof needs to be.

Edit: I just realized this won't work on all matrices, but it might give you an idea of an approach.
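
The two determinant identities mentioned above are easy to check numerically; here is a sketch assuming NumPy and arbitrary random matrices (my own illustration):

```python
# Sketch: det(A @ B) = det(A) * det(B) and det(inv(A)) = 1 / det(A).
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))        # random Gaussian matrices are
B = rng.standard_normal((3, 3))        # invertible with probability 1

assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))
assert np.isclose(np.linalg.det(np.linalg.inv(A)),
                  1.0 / np.linalg.det(A))
```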

$\endgroup$
4
  • 1
    $\begingroup$ Why not? Any matrix has some "eigendecomposition": Schur, Jordan,... $\endgroup$ Commented Sep 29, 2013 at 11:20
  • 2
    $\begingroup$ I liked this. I did it this way, but is this a correct proof? a) If matrix $A$ has linearly independent columns: $$A=SDS^{-1};$$ now take $\det$ of both sides: $$\det(A)=\det(SDS^{-1})=\det(S)\det(D)\det(S^{-1})=\det(D),$$ and $\det(D)$ is just the product of all $\lambda_i$. b) If matrix $A$ has linearly dependent columns, then $$\det(A)=0,$$ but what are its eigenvalues? $\endgroup$
    – jacob
    Commented Mar 8, 2014 at 7:11
  • 2
    $\begingroup$ @jacob Having linearly independent columns does not imply diagonalisable... $\endgroup$ Commented Jan 22, 2015 at 5:47
  • $\begingroup$ His idea is good but needs more argument: the $D$ will be a Jordan block matrix. Such matrices are almost diagonal. $\endgroup$
    – Kori
    Commented Sep 25, 2016 at 2:07
3
$\begingroup$

You must know the following:

== If we take an extension of the base field, then both the determinant and the trace of a (square) matrix remain unchanged when evaluated over the new field.

== Take a splitting field of the characteristic polynomial of $\;A\;$ and calculate this matrix's Jordan canonical form over that field. Since the latter is a triangular matrix, its determinant is the product of the entries on its main diagonal, and we know that the eigenvalues of $\;A\;$ appear on this diagonal, so we're done.
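
To illustrate the field-extension point, here is a sketch with NumPy, using the rotation matrix mentioned in the comments under the question: it has no real eigenvalues, but over $\mathbb{C}$ its eigenvalues are $\pm i$ and their product is still $\det(A) = 1$.

```python
# Sketch: a real matrix with no real eigenvalues; over C the product of
# its (complex) eigenvalues still equals its determinant.
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])            # 90-degree rotation; eigenvalues +/- i

eigvals = np.linalg.eigvals(A)
assert np.isclose(np.prod(eigvals).real, np.linalg.det(A))   # both equal 1
```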

$\endgroup$
2
$\begingroup$

I think this is right...

Write $$A = \begin{pmatrix}a_{11} & a_{12}&\cdots&a_{1n}\\\ \vdots & \vdots&&\vdots\\\ a_{n1}& a_{n2}&\cdots& a_{nn} \end{pmatrix}$$

Let the eigenvalues of $A$ be $\lambda_1, \dots , \lambda_n$ (not necessarily distinct). Finally, denote the characteristic polynomial of $A$ by $p(\lambda) = |\lambda I - A| = \lambda^n + c_{n-1}\lambda^{n-1} + \cdots + c_1\lambda + c_0$.

Note that since the eigenvalues of $A$ are the roots of $p(\lambda)$, this implies that $p(\lambda)$ can be factorised as $p(\lambda) = (\lambda - \lambda_1)\cdots(\lambda - \lambda_n)$. Consider the constant term of $p(\lambda)$, $c_0$. The constant term of $p(\lambda)$ is given by $p(0)$, which can be calculated in two ways:

Firstly, $p(0) = (0 - \lambda_1)\cdots(0 - \lambda_n) = (-1)^n\lambda_1 \cdots \lambda_n$.

Secondly, $p(0) = |0I - A| = |-A| = (-1)^n |A|$.

Therefore $c_0 = (-1)^n\lambda_1 \cdots \lambda_n = (-1)^n |A|$, and so $\lambda_1 \cdots \lambda_n = |A|$.

That is, the product of the eigenvalues of $A$ is $\det(A)$.
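
A numerical companion to this answer (a sketch, assuming NumPy): `np.poly` returns the coefficients of the monic characteristic polynomial $p(\lambda) = |\lambda I - A|$, so its constant term $c_0$ should equal $(-1)^n |A|$.

```python
# Sketch: the constant term c0 of the monic characteristic polynomial
# equals (-1)^n * det(A).
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n))

coeffs = np.poly(A)                    # [1, c_{n-1}, ..., c_1, c_0]
assert np.isclose(coeffs[-1], (-1) ** n * np.linalg.det(A))
```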

$\endgroup$
3
  • 7
    $\begingroup$ This question has been answered for a year now. It is fine to add an answer after a long time, but don't you think it should be different from the other answers? In particular, the accepted answer explains the same argument, and it has proper math formatting. $\endgroup$ Commented Nov 5, 2014 at 18:41
  • 2
    $\begingroup$ @zarathustra your and others' complaint was only partly valid, as Joey had definitely given more algebraic detail on how to prove the theorem, whereas the accepted answer didn't use the actual $p(\lambda)$ for any deduction, and its claim of expanding the determinant along the diagonal was also questioned by commenters. This answer actually serves me well; had I read this one first, I'd have comfortably disregarded the accepted answer, but not the other way round. $\endgroup$
    – stucash
    Commented Feb 25, 2019 at 0:16
  • $\begingroup$ @zarathustra I don't think the other answers use the characteristic polynomial the way this answer does. Perhaps that's why I found logic in it :) $\endgroup$
    – emonHR
    Commented May 18, 2019 at 12:07
0
$\begingroup$

In a few places in this thread people raised issues like "what if $A$ doesn't have independent columns?" or "what if the determinant is $0$?". I believe the following are all equivalent:

  • $0$ is an eigenvalue of $A$
  • $A$ has linearly dependent columns (or rows)
  • $\det(A)=0$
  • $\prod_i \lambda_i = 0$ (the product of the eigenvalues of $A$ is $0$)

so we can take care of this issue by saying: "Suppose one of the above is true. Then $\det(A)=\prod_i \lambda_i = 0$; otherwise, proceed as in the other answers." (Note it is clear that $0$ being an eigenvalue forces the product of the eigenvalues to be $0$.)
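
A tiny sketch of these equivalences, assuming NumPy and an example matrix of my own with linearly dependent columns:

```python
# Sketch: dependent columns  =>  0 is an eigenvalue, det(A) = 0,
# and the product of the eigenvalues is 0.
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])             # second column = 2 * first column

eigvals = np.linalg.eigvals(A)         # eigenvalues 0 and 5
assert np.isclose(np.linalg.det(A), 0.0)
assert np.isclose(np.prod(eigvals), 0.0)
assert np.any(np.isclose(eigvals, 0.0))
```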

$\endgroup$
