57
$\begingroup$

Just wanted some input to see if my proof is satisfactory or if it needs some cleaning up.

Here is what I have.


Proof

Suppose $A$ is a square, invertible matrix and, for the sake of contradiction, let $0$ be an eigenvalue. Consider $(A-\lambda I)\cdot v = 0$ with $\lambda=0$: $$\Rightarrow (A- 0\cdot I)v=0$$
$$\Rightarrow(A-0)v=0$$
$$\Rightarrow Av=0$$

We know $A$ is invertible, and in order for $Av = 0$, $v = 0$; but $v$ must be non-trivial so that $\det(A-\lambda I) = 0$. Here lies our contradiction. Hence, $0$ cannot be an eigenvalue.

Revised Proof

Suppose $A$ is a square matrix and has an eigenvalue of $0$. For the sake of contradiction, let us assume $A$ is invertible.

Consider $Av = \lambda v$ with $\lambda = 0$: this means there exists a non-zero $v$ such that $Av = 0v$, i.e., $Av = 0$.

For an invertible matrix $A$, $Av = 0$ implies $v = 0$. So $Av = 0 = A\cdot 0$, and since $v$ cannot be $0$, this means $A$ is not one-to-one. Here lies our contradiction: $A$ must not be invertible.
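As a sanity check of the statement itself (not part of the proof), here is a small NumPy experiment; the matrix below is a made-up example whose rows are dependent, so $0$ is one of its eigenvalues:

```python
import numpy as np

# Made-up example: the second row is twice the first, so 0 is an eigenvalue.
A = np.array([[2.0, 1.0],
              [4.0, 2.0]])

print(np.linalg.eigvals(A))      # one eigenvalue is (numerically) 0, the other 4
print(np.linalg.det(A))          # (numerically) 0

try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as err:
    print("no inverse:", err)    # NumPy reports the matrix as singular
```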

$\endgroup$
2
  • $\begingroup$ Thanks for all the input everyone. $\endgroup$
    – Derrick J.
    Commented Apr 16, 2014 at 3:56
  • $\begingroup$ Added [proof-verification] tag. That or [solution-verification] is used for requests to check correctness of an argument. $\endgroup$
    – zyx
    Commented Apr 17, 2014 at 1:23

10 Answers

35
$\begingroup$

Your proof is correct. In fact, a square matrix $A$ is invertible if and only if $0$ is not an eigenvalue of $A$. (You can replace all logical implications in your proof by logical equivalences.)

Hope this helps!

$\endgroup$
4
  • 2
    $\begingroup$ But this answer is not quite correct. While the implications in the question can be made into equivalences to show that "$A$ has $0$ as eigenvalue" is equivalent to "there exists a nonzero $v$ with $Av=0$" (in fact I would say that is what the definition of eigenvalue amounts to), the latter means "(the linear map with matrix) $A$ is not injective" which is not the same as "$A$ is not invertible". It is true that for square matrices the linear map is injective if and only if $A$ is invertible, but this is a theorem whose proof is a bit more complicated. $\endgroup$ Commented Apr 16, 2014 at 8:10
  • 2
    $\begingroup$ Dear @Marc, the hypothesis of the question is that $A$ is a square matrix. I guess you could argue that "for square matrices $A$, invertibility is equivalent to injectivity (as a linear operator)" is non-trivial but that doesn't invalidate my answer. Anyway, when I'm used to thinking about something for a long time, my brain identifies logically (but non-trivially) equivalent statements as literally the same :) $\endgroup$ Commented Apr 16, 2014 at 8:50
  • 1
    $\begingroup$ The 1-by-1 matrix over $\mathbb{R}\lbrack x\rbrack$ whose entry is $x$, and the $\mathbb{N}$-by-$\mathbb{N}$ matrix with ones on the subdiagonal and zeros everywhere else, are both non-invertible matrices that do not have 0 as an eigenvalue. $\endgroup$
    – user57159
    Commented Apr 17, 2014 at 0:09
  • 1
    $\begingroup$ Dear @Ricky, thanks for your comment! It was implicit that we were referring to square matrices with entries in a field throughout the above discussion. I agree that injective self-maps need not be bijective in general! Infinite-dimensional math is hard :) $\endgroup$ Commented Apr 17, 2014 at 1:24
26
$\begingroup$

This looks okay. Initially, we have that $Av=\lambda v$ for eigenvalues $\lambda$ of $A$. Since $\lambda=0$, we have that $Av=0$. Now assume that $A^{-1}$ exists.

Now by multiplying on the left by $A^{-1}$, we get $v=0$. This is a contradiction, since $v$ cannot be the zero vector. So, $A^{-1}$ does not exist.
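Spelled out, that multiplication step is just the chain of equalities (using associativity and $A^{-1}A=I$): $$v = Iv = (A^{-1}A)v = A^{-1}(Av) = A^{-1}0 = 0.$$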

$\endgroup$
0
18
$\begingroup$

If $\lambda_1,\dotsc,\lambda_n$ are the (not necessarily distinct) eigenvalues of an $n\times n$ matrix $A$, then $$ \det(A)=\lambda_1\dotsb\lambda_n\tag{1} $$ A nice proof of this fact can be found here.

Now, $A$ is invertible if and only if $\det(A)\neq0$. Hence $(1)$ implies $A$ is invertible if and only if $0$ is not an eigenvalue of $A$.
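As an illustration (a made-up $3\times3$ matrix, not part of the argument), NumPy confirms $(1)$ numerically:

```python
import numpy as np

# Made-up symmetric example; its eigenvalues happen to be real, but the
# identity det(A) = product of eigenvalues holds (up to rounding) in general.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

eigenvalues = np.linalg.eigvals(A)
print(np.prod(eigenvalues))   # product of the eigenvalues
print(np.linalg.det(A))       # determinant: the same value up to rounding
```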

$\endgroup$
6
$\begingroup$

Let $A$ have an eigenvalue of $0$. By definition, there is a non-trivial vector $v$ such that $Av = 0v = 0$. Let $B$ be any matrix. Then $BA$ is not the identity matrix, because $(BA)v = B (Av) = B0 = 0.$ This is true for any matrix $B$, so $A$ is not invertible.

$\endgroup$
4
$\begingroup$

Your proof works, but I think it could be cleaned up a bit. For example the sentence "We know A is an invertible and in order for A v=0; v=0" could be stated "For an invertible matrix $A$, $Av=0 \Rightarrow v=0$". You may need to justify that claim, perhaps by referring to an earlier result or just by pointing out that the map $v \mapsto Av$ is injective and $A0 = 0$.

Here's what I consider a clean proof from first principles. I use a very slightly different definition of an eigenvalue from yours, $Av = \lambda v$ rather than your $(A - \lambda I)v = 0$, but you've already proved that yours implies mine in your answer. I've also assumed basic matrix/vector arithmetic without remarking on it: associativity, the behaviour of the zero vector, multiplication of a vector by a 0 scalar. All that stuff generally precedes the definition of eigenvalues!


Suppose $0$ is an eigenvalue of $A$. That is, $\exists v(v\neq0 \wedge Av = 0)$

Then for any matrix $B$,

$(BA)v = B(Av) = B0 = 0$

that is, $(BA)v = 0 \neq v$ (since $v \neq 0$),

so $BA$ is not the identity, so $B$ is not the inverse of $A$.

Hence $A$ is not invertible.
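A small numerical illustration of the same argument (hypothetical matrices; the random $B$ stands in for "any matrix $B$"):

```python
import numpy as np

rng = np.random.default_rng(0)

# A is a made-up matrix with 0 as an eigenvalue: v = (2, -1) satisfies A @ v = 0.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
v = np.array([2.0, -1.0])
print(A @ v)                            # the zero vector

# For an arbitrary B, (BA)v = B(Av) = B0 = 0, so BA cannot be the identity.
B = rng.standard_normal((2, 2))
print((B @ A) @ v)                      # the zero vector again
print(np.allclose(B @ A, np.eye(2)))    # False
```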

$\endgroup$
4
$\begingroup$

Your work is fine, but here's a little more to think about while moving forward in this subject.

Multiplication by a square $n \times n$ matrix produces a linear transformation (e.g., a rotation, reflection, or scaling) in $n$ dimensions. If an eigenvalue is $0$, then along the corresponding eigenvector the transformation scales the image down to zero, i.e., it multiplies all lengths along that direction by $0$.

Does multiplication by $0$ have an inverse?

$\endgroup$
4
$\begingroup$

Not sure how picky your Prof is about proofs, but I think your initial statement is backwards. The question is asking whether $A$ is invertible given that it has an eigenvalue of $0$. So you are trying to prove: "If a square matrix $A$ has an eigenvalue of $0$, then $A$ is NOT invertible." Thus, for the sake of contradiction you want to assume that $A$ is invertible rather than assuming that $0$ is an eigenvalue; we are given that $A$ has an eigenvalue of $0$, so that is not what you are assuming for the sake of contradiction. Then at the end say something like: "Thus, $0$ is not an eigenvalue of $A$. But if $0$ is not an eigenvalue of $A$, then the assumption '$A$ is invertible' is false, so $A$ is NOT invertible. Q.E.D."

$\endgroup$
0
3
$\begingroup$

Your way is basically along the lines of how I would do it:

Let $A$ be an $n \times n$ matrix, and assume for the sake of contradiction that $A$ is invertible. By your work, since $0$ is an eigenvalue of $A$, then we know there exists a nonzero vector $\mathbf{v}$ such that $A\mathbf{v} = 0$.

$\Rightarrow \dim(N(A)) > 0$, where $N(A)$ represents the null space of $A$.

Remember that, by the rank-nullity theorem, $\dim(N(A)) + \operatorname{rank}(A) = n$.

Since $\dim(N(A)) > 0$, we have $\operatorname{rank}(A) < n$. Herein lies the contradiction: an $n \times n$ matrix is invertible $\iff$ its rank is $n$.
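A numerical illustration of this rank-nullity argument (the matrix is a made-up example with $0$ as an eigenvalue):

```python
import numpy as np

# Made-up 3x3 matrix whose second row is twice the first, so 0 is an eigenvalue.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 1.0, 1.0]])
n = A.shape[0]

rank = np.linalg.matrix_rank(A)
print("rank:", rank, "nullity:", n - rank)   # rank 2, nullity 1

# A nonzero null-space vector comes from the SVD: the right singular vector
# belonging to the (numerically) zero singular value satisfies A @ v ~ 0.
_, s, Vt = np.linalg.svd(A)
v = Vt[-1]
print(np.allclose(A @ v, 0))   # True: dim N(A) > 0
print(rank < n)                # True: rank deficient, so A is not invertible
```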

$\endgroup$
2
$\begingroup$

Your proof that "$A$ has an eigenvalue $0$" implies "$A$ is not invertible" is basically correct. I find it a bit hard to read though, and I really don't see why one should invoke a determinant at the end. I think you could streamline it as follows.

  • $A$ has an eigenvalue $0$ means there exists a nonzero vector $v$ with $A\cdot v=0v$;
  • then $A\cdot v=0=A\cdot 0$,
  • which shows (since $v\neq0$) that $A$ is not injective (one-to-one), so it is certainly not invertible.
$\endgroup$
2
$\begingroup$

Suppose that $0$ is an eigenvalue of $A$.

Consider $J$, the Jordan canonical form of $A$: it is a triangular matrix with the eigenvalues of $A$ on the diagonal, and it has the same determinant as $A$.

It is known that the determinant of a triangular matrix is the product of the entries on the diagonal, and we have supposed that one of them is $0$, so it is clear that $\det(A)=\det(J)=0$; hence $A$ is not invertible.
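If you want to see this on a concrete (made-up) example, SymPy can compute a Jordan form symbolically:

```python
from sympy import Matrix

# Made-up matrix with 0 as an eigenvalue (its rows are linearly dependent).
A = Matrix([[2, 2],
            [1, 1]])

P, J = A.jordan_form()      # A = P * J * P**-1, with J (upper) triangular
print(J)                    # the eigenvalues 0 and 3 appear on the diagonal
print(A.det(), J.det())     # both determinants are 0, so A is not invertible
```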

$\endgroup$
