
Why is a matrix not invertible if its columns are not linearly independent?

I watched this video about eigenvalues and eigenvectors by Sal from Khan Academy, where he says that for $\lambda$ to be an eigenvalue of the matrix $A$, the following must hold:

$$A \cdot \vec{v} = \lambda \cdot \vec{v} \\ \vec{0} = \lambda \cdot \vec{v} - A \cdot \vec{v} \\ \vec{0} = \lambda \cdot I \cdot \vec{v} - A \cdot \vec{v} \\ \vec{0} = (\lambda \cdot I - A )\cdot \vec{v}$$

and that the determinant of $(\lambda \cdot I - A )$ must be $0$; in other words, $(\lambda \cdot I - A )$ is not invertible, or equivalently the columns of $(\lambda \cdot I - A )$ are linearly dependent, or equivalently the nullspace of $(\lambda \cdot I - A )$ is nontrivial.
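For concreteness, here is a small example I worked out myself (it is not from the video): take $A = \begin{bmatrix} 2 & 1 \\ 0 & 3 \end{bmatrix}$ and $\lambda = 2$. Then

$$\lambda \cdot I - A = \begin{bmatrix} 0 & -1 \\ 0 & -1 \end{bmatrix}, \qquad \det(\lambda \cdot I - A) = 0 \cdot (-1) - (-1) \cdot 0 = 0,$$

the columns $\begin{bmatrix} 0 \\ 0 \end{bmatrix}$ and $\begin{bmatrix} -1 \\ -1 \end{bmatrix}$ are linearly dependent, and the nonzero vector $\vec{v} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ lies in the nullspace, since $(\lambda \cdot I - A) \cdot \vec{v} = \vec{0}$ and indeed $A \cdot \vec{v} = 2 \cdot \vec{v}$.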

Could someone explain these statements to me in more detail? How does each statement relate to the others?

I understood some of it, but further clarification would help.


4 Answers


Here's an answer that completely avoids determinants. Determinants are heinously overrated.

Let $\vec v_1,\vec v_2,\dotsc,\vec v_n$ be the columns of a matrix $A$. That is, $$ A= \begin{bmatrix} \vec v_1 & \vec v_2 & \dotsb & \vec v_n \end{bmatrix} $$ Now, suppose the columns of $A$ are not linearly independent. Then there exist scalars $\lambda_1,\lambda_2,\dotsc,\lambda_n$ not all zero such that $$ \lambda_1\vec v_1+\lambda_2\vec v_2+\dotsb+\lambda_n\vec v_n=\vec 0\tag{1} $$ But (1) may be re-written in matrix form as $$ \begin{bmatrix} \vec v_1 & \vec v_2 & \dotsb & \vec v_n \end{bmatrix} \begin{bmatrix} \lambda_1\\ \lambda_2 \\ \vdots\\ \lambda_n \end{bmatrix} =\vec 0 $$ Putting $$ \vec \lambda= \begin{bmatrix} \lambda_1\\ \lambda_2 \\ \vdots\\ \lambda_n \end{bmatrix} $$ then gives $A\vec\lambda=\vec 0$ where $\vec \lambda\neq\vec 0$. Hence $A$ has a nontrivial nullspace and is thus not invertible.
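To make this concrete, here is a small example (my own illustration, not part of the argument above): take

$$A = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}, \qquad \vec v_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \quad \vec v_2 = \begin{bmatrix} 2 \\ 4 \end{bmatrix}.$$

The columns satisfy $2\vec v_1 - \vec v_2 = \vec 0$, so $\vec\lambda = \begin{bmatrix} 2 \\ -1 \end{bmatrix} \neq \vec 0$ and

$$A\vec\lambda = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix} \begin{bmatrix} 2 \\ -1 \end{bmatrix} = \begin{bmatrix} 2 - 2 \\ 4 - 4 \end{bmatrix} = \vec 0.$$

If $A$ were invertible, the only solution of $A\vec x = \vec 0$ would be $\vec x = \vec 0$, so $A$ cannot be invertible.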


If the columns of a matrix are linearly dependent, then the rows of its transpose are linearly dependent, so the determinant of the transpose is zero. Since a matrix and its transpose have the same determinant, the determinant of the given matrix is also zero. A matrix is invertible only if its determinant is non-zero.
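For instance (an illustration of my own): the matrix $A = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}$ has a second column that is twice the first, and indeed

$$\det A = 1 \cdot 4 - 2 \cdot 2 = 0, \qquad \det A^{T} = 1 \cdot 4 - 2 \cdot 2 = 0,$$

so $A$ is not invertible.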


Suppose $A$ is an $n \times n$ invertible real matrix with columns $a_1, \dotsc, a_n$, and let $x \in \mathbb{R}^n$. The vector $A^{-1}x$ gives the unique representation of $x$ as a linear combination of the columns of $A$; specifically, $x=\sum_{i=1}^n (A^{-1} x)_i a_i$. If the columns are not linearly independent, then no such representation can be unique: given $\sum_{i=1}^n b_i a_i = 0$ (where at least one $b_i$ is nonzero) and $\sum_{i=1}^n c_i a_i = x$, you also have $\sum_{i=1}^n (b_i+c_i) a_i = x$, with $b_i+c_i \neq c_i$ for at least one $i$. So there are always at least two representations, and $A$ cannot be invertible.

In other words:

  1. An $n \times n$ real matrix $A$ is invertible if and only if every $x \in \mathbb{R}^n$ has a unique representation as a linear combination of the columns of $A$.
  2. This representation cannot be unique if zero can be nontrivially represented as a linear combination of the columns of $A$, i.e. if the columns of $A$ are not linearly independent (a concrete instance is given below).
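As a concrete instance (my own, not part of the argument above): let $A = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}$ with columns $a_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$, $a_2 = \begin{bmatrix} 2 \\ 4 \end{bmatrix}$, and take $x = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$. Then

$$1 \cdot a_1 + 0 \cdot a_2 = x \qquad \text{and} \qquad (-1) \cdot a_1 + 1 \cdot a_2 = x,$$

two different representations of the same $x$; their difference is the nontrivial representation of zero, $(-2) \cdot a_1 + 1 \cdot a_2 = 0$. So $A$ cannot be invertible.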

If $A-\lambda \cdot I$ is invertible, then the only solution to the equation $(A-\lambda \cdot I) \cdot \vec{v} = \vec{0}$ is $\vec{v} = \vec{0}$.

Indeed, suppose $(A-\lambda \cdot I)^{-1}$ exists. Multiplying both sides by the inverse gives:

$$(A-\lambda \cdot I)^{-1} \cdot (A-\lambda \cdot I) \cdot \vec{v} = (A-\lambda \cdot I)^{-1} \cdot \vec{0}$$

$$I \cdot \vec{v} = (A-\lambda \cdot I)^{-1} \cdot \vec{0}$$

$$\vec{v} = (A-\lambda \cdot I)^{-1} \cdot \vec{0}$$

$$\vec{v} = \vec{0}$$

But eigenvectors are non-zero, so one must require: $$\det(A-\lambda \cdot I) = 0$$ i.e. the matrix $A-\lambda \cdot I$ is not invertible.
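As a small check (my own example, not part of the original argument): with $A = \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix}$ and $\lambda = 2$,

$$A - \lambda \cdot I = \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}, \qquad \det(A - \lambda \cdot I) = 0 \cdot 1 - 0 \cdot 0 = 0,$$

so $A - \lambda \cdot I$ is not invertible, and the nonzero vector $\vec{v} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ satisfies $(A - \lambda \cdot I) \cdot \vec{v} = \vec{0}$, i.e. $A \cdot \vec{v} = 2 \cdot \vec{v}$: an eigenvector exists precisely because the matrix fails to be invertible.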

    Assume that $A-\lambda I$ is invertible and let $B$ be its inverse; then: $$ (A-\lambda I)\vec{v} = \vec{0} \Leftrightarrow \vec{v} = B \vec{0} = \vec{0}.$$
    – Joelafrite
    Commented May 16, 2015 at 22:28
