Why is a matrix not invertible if its columns are not linearly independent?
I have watched this video about eigenvalues and eigenvectors by Sal from Khan Academy, where he says that for $\lambda$ to be an eigenvalue of the matrix $A$, the following must hold
$$A \cdot \vec{v} = \lambda \cdot \vec{v} \\ \vec{0} = \lambda \cdot \vec{v} - A \cdot \vec{v} \\ \vec{0} = \lambda \cdot I \cdot \vec{v} - A \cdot \vec{v} \quad (\text{since } \vec{v} = I \cdot \vec{v}) \\ \vec{0} = (\lambda \cdot I - A )\cdot \vec{v}$$
and, for a nonzero eigenvector $\vec{v}$ to exist, the determinant of $(\lambda \cdot I - A )$ must be $0$; in other words, $(\lambda \cdot I - A )$ is not invertible, its columns are linearly dependent, and its nullspace is nontrivial.
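For what it's worth, here is a small NumPy check I put together (the matrix $A$ below is my own example, not one from the video) that seems to illustrate all of these statements at once for a concrete eigenvalue:

```python
# Sketch: for an eigenvalue lam of A, check that (lam*I - A) is singular,
# has dependent columns (rank < n), and has a nontrivial nullspace.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# The eigenvalues of this particular A are 1 and 3.
lam = 3.0
M = lam * np.eye(2) - A          # the matrix (lam*I - A)

print(np.linalg.det(M))          # ~0.0: M is singular, i.e. not invertible
print(np.linalg.matrix_rank(M))  # 1: the two columns are linearly dependent

# A nonzero vector in the nullspace of M: the eigenvector v = (1, 1).
v = np.array([1.0, 1.0])
print(M @ v)                     # [0. 0.] -> (lam*I - A) v = 0
```

So in this example the same $\vec{v}$ shows up both as the eigenvector of $A$ and as a nonzero element of the nullspace of $(\lambda \cdot I - A)$, which I gather is exactly the connection being made.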
Could someone explain these statements to me in more detail? What is the relation between one statement and the others?
I understand some of it, but further clarification would help too.