
The question is: The square matrix $A$ satisfies $p(A) = 0$, where $p(x)$ is a polynomial such that $p(0) \ne 0$. Show that $A$ is invertible.

I'm lost; I don't know whether there's something more I have to learn before I can do this. I've gotten this far (I'm most likely not on the right track):

$$ p(A) = a_0I+a_1A+a_2A^2+\ldots+a_nA^n $$ $$ p(0) = a_0+(a_1\cdot 0)+(a_2\cdot 0^2)+\ldots+(a_n\cdot 0^n) = a_0 $$ $$ p(A) = p(0)\,I+a_1A+a_2A^2+\ldots+a_nA^n $$

I don't quite know how to go further. I know that if the system $AX=B$, where $A$ is a square matrix and $B$ is a column vector, has exactly one solution $X$ for every $B$, then $A$ is invertible.

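(For concreteness, here is a small numerical sketch of the difference between $p(A)$, where the constant term multiplies the identity, and $p(0)$, which is just the scalar $a_0$. The matrix and coefficients below are made up purely for illustration, chosen so that $p(A)=0$ and $p(0)\neq 0$ as in the problem.)

```python
import numpy as np

# Made-up illustration: A and the coefficients a are not from the problem,
# just chosen so that p(A) = 0 while p(0) != 0, matching the hypotheses.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
a = [6.0, -5.0, 1.0]   # p(x) = 6 - 5x + x^2; a[i] is the coefficient of x^i

# p(A) = a_0*I + a_1*A + a_2*A^2 + ... : the constant term multiplies I.
p_of_A = sum(c * np.linalg.matrix_power(A, i) for i, c in enumerate(a))
print(p_of_A)          # the zero matrix for this particular A and p

# p(0) is just the scalar a_0, not a_0*I.
print(a[0])            # 6.0, which is nonzero
```
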
  • @A.G. No, but I'm going to google it.
    – E.Bob
    Commented Nov 9, 2016 at 20:14
  • Forget it, the answer below is much better.
    – A.Γ.
    Commented Nov 9, 2016 at 20:15

5 Answers

If $p(0)$ is nonzero then $a_0$ is nonzero. Since $p(A)=a_0I+a_1A+\dots+a_nA^n=0$, dividing by $a_0$ and rearranging gives $$I=-\sum_{i=1}^n\frac{a_i}{a_0}A^i=-A\sum_{i=0}^{n-1}\frac{a_{i+1}}{a_0}A^i,$$ so $A$ is invertible with $A^{-1}=-\sum_{i=0}^{n-1}\frac{a_{i+1}}{a_0}A^i$.

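For readers who like to see a formula checked numerically, here is a minimal sketch (the matrix and polynomial are made-up assumptions, chosen so that $p(A)=0$ holds): the sum $-\sum_{i=0}^{n-1}\frac{a_{i+1}}{a_0}A^i$ really does reproduce $A^{-1}$.

```python
import numpy as np

# Made-up example: A satisfies p(A) = 0 for p(x) = 6 - 5x + x^2
# (its characteristic polynomial), and p(0) = 6 != 0.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
a = [6.0, -5.0, 1.0]   # a[i] is the coefficient of x^i; note a[0] = p(0) != 0
n = len(a) - 1

# Candidate inverse from the answer: -sum_{i=0}^{n-1} (a_{i+1}/a_0) * A^i
A_inv = -sum(a[i + 1] / a[0] * np.linalg.matrix_power(A, i) for i in range(n))

print(np.allclose(A_inv @ A, np.eye(2)))     # True
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```
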
  • Shouldn't that be $a_{i+1}$ in your last fraction?
    – Malloc
    Commented Nov 10, 2016 at 5:46
  • It should indeed! Thank you!
    – C. Falcon
    Commented Nov 10, 2016 at 6:36

There's a simple way to do this without manipulating an expansion of the polynomial, or knowing anything about determinants or characteristic polynomials.

First, recall that $A$ is non-invertible if and only if there exists some non-zero vector $\bf{x}$ such that $A{\bf x} = {\bf 0}$. Suppose that there exists such a vector.

Since $p(A)$ is a sum of matrices ($a_n A^n + \dots + a_1 A + a_0 I$), we may compute ${\bf x}^T p(A) {\bf x}$. The fact that $A {\bf x} = {\bf 0}$ implies that $A^k {\bf x} = {\bf 0}$, and hence ${\bf x}^T A^k {\bf x} = 0$, for every $k>0$, so we have ${\bf x}^T p(A) {\bf x} = {\bf x}^T (a_0 I) {\bf x} = p(0) \left| {\bf x} \right|^2$. Since we are told that $p(A) = 0$, we have $p(0) \left| {\bf x} \right|^2 = 0$. Since ${\bf x}$ is non-zero, we must have $p(0) = 0$, contradicting the assumption that $p(0) \ne 0$.

To summarise: we are given that $p(A) = 0$ and that $p(0) \ne 0$. If $A$ is not an invertible matrix, then the argument of the previous two paragraphs shows that the conditions $p(A) = 0$ and $p(0) \ne 0$ cannot both hold. So $A$ must be invertible.

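Here is a small numerical sketch of the key identity (the singular matrix, null vector, and polynomial coefficients below are made up for illustration): when $A{\bf x}={\bf 0}$, the quantity ${\bf x}^T p(A){\bf x}$ collapses to $p(0)\left|{\bf x}\right|^2$.

```python
import numpy as np

# Made-up singular matrix and a vector in its null space (A @ x = 0).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
x = np.array([2.0, -1.0])

# An arbitrary polynomial with p(0) != 0 (illustrative coefficients).
a = [3.0, -1.0, 2.0]   # p(t) = 3 - t + 2t^2, so p(0) = 3

p_of_A = sum(c * np.linalg.matrix_power(A, i) for i, c in enumerate(a))

lhs = x @ p_of_A @ x          # x^T p(A) x
rhs = a[0] * np.dot(x, x)     # p(0) * |x|^2

print(np.isclose(lhs, rhs))   # True: the higher powers of A contribute nothing
```
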
  • I really like this answer. It's always cool to come up with a solution to a problem that requires a minimum amount of vocabulary (or in this case, "math tricks"). This is much more akin to the first solution you would come up with as you're learning this material. Commented Nov 10, 2016 at 9:46

Since $p(A)=0$, every eigenvalue of $A$ is a root of $p(x)$: if $Av=\lambda v$ with $v\neq 0$, then $0=p(A)v=p(\lambda)v$, so $p(\lambda)=0$. Now $p(0)\neq 0$ implies that $0$ is not a root of $p(x)$, which in turn says that $0$ is not an eigenvalue of $A$. As $\det(A)$ equals the product of the eigenvalues, this gives $\det(A)\neq 0$, which means $A$ is invertible.

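As a quick numerical illustration (with a made-up matrix, and with $p$ taken to be the characteristic polynomial, which does satisfy $p(A)=0$ by Cayley–Hamilton): every eigenvalue of $A$ is a root of $p$, none of them is $0$, and their product is $\det(A)\neq 0$.

```python
import numpy as np

# Made-up example matrix; np.poly returns the characteristic polynomial,
# which annihilates A by Cayley-Hamilton.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
coeffs = np.poly(A)                        # highest-degree coefficient first

eigvals = np.linalg.eigvals(A)
print(np.polyval(coeffs, eigvals))         # ~[0, 0]: every eigenvalue is a root of p
print(np.any(np.isclose(eigvals, 0)))      # False: 0 is not an eigenvalue
print(np.prod(eigvals), np.linalg.det(A))  # both ~6.0: det(A) = product of eigenvalues
```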

Well, if $p(x)=a_n x^n + a_{n-1} x^{n-1} + \cdots + a_0$ is the characteristic polynomial, then it is known from theory that the constant term satisfies $a_0 = p(0) = \pm\det A$, i.e. it is the determinant of the matrix up to sign. Since $p(0)=a_0 \neq 0$, this means $\det A \neq 0$. Thus $A$ is invertible.

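To illustrate the claim about the constant term (under this answer's assumption that $p$ is the characteristic polynomial; the matrix below is a made-up example): the constant coefficient of $\det(xI-A)$ equals $(-1)^n\det A$, so it is $\det A$ up to sign, and it is nonzero exactly when $A$ is invertible.

```python
import numpy as np

# Made-up 3x3 example: for the characteristic polynomial det(xI - A),
# the constant coefficient a_0 = p(0) equals (-1)^n * det(A).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 1.0]])
coeffs = np.poly(A)              # [1, ..., a_1, a_0], highest power first

a0 = coeffs[-1]
print(a0, np.linalg.det(A))      # ~-6.0 and ~6.0: a_0 = (-1)^3 * det(A)
print(not np.isclose(a0, 0))     # True, so det(A) != 0 and A is invertible
```
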
  • I haven't learned determinants yet, but it looks easier to understand by doing it that way.
    – E.Bob
    Commented Nov 9, 2016 at 20:51
  • $p(x)$ is not necessarily the characteristic polynomial, in this exercise. Commented Nov 9, 2016 at 23:16

All eigenvalues of $A$ must be roots of $p$.

(If $x$ is an eigenvector of $A$ with eigenvalue $\lambda$, then $p(A)x = p(\lambda)x = 0$, so $p(\lambda)=0$.)

Since $p(0)\neq 0$, no eigenvalue of $A$ is zero $\Rightarrow$ $\det A$ is not zero $\Rightarrow$ $A$ is invertible.

  • I haven't learned eigenvalues and eigenvectors yet, but thanks for the answer; it'll be useful in the future.
    – E.Bob
    Commented Nov 9, 2016 at 20:52
