
Let $J$ be a $k \times k$ Jordan block. Prove that any matrix which commutes with $J$ is a polynomial in $J$.

I would appreciate any hints. Thanks.

  • Quite a distance to the full result, which is more than you asked. Do the 2 by 2 and 3 by 3 cases by hand; you will learn something. Actually, the 1 by 1 case as well...
    – Will Jagy
    Commented Aug 18, 2014 at 19:03
  • Thanks. I can prove (case by case) that all powers of $J$ commute with $J$, but that does not prove anything... should I assume that some $2 \times 2$ matrix which is not a polynomial in $J$ does not commute with $J$?
    – the8thone
    Commented Aug 18, 2014 at 19:08
  • You will do what you want, I suppose. I suggest you write out a 2 by 2 Jordan block, call the eigenvalue $e$ or something, write a general 2 by 2 matrix with entries $a,b,c,d$, multiply in both orders, and see what conditions make them commute. Because of Cayley-Hamilton, for 2 by 2 any polynomial need only be linear, $M = A I + B J$.
    – Will Jagy
    Commented Aug 18, 2014 at 19:17
  • Let's see: for 3 by 3 there are 9 entries, maybe $a,b,c,d,e,f,g,h,i$, one eigenvalue, maybe $w$, and the polynomials to be considered need only be quadratic, $AI + B J + C J^2$. Anyway, I think you are a bit over your head, and hands-on manipulation of concrete examples, small enough to be done by a human being, is going to teach you more than me parroting some proof. The very best outcome is if the concrete examples lead you to your own proof.
    – Will Jagy
    Commented Aug 18, 2014 at 19:24
  • Another easy way to prove the result (depending on how much theory you know) is the fact that each Jordan block necessarily has a cyclic vector, since the minimal and characteristic polynomials of a Jordan block are equal. This in turn implies (via the cyclic vector theorem) that each matrix which commutes with $J$ is a polynomial in $J$.
    – EuYu
    Commented Aug 18, 2014 at 20:11

2 Answers


One direction of the equivalence is easy. For any polynomial $p$ and any square matrix $A$, $p(A)A=Ap(A)$.

For the other direction, we can assume that the $k\times k$ Jordan block $J_k$ has zeros on the diagonal. Indeed, any other Jordan block can be written in the form $\beta I + J_k$ and it is easy to see that $A$ commutes with $\beta I+J_k$ if and only if it commutes with $J_k$.
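(As an optional aside, both facts can be checked mechanically. Below is a minimal SymPy sketch, assuming the nilpotent block for $k=3$ and symbolic entries $a_{ij}$; the choice $k=3$ and the sample polynomial are arbitrary.)

```python
# Minimal SymPy sketch (k = 3 chosen arbitrarily): check that a polynomial in J
# commutes with J, and that the commutator with beta*I + J equals that with J.
import sympy as sp

k = 3
beta = sp.Symbol('beta')
J = sp.zeros(k, k)
for i in range(k - 1):
    J[i, i + 1] = 1                                      # nilpotent Jordan block

A = sp.Matrix(k, k, lambda i, j: sp.Symbol(f'a{i}{j}'))  # generic matrix

# Any polynomial in J commutes with J.
pJ = 2*sp.eye(k) + 3*J + 5*J**2
assert (pJ*J - J*pJ).expand() == sp.zeros(k, k)

# A*(beta*I + J) - (beta*I + J)*A reduces to A*J - J*A, so A commutes with
# beta*I + J if and only if it commutes with J.
comm_shifted = A*(beta*sp.eye(k) + J) - (beta*sp.eye(k) + J)*A
assert (comm_shifted - (A*J - J*A)).expand() == sp.zeros(k, k)
```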

Let $A=(a_{ij})$ be a $k\times k$ matrix and assume that $AJ_k-J_kA=0$. We first show that this implies that $A$ is an upper triangular (UT) and Toeplitz (T) matrix, proceeding by induction on $k$. The base case $k=1$ is trivial, since every $1\times 1$ matrix is UT&T. So assume that if a $(k-1)\times (k-1)$ matrix commutes with a $(k-1)\times (k-1)$ Jordan block, then it is UT&T. Let's write $J_k$ and $A$ in the partitioned form $$ A=\begin{bmatrix}A_{11}&a_{12}\\a_{21}^*&\alpha_{22}\end{bmatrix}, \quad J_k=\begin{bmatrix}J_{k-1}&e_{k-1}\\0&0\end{bmatrix}, $$ where $A_{11}$ and $J_{k-1}$ are $(k-1)\times(k-1)$ and $e_{k-1}$ is the last column of the $(k-1)\times(k-1)$ identity. Let's have a look at the commutator: $$ \begin{split} AJ_k-J_kA&=\begin{bmatrix}A_{11}&a_{12}\\a_{21}^*&\alpha_{22}\end{bmatrix}\begin{bmatrix}J_{k-1}&e_{k-1}\\0&0\end{bmatrix}-\begin{bmatrix}J_{k-1}&e_{k-1}\\0&0\end{bmatrix}\begin{bmatrix}A_{11}&a_{12}\\a_{21}^*&\alpha_{22}\end{bmatrix}\\ &=\begin{bmatrix}A_{11}J_{k-1}-J_{k-1}A_{11}-e_{k-1}a_{21}^*&A_{11}e_{k-1}-J_{k-1}a_{12}-\alpha_{22}e_{k-1}\\a_{21}^*J_{k-1}&a_{21}^*e_{k-1} \end{bmatrix}\\ &=\begin{bmatrix}A_{11}J_{k-1}-J_{k-1}A_{11}-e_{k-1}a_{21}^*&(A_{11}-\alpha_{22}I)e_{k-1}-J_{k-1}a_{12}\\a_{21}^*J_{k-1}&a_{21}^*e_{k-1} \end{bmatrix}. \end{split} $$
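(If you want to double-check the block formula above mechanically, here is a short SymPy sketch; the choice $k=4$ and the symbol names are arbitrary, and the slicing simply reproduces the partition $A_{11}$, $a_{12}$, $a_{21}^*$, $\alpha_{22}$.)

```python
# SymPy sketch (k = 4 chosen arbitrarily): the blockwise formula for A*J_k - J_k*A
# derived above agrees with a direct computation.
import sympy as sp

k = 4
Jk = sp.zeros(k, k)
for i in range(k - 1):
    Jk[i, i + 1] = 1                                      # nilpotent Jordan block J_k

A = sp.Matrix(k, k, lambda i, j: sp.Symbol(f'a{i}{j}'))
A11, a12 = A[:k-1, :k-1], A[:k-1, k-1]                    # blocks of A
a21T, al22 = A[k-1, :k-1], A[k-1, k-1]                    # a_21^* (row) and alpha_22
Jk1, e = Jk[:k-1, :k-1], Jk[:k-1, k-1]                    # J_{k-1} and e_{k-1}

block = sp.Matrix([[A11*Jk1 - Jk1*A11 - e*a21T, (A11 - al22*sp.eye(k-1))*e - Jk1*a12],
                   [a21T*Jk1,                   a21T*e]])
assert (A*Jk - Jk*A - block).expand() == sp.zeros(k, k)
```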

Since $AJ_k-J_kA=0$ by assumption, the two blocks in the last block row give $$ a_{21}^*e_{k-1}=0, \quad a_{21}^*J_{k-1}=0. $$ The first equation says that the last entry of $a_{21}$ is zero, while the second says that the first $k-2$ entries of $a_{21}$ are zero. Therefore $a_{21}=0$, and once we know that $A_{11}$ is UT (which follows from the induction assumption in the next step), $A$ itself is UT.

The first block of $AJ_k-J_kA=0$ gives $$ A_{11}J_{k-1}-J_{k-1}A_{11}-e_{k-1}a_{21}^*=A_{11}J_{k-1}-J_{k-1}A_{11}=0, $$ since we already showed that $a_{21}=0$. Hence $A_{11}$ commutes with $J_{k-1}$, so $A_{11}$ is UT&T by the induction assumption.

It remains to show that if $A_{11}$ is T, then $(A_{11}-\alpha_{22}I)e_{k-1}-J_{k-1}a_{12}=0$ forces $A$ to be T. We have $$ (A_{11}-\alpha_{22}I)e_{k-1} = \begin{bmatrix} a_{1,k-1}\\ \vdots\\ a_{k-2,k-1}\\ a_{k-1,k-1}-\alpha_{22} \end{bmatrix}, \quad J_{k-1}a_{12}=\begin{bmatrix} a_{2,k}\\ \vdots\\ a_{k-1,k}\\ 0 \end{bmatrix}. $$ Since these two vectors are equal, we get $a_{i,k-1}=a_{i+1,k}$ for $i=1,\ldots,k-2$ and $\alpha_{22}=a_{k-1,k-1}$. Since $A_{11}$ is UT&T, the diagonal of $A_{11}$ is constant, so the diagonal of $A$ is constant as well, and the last $k-2$ entries of $a_{12}$ are "copies" of the first $k-2$ entries of the last column of $A_{11}$. Therefore, $A$ is T.
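(Another optional sanity check: the SymPy sketch below, with $k=4$ chosen for concreteness, solves $AJ_k-J_kA=0$ symbolically and verifies that every solution is upper triangular and Toeplitz, in line with the argument above.)

```python
# SymPy sketch (k = 4 chosen arbitrarily): solve A*J - J*A = 0 and check that the
# general solution is upper triangular and Toeplitz.
import sympy as sp

k = 4
J = sp.zeros(k, k)
for i in range(k - 1):
    J[i, i + 1] = 1                              # nilpotent Jordan block

entries = sp.symbols(f'a0:{k*k}')
A = sp.Matrix(k, k, entries)

sol = sp.solve(list(A*J - J*A), entries, dict=True)[0]
Asol = A.subs(sol)

# Upper triangular: everything strictly below the diagonal vanishes.
assert all(sp.simplify(Asol[i, j]) == 0 for i in range(k) for j in range(i))
# Toeplitz: entries are constant along every diagonal.
assert all(sp.simplify(Asol[i, j] - Asol[i + 1, j + 1]) == 0
           for i in range(k - 1) for j in range(k - 1))
```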

Summarizing, if $A$ and $J_k$ commute, then $A$ is a UT&T matrix, and consequently it can be written in the form $$ A=\begin{bmatrix} a_1 & a_2 & \ldots & a_k \\ & a_1 & \ldots & a_{k-1} \\ & & \ddots & \vdots \\ & & & a_1 \end{bmatrix}. $$ Recalling what the powers of $J_k$ look like ($J_k^m$ has ones on the $m$-th superdiagonal and zeros elsewhere), it is easy to see that $A$ can be written as $$ A=a_1 J_k^0 + a_2 J_k^1 + \ldots + a_k J_k^{k-1} =: p(J_k), \quad p(t)=a_1+a_2t +\ldots+a_k t^{k-1}. $$

Q.E.D.
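(One last optional check: the SymPy sketch below, again with $k=4$ and symbols $a_1,\dots,a_k$ of my choosing, builds the UT&T matrix above and confirms that it equals $a_1 J_k^0 + a_2 J_k^1 + \ldots + a_k J_k^{k-1}$.)

```python
# SymPy sketch (k = 4 chosen arbitrarily): an upper-triangular Toeplitz matrix is
# exactly a polynomial of degree < k in the nilpotent Jordan block J_k.
import sympy as sp

k = 4
a = sp.symbols(f'a1:{k + 1}')                    # a1, ..., ak
J = sp.zeros(k, k)
for i in range(k - 1):
    J[i, i + 1] = 1                              # J**m has ones on the m-th superdiagonal

A = sp.Matrix(k, k, lambda i, j: a[j - i] if j >= i else 0)   # UT & Toeplitz
P = sum((a[m] * J**m for m in range(k)), sp.zeros(k, k))      # a1*I + a2*J + ... + ak*J^(k-1)
assert (A - P).expand() == sp.zeros(k, k)
```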


This is the 2 by 2 case only. I really hope you will try the 3 by 3 case by hand, analogous to this:

Alright, take eigenvalue $w$: $$ J = \left( \begin{array}{rr} w & 1 \\ 0 & w \end{array} \right), $$ and a trial matrix $$ M = \left( \begin{array}{rr} a & b \\ c & d \end{array} \right). $$

Next $$ JM = \left( \begin{array}{rr} wa + c & wb+d \\ wc & wd \end{array} \right), $$ but $$ MJ = \left( \begin{array}{rr} wa & wb+a \\ wc & wd+ c \end{array} \right). $$

We find that $JM = MJ$ precisely when $$ c = 0 \; \; \mbox{AND} \; \; a=d. $$ Notice that $w$ does not appear. Under these conditions, $$ M = \left( \begin{array}{rr} a & b \\ 0 & a \end{array} \right). $$ That means that $$ M = (a-bw)I + b J, $$ which is a polynomial in $J$.
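(A minimal SymPy check of this computation, reusing the symbols $w, a, b$ from above:)

```python
# SymPy check of the 2x2 case: M = [[a, b], [0, a]] commutes with J and equals the
# polynomial (a - b*w)*I + b*J.
import sympy as sp

w, a, b = sp.symbols('w a b')
J = sp.Matrix([[w, 1], [0, w]])
M = sp.Matrix([[a, b], [0, a]])                  # the commuting matrices found above

assert (J*M - M*J).expand() == sp.zeros(2, 2)
assert (M - ((a - b*w)*sp.eye(2) + b*J)).expand() == sp.zeros(2, 2)
```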
