146
$\begingroup$

I know that matrix multiplication in general is not commutative. So, in general:

$A, B \in \mathbb{R}^{n \times n}: A \cdot B \neq B \cdot A$

But for some matrices this equation holds, e.g. for $A = I$ (the identity) or $A = 0$ (the zero matrix) and all $B \in \mathbb{R}^{n \times n}$.

I think I remember that a group of special matrices exists (was it $O(n)$, the group of orthogonal matrices?) for which matrix multiplication is commutative.

For which matrices $A, B \in \mathbb{R}^{n \times n}$ is $A\cdot B = B \cdot A$?
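
For concreteness, a minimal NumPy check of a non-commuting pair (any two generic matrices will do):

    import numpy as np

    A = np.array([[1, 2],
                  [3, 4]])
    B = np.array([[0, 1],
                  [1, 0]])   # swap matrix

    print(A @ B)  # [[2 1] [4 3]]: B on the right swaps the columns of A
    print(B @ A)  # [[3 4] [1 2]]: B on the left swaps the rows of A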

$\endgroup$
10
  • 17
    $\begingroup$ A sufficient condition is that $A$ and $B$ are simultaneously diagonalizable. $\endgroup$ Commented Jul 13, 2012 at 8:41
  • 1
    $\begingroup$ possible duplicate of Given a matrix, is there always another matrix which commutes with it? $\endgroup$ Commented Jul 13, 2012 at 8:57
  • 5
    $\begingroup$ $SO(2)$ is commutative, but that is more of an accident than a general property of (special) orthogonal groups. $\endgroup$ Commented Jul 13, 2012 at 10:50
  • 1
$\begingroup$ @chaohuang if $A$ and $B$ commute, then for any eigenvalue $\lambda$ of $A$, we must have not only that the $\lambda$-eigenspace $\ker(\lambda I-A)$ is preserved by $B$ (because if $X$ and $Y$ commute and $Xv=0$, then $X(Yv)=Y(Xv)=0$), but also that the generalized eigenspace $\bigcup_k \ker (\lambda I-A)^k$ is preserved by $B$. Because of this, you can decompose your vector space $V$ into joint eigenspaces $V_{\lambda \mu}$ where everything is a generalized $A$-eigenvector of eigenvalue $\lambda$ and a generalized $B$-eigenvector of eigenvalue $\mu$. From there it is a bit messy. $\endgroup$
    – Aaron
    Commented Jul 14, 2012 at 14:22
  • 1
    $\begingroup$ @chaohuang: (continued) However, this is enough to imply that if $A$ is diagonalizable with no repeated eigenvalues, then any matrix commuting with $A$ must also be simultaneously diagonalizable. However, even in the $2\times 2$ case, you get interesting results with repeated eigenvalues. It's clear what commutes with the identity matrix, but what commutes with $\pmatrix{1 & 1 \\ 0 & 1}$? $\endgroup$
    – Aaron
    Commented Jul 14, 2012 at 14:27

10 Answers

110
$\begingroup$

Two matrices that are simultaneously diagonalizable always commute.

Proof: Let $A$, $B$ be two such $n \times n$ matrices over a base field $\mathbb K$, and let $v_1, \ldots, v_n$ be a basis of eigenvectors for $A$. Since $A$ and $B$ are simultaneously diagonalizable, such a basis exists and is also a basis of eigenvectors for $B$. Denote the corresponding eigenvalues of $A$ by $\lambda_1,\ldots,\lambda_n$ and those of $B$ by $\mu_1,\ldots,\mu_n$.

Then it is known that there is a matrix $T$ whose columns are $v_1,\ldots,v_n$ such that $T^{-1} A T =: D_A$ and $T^{-1} B T =: D_B$ are diagonal matrices. Since $D_A$ and $D_B$ trivially commute (explicit calculation shows this), we have $$AB = T D_A T^{-1} T D_B T^{-1} = T D_A D_B T^{-1} =T D_B D_A T^{-1}= T D_B T^{-1} T D_A T^{-1} = BA.$$
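
As a quick numerical sanity check (a minimal NumPy sketch, not part of the proof): build $A$ and $B$ from a common eigenbasis $T$ and two diagonal matrices, then verify $AB = BA$ up to floating-point error.

    import numpy as np

    rng = np.random.default_rng(0)

    # A common eigenbasis T (generically invertible) and two diagonal matrices.
    T = rng.standard_normal((4, 4))
    D_A = np.diag(rng.standard_normal(4))
    D_B = np.diag(rng.standard_normal(4))

    T_inv = np.linalg.inv(T)
    A = T @ D_A @ T_inv
    B = T @ D_B @ T_inv

    print(np.allclose(A @ B, B @ A))  # True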

$\endgroup$
4
  • 8
    $\begingroup$ This is sufficient, but is it necessary? $\endgroup$
    – Victor Liu
    Commented Feb 12, 2015 at 7:05
  • 2
$\begingroup$ It's not necessary, AFAICT. I can't think of a counterexample right now, though. $\endgroup$ Commented Feb 12, 2015 at 10:28
  • 23
$\begingroup$ Here's an example why this condition is not necessary: Take any non-diagonalizable matrix $A$. It will always commute with the identity matrix $I$, but $A$ and $I$ are clearly not simultaneously diagonalizable. $\endgroup$ Commented Feb 22, 2015 at 13:35
  • $\begingroup$ The following link might be helpful for the other side: math.stackexchange.com/questions/236212/… $\endgroup$ Commented May 19, 2015 at 23:36
35
$\begingroup$

The only matrices that commute with all other matrices are the multiples of the identity.
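
One way to see this: if $A$ commutes with every matrix unit $E_{ij}$ (the matrix with a single $1$ in position $(i,j)$), then comparing $AE_{ij}$ and $E_{ij}A$ entry by entry forces $A$ to be scalar. A minimal NumPy sketch of that test:

    import numpy as np

    def commutes_with_all_units(A, tol=1e-12):
        """Check whether A commutes with every matrix unit E_ij."""
        n = A.shape[0]
        for i in range(n):
            for j in range(n):
                E = np.zeros((n, n))
                E[i, j] = 1.0
                if not np.allclose(A @ E, E @ A, atol=tol):
                    return False
        return True

    print(commutes_with_all_units(3 * np.eye(4)))              # True: scalar multiple of I
    print(commutes_with_all_units(np.diag([1., 2., 3., 4.])))  # False: non-scalar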

$\endgroup$
5
  • 1
    $\begingroup$ What about elementary operations? $\endgroup$
    – rosstex
    Commented May 10, 2017 at 8:32
  • $\begingroup$ Not sure what you mean... $\endgroup$
    – Dirk
    Commented May 10, 2017 at 11:04
  • 2
    $\begingroup$ A proof of this fact? $\endgroup$
    – Xam
    Commented Oct 7, 2017 at 23:16
  • 1
    $\begingroup$ @Xam Good question - I don't know a slick proof off the top of my head, but this answer may be helpful. $\endgroup$
    – Dirk
    Commented Oct 9, 2017 at 9:25
  • 1
    $\begingroup$ @Xam math.stackexchange.com/questions/142967/… $\endgroup$
    – mdcq
    Commented Apr 10, 2018 at 17:59
18
$\begingroup$

Among the groups of orthogonal matrices $O(n,\mathbb R)$, only the case $n=0$ (the trivial group) and $n=1$ (the two element group) give commutative matrix groups. The group $O(2,\mathbb R)$ consists of plane rotations and reflections, of which the former form an index $2$ commutative subgroup, but reflections do not commute with rotations or among each other in general. The largest commutative subalgebras of square matrices are those which are diagonal on some fixed basis; these subalgebras only have dimension $n$, out of an available $n^2$, so commutation is really quite exceptional among $n\times n$ matrices (at least for $n\geq2$). Nothing very simple can be said that (non-tautologically) characterises all commuting pairs of matrices.

Added. In fact the statement above about the largest commutative subalgebra is false. If you take the set of matrices whose nonzero entries occur only in a block that touches the main diagonal (without containing any diagonal positions) then this is always a commutative subalgebra. And then you can still throw in multiples of the identity matrix. Thus there is for instance a commutative subalgebra of dimension $\lfloor\frac{n^2}4\rfloor+1$ inside $M_n(K)$, for every $n$, and $\lfloor\frac{n^2}4\rfloor+1>n$ for all $n>3$. See here.
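
A minimal NumPy sketch (my illustration) of the added construction: matrices of the form $\begin{pmatrix}\lambda I & X\\ 0 & \lambda I\end{pmatrix}$, with $X$ an arbitrary $k\times m$ block, form a commutative subalgebra of dimension $km+1$.

    import numpy as np

    def block_scalar(lam, X):
        """Assemble [[lam*I_k, X], [0, lam*I_m]] from a k x m block X."""
        k, m = X.shape
        top = np.hstack([lam * np.eye(k), X])
        bot = np.hstack([np.zeros((m, k)), lam * np.eye(m)])
        return np.vstack([top, bot])

    rng = np.random.default_rng(1)
    k, m = 2, 3   # n = 5; dimension k*m + 1 = 7 > n
    A = block_scalar(1.5, rng.standard_normal((k, m)))
    B = block_scalar(-0.3, rng.standard_normal((k, m)))
    print(np.allclose(A @ B, B @ A))  # True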

$\endgroup$
8
$\begingroup$

The orthogonal matrices don't commute; in fact, there's a subgroup of the orthogonal matrices that's non-commutative!

Check that a permutation matrix is an orthogonal matrix. (In case you don't know what a permutation matrix is, it's just a matrix $(a_{ij})$ such that a permutation $\sigma$ exists for which $a_{i\sigma(i)}=1$ and $a_{ij}=0$ for $j\ne\sigma(i)$.)

Applied to a column vector $x$, the action of a permutation matrix is just a permutation of the coordinates of $x$. But as we know, the symmetric group $S_n$ is non-abelian for $n \ge 3$. So just choose two non-commuting permutations; their corresponding matrices clearly don't commute!
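
A minimal NumPy check with two transpositions in $S_3$:

    import numpy as np

    # Permutation matrices for the transpositions (1 2) and (2 3).
    P12 = np.array([[0, 1, 0],
                    [1, 0, 0],
                    [0, 0, 1]])
    P23 = np.array([[1, 0, 0],
                    [0, 0, 1],
                    [0, 1, 0]])

    print(np.allclose(P12 @ P12.T, np.eye(3)))   # True: P12 is orthogonal
    print(np.array_equal(P12 @ P23, P23 @ P12))  # False: they don't commute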

$\endgroup$
5
$\begingroup$

Another commuting example:

Any two square matrices that are inverses of each other commute.

      A B    =  I
inv(A)A B    =  inv(A)   # Premultiplying  both sides by inv(A)
inv(A)A B A  =  inv(A)A  # Postmultiplying both sides by A
        B A  =  I        # Canceling inverses

QED

There are lots of "special cases" that commute. The multiplication of two diagonal matrices, for example.

Aside: for any two square invertible matrices A and B, there is something that can be said about AB vs. BA:

If      AB  =  C
then    BA  =  inv(A) C A  =  B C inv(B)

(Proof: substitute AB for C in the result, and cancel inverses)
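
A minimal NumPy sanity check of both claims (random matrices are generically invertible):

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((4, 4))
    B = rng.standard_normal((4, 4))
    A_inv = np.linalg.inv(A)

    print(np.allclose(A @ A_inv, A_inv @ A))             # True: A commutes with inv(A)

    C = A @ B
    print(np.allclose(B @ A, A_inv @ C @ A))             # True: BA = inv(A) C A
    print(np.allclose(B @ A, B @ C @ np.linalg.inv(B)))  # True: BA = B C inv(B)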

$\endgroup$
1
$\begingroup$ More generally, matrices which are polynomials in each other commute. The inverse can be regarded as a special case of a polynomial, since we can always express the inverse of an invertible matrix as a polynomial in it. $\endgroup$
    – Widawensen
    Commented May 5, 2020 at 8:23
4
$\begingroup$

A particular case when orthogonal matrices commute.

Orthogonal matrices are used in geometric operations as rotation matrices, and therefore if the rotation axes (invariant directions) of two rotation matrices are equal - the matrices spin the same way - their multiplication is commutative.

Intuitively, if you spin the globe first by x degrees and then by y degrees around the same axis, you end up in the same position as by spinning it first y and then x degrees -> The multiplication of the rotation matrices describing the two rotations is commutative; it always yields the combined rotation.
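
A minimal NumPy sketch: two rotations about the same ($z$) axis commute.

    import numpy as np

    def rot_z(theta):
        """Rotation by theta radians about the z-axis."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, 0.],
                         [s,  c, 0.],
                         [0., 0., 1.]])

    R1 = rot_z(np.pi / 6)       # 30 degrees
    R2 = rot_z(5 * np.pi / 12)  # 75 degrees
    print(np.allclose(R1 @ R2, R2 @ R1))  # True: same axis, so they commute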

$\endgroup$
2
$\begingroup$

All circulant matrices of the same size $n \times n$ commute; in a circulant matrix, each row is a cyclic shift of the previous row.

For two variables, with identity, there are three basic types.

Complex or Elliptic

$\begin{bmatrix}x & y \\ -y & x\end{bmatrix}$

Dual (the ring of dual numbers)

$\begin{bmatrix}x & y \\ 0 & x\end{bmatrix}$

Hyperbolic (also circulant)

$\begin{bmatrix}x & y \\ y & x\end{bmatrix}$

Each can also be represented as a "commutative ring number" $x+ty$ for $t^2=-1,0,1$ respectively... associated with their eigenvalues.
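
A minimal NumPy check (my illustration) that circulant matrices of the same size commute:

    import numpy as np

    def circulant(c):
        """Circulant matrix built from c; each row is a cyclic shift of c."""
        c = np.asarray(c)
        return np.array([np.roll(c, k) for k in range(len(c))])

    A = circulant([1, 2, 3, 4])
    B = circulant([5, 0, -1, 2])
    print(np.allclose(A @ B, B @ A))  # True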

$\endgroup$
0
$\begingroup$

If the two matrices have Jordan normal forms with the same block structure, their Jordan blocks commute: multiplying two Jordan blocks of the same size gives $\lambda_1\lambda_2$ on the diagonal, $\lambda_1 + \lambda_2$ on the first off-diagonal and $1$ on the second off-diagonal, so assuming scalar multiplication and addition are commutative, so are the Jordan blocks.
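
The block-level claim is easy to check numerically (a NumPy sketch; note the comments below that this does not extend to matrices that merely share a Jordan normal form):

    import numpy as np

    def jordan_block(lam, n):
        """n x n Jordan block: lam on the diagonal, ones on the superdiagonal."""
        return lam * np.eye(n) + np.eye(n, k=1)

    J1 = jordan_block(2.0, 4)
    J2 = jordan_block(-3.5, 4)
    print(np.allclose(J1 @ J2, J2 @ J1))  # True: same-size Jordan blocks commute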

$\endgroup$
5
$\begingroup$ What do you want to say? What is the relationship to the question? $\endgroup$
    – user91684
    Commented Aug 13, 2015 at 20:00
  • $\begingroup$ The answer I wanted to contribute is : If $\bf A$ and $\bf B$ have the same Jordan normal form they will commute as all their individual Jordan blocks will commute. This is more general than that they need to be simultaneously diagonalizable. $\endgroup$ Commented Aug 14, 2015 at 8:58
  • 2
    $\begingroup$ Beware, two matrices may commute with different Jordan block structures. $\endgroup$
    – user91684
    Commented Aug 14, 2015 at 10:37
  • $\begingroup$ Hmm, yes. I was not specific enough. Sorry. $\endgroup$ Commented Aug 14, 2015 at 14:28
  • $\begingroup$ Having the same JNF is (not necessary, and) absolutely not sufficient for commutation, even if these are in fact diagonal forms. Two reflections have the same diagonal form (with one diagonal entry $-1$ and the rest $1$) but most often they do not commute. $\endgroup$ Commented Oct 8, 2020 at 7:54
-2
$\begingroup$

For two matrices to commute, the necessary and sufficient condition is that they share all of their eigenvectors; that's it. Whether they are diagonalizable or not is immaterial. For example, check the following matrices for commutativity and diagonalizability: $A = \begin{bmatrix}6 & -1 \\ 1 & 4\end{bmatrix}$ and $B = \begin{bmatrix}3 & 2 \\ -2 & 7\end{bmatrix}$.

Both $A$ and $B$ commute, but they are not diagonalizable; however, they share their eigenvectors. In this case they both have a single line of eigenvectors.

The necessary and sufficient condition I just mentioned can be proved easily.
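
A numerical check of the example (the general claim is disputed in the comment below):

    import numpy as np

    A = np.array([[6, -1],
                  [1,  4]])
    B = np.array([[ 3, 2],
                  [-2, 7]])

    print(np.array_equal(A @ B, B @ A))  # True: this particular pair commutes

    # Both have the repeated eigenvalue 5 with a single eigendirection (1, 1).
    wA, vA = np.linalg.eig(A)
    wB, vB = np.linalg.eig(B)
    print(wA, wB)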

$\endgroup$
1
  • 5
$\begingroup$ You are wrong. $\operatorname{diag}(1,-1)$ and $I_2$ commute; yet they do not share their eigenvectors. $\endgroup$
    – user91684
    Commented Aug 13, 2015 at 15:08
-6
$\begingroup$

So there is no group of matrix pairs that commute. We have $$ AB = BA $$ if and only if there is a polynomial $$ p \in \mathbb{R}[x] $$ such that $$ p(A)=B. $$ This can be proven using the Jordan normal form or by direct computation.
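
The "if" direction is easy to check numerically (the "only if" direction fails in general; see the comments below):

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.standard_normal((4, 4))

    # B = p(A) for p(x) = 2x^2 - x + 3: any polynomial in A commutes with A.
    B = 2 * np.linalg.matrix_power(A, 2) - A + 3 * np.eye(4)
    print(np.allclose(A @ B, B @ A))  # True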

$\endgroup$
3
  • 5
    $\begingroup$ What does "there is no group of matrix pairs that commute" mean? $\endgroup$
    – user23211
    Commented Jul 13, 2012 at 10:00
  • 5
    $\begingroup$ There is a small bit of truth here. IIRC, over an algebraically closed field, every matrix that commutes with $A$ is a polynomial in $A$ if and only if every eigenvalue of $A$ has geometric multiplicity $1$. $\endgroup$ Commented Jul 13, 2012 at 15:38
  • 1
    $\begingroup$ Yet in general this answer is wrong. E.g. see the counterexample composed of nilpotent matrices in the answer by Marc van Leeuwen in the thread "Can commuting matrices X,Y always be written as polynomials of some matrix A?" $\endgroup$
    – user1551
    Commented Feb 12, 2015 at 10:28

