
Diagonal matrices and diagonalizability are key topics in linear algebra as well as numerical linear algebra. Likewise, symmetric matrices have lots of nice properties that make them widely studied and important both theoretically and computationally.

However, anti-diagonal matrices seem to be no more than a curiosity in matrix algebra. While symmetry along the main diagonal seems to count for so much, persymmetry does not seem to count for very much at all.

Is there a reason for this? After all (and this might sound naive), why should one diagonal (top-left to bottom-right) matter so much more than the other? Is this an artifact or convention arising from the development of matrix algebra, or does it reflect something deeper?

Or are anti-diagonal and persymmetric matrices of far greater importance than I think?

I was thinking about this and was not really able to come up with anything close to a satisfactory answer.


2 Answers


I don't know how satisfying this answer will be, but I'll give it a shot anyway. The punchline, I think, is that although these "diagonal properties" have just as much aesthetic appeal as their "anti-diagonal" counterparts, the diagonal properties happen to give us information that is more useful for a matrix as it is used mathematically. That is, diagonal symmetry is a more natural thing to look for in the context of linear algebra.

First of all, note that all of these properties are properties of square (that is, $n \times n$) matrices, which are implicitly linear maps from $\Bbb F^n$ to $\Bbb F^n$ (that is, they produce vectors of $n$ entries from vectors of $n$ entries).

The properties that we really care about in linear algebra are the ones that tell us something about how matrices interact with vectors (and ultimately, with other matrices).

Diagonal Matrices

Diagonal matrices are important because they describe a particularly nice class of linear transformations. In particular: $$ \pmatrix{d_1\\&d_2\\&&\ddots \\ &&& d_n} \pmatrix{x_1\\ x_2\\ \vdots \\ x_n} = \pmatrix{d_1 x_1\\ d_2 x_2\\ \vdots \\ d_n x_n} $$ I would say that what a diagonal matrix represents is the fact that each of the $n$ variables required to specify a vector is decoupled from the others. For example, to find the new $x_2$, one only needs to look at the old $x_2$ and do what the matrix says.
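For concreteness, here is a small numerical sketch (using NumPy; the particular entries of `d` and `x` below are arbitrary) of the fact that a diagonal matrix acts entrywise, with no interaction between coordinates:

```python
import numpy as np

d = np.array([2.0, -1.0, 3.0])   # diagonal entries d_1, ..., d_n (arbitrary choice)
x = np.array([5.0, 7.0, 11.0])   # an arbitrary vector

D = np.diag(d)                   # build the diagonal matrix
print(D @ x)                     # [10. -7. 33.]
print(d * x)                     # same result: entrywise product, no coupling
```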

When we "diagonalize" a matrix, we're finding a way to describe each vector (that is, $n$ independent "pieces of information") that are similarly decoupled as far as the transformation is concerned. So, for example, the matrix $$ A = \pmatrix{0&1\\4&0} $$ takes a vector $x = (x_1,x_2)$ and produces a new vector $Ax = (x_2,2x_1)$. There's a nice symmetry to that; in particular, applying $A$ twice gives us the vector $A^2 x = (2x_1,2x_2)$, which is to say that $A$ acts like a diagonal matrix whenever you apply it an even number of times.

However, I would argue that we get a clearer picture of what $A$ does if we diagonalize it. In particular, if we write a vector as $x = a_1(1,2) + a_2(1,-2)$, then $A$ gives us the new vector $$ Ax = a_1 A(1,2) + a_2 A(1,-2) = 2a_1(1,2) - 2a_2(1,-2) $$ That is, once we know the two pieces of information $a_1$ and $a_2$, we can figure out the new vector using these pieces separately, without having them interact.
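Here is a short sketch (again with NumPy; the coefficients `a1`, `a2` are arbitrary) checking the eigenvector computation above and the fact that $A^2 = 4I$:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [4.0, 0.0]])

v1 = np.array([1.0,  2.0])
v2 = np.array([1.0, -2.0])

print(A @ v1)    # [2. 4.]   ==  2 * v1
print(A @ v2)    # [-2. 4.]  == -2 * v2
print(A @ A)     # [[4. 0.], [0. 4.]]  ->  A^2 = 4 I

# Writing x = a1*v1 + a2*v2, the transformed vector is 2*a1*v1 - 2*a2*v2:
a1, a2 = 3.0, -1.0
x = a1 * v1 + a2 * v2
print(np.allclose(A @ x, 2*a1*v1 - 2*a2*v2))   # True
```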

So we see that this anti-diagonal $A$ is nice, but not nearly as simple as the "diagonal version" of the transformation.

Symmetric matrices

Symmetric matrices are particularly nice when we care about dot products. Dot products are needed whenever you want to think about lengths and angles between vectors in some capacity.

In particular: if we define the dot product $$ (x_1,\dots,x_n) \cdot (y_1,\dots,y_n) = x_1y_1 + \cdots + x_n y_n $$ then a symmetric $A$ will have the property that $$ (Ax) \cdot y = x \cdot (Ay) $$ Ultimately, this whole thing connects back to diagonal matrices, since every real symmetric matrix can be diagonalized in the sense described above (indeed, by an orthogonal change of basis). The fact that this can be done is known as the spectral theorem.
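A small numerical sketch of both claims (using NumPy; the symmetric matrix `S` below is built from a random matrix purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
S = M + M.T                          # any matrix plus its transpose is symmetric

x = rng.standard_normal(4)
y = rng.standard_normal(4)
print(np.isclose((S @ x) @ y, x @ (S @ y)))    # True: self-adjoint for the dot product

w, Q = np.linalg.eigh(S)             # eigh is designed for symmetric/Hermitian matrices
print(np.allclose(Q.T @ S @ Q, np.diag(w)))    # True: Q^T S Q is diagonal
print(np.allclose(Q.T @ Q, np.eye(4)))         # True: Q is orthogonal
```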

Persymmetric matrices, however, don't act in a particularly nice way with respect to any usual operations (like the dot product).

  • Thanks: The fact that persymmetric matrices do not act on the dot product in a nice, "self-adjoint" way is surely part of the story. However, perhaps there is more to this: could an analogous (linear, commutative) anti-dot operator $x_1 y_n + x_2 y_{n-1} + \cdots + x_n y_1$ be defined so that persymmetric matrices would act on it in a similar way? In any case, from your first example, it seems that basic matrix multiplication favors the diagonal over the anti-diagonal. — Commented Jun 3, 2016 at 19:36
  • Actually, yes! Your anti-dot operator does exactly what you claim it does (that is, persymmetric matrices are "self-adjoint" with respect to this bilinear form; see the numerical sketch after these comments). However, this bilinear form is "less useful"; it certainly doesn't have the same kind of immediate geometric interpretation as the dot product. — Commented Jun 3, 2016 at 19:59
  • And yes, the fact that diagonal matrices are interesting comes from the way that basic matrix multiplication is defined. — Commented Jun 3, 2016 at 19:59
  • In fact, your bilinear form is a reasonable way to think of certain Minkowski spaces. — Commented Jun 3, 2016 at 20:00
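A numerical sketch of the claim in these comments (assuming NumPy; the exchange matrix `J` and the construction of `P` below are just one convenient way to produce a persymmetric matrix): a persymmetric matrix is self-adjoint with respect to the "anti-dot" form $x_1 y_n + \cdots + x_n y_1 = x^{\mathsf T} J y$.

```python
import numpy as np

n = 4
rng = np.random.default_rng(1)
J = np.fliplr(np.eye(n))             # exchange matrix: 1s on the anti-diagonal

B = rng.standard_normal((n, n))
P = B + J @ B.T @ J                  # B plus its "flip-transpose" is persymmetric
print(np.allclose(J @ P.T @ J, P))   # True: P is symmetric about the anti-diagonal

def anti_dot(x, y):
    # x_1*y_n + x_2*y_{n-1} + ... + x_n*y_1
    return x @ (J @ y)

x = rng.standard_normal(n)
y = rng.standard_normal(n)
print(np.isclose(anti_dot(P @ x, y), anti_dot(x, P @ y)))   # True
```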

$\newcommand{\dd}{\partial}$If $A$ is an $n \times n$ matrix, and if $x$ and $y$ are columns satisfying $y = Ax$, then the $(i, j)$-entry of $A$ measures the "sensitivity" of $y_{i}$ to changes in $x_{j}$. Precisely, if every variable except $x_{j}$ is held constant, then a change of $\Delta x_{j}$ in $x_{j}$ changes $y_{i}$ by $A_{ij}\,\Delta x_{j}$. (This is immediate from the definition of matrix multiplication, and explains why the derivative matrix of a vector-valued function of $n$ variables has the partial derivative $\dd y_{i}/\dd x_{j}$ in the $(i, j)$ entry.)
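A quick check of this "sensitivity" reading (a sketch assuming NumPy; the indices `i`, `j` and the step `delta` are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

i, j, delta = 0, 2, 0.5              # perturb only x_j by delta
x_perturbed = x.copy()
x_perturbed[j] += delta

change_in_y_i = (A @ x_perturbed - A @ x)[i]
print(np.isclose(change_in_y_i, A[i, j] * delta))   # True
```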

To say $A$ is diagonal is to say $y_{i}$ depends only on $x_{i}$, the variable with the same index. This condition is invariant under relabeling variables. By contrast, anti-diagonality is not invariant under relabeling.

I don't know of a similarly naive reason that symmetry is so important, but the spectral theorem guarantees that a real symmetric matrix is orthogonally diagonalizable; that is, up to a rotational change of variables, $A$ is diagonal.

