
$$A=\left( \begin{array}{ccc} 0 & c & b \\ c & 0 & a \\ b & a & 0 \end{array} \right) $$

In my work I encounter the above matrix, but all I really need is the following fact: if $Ax=0$ and the vector $x$ has no zero entries, then $A=0$.

Just out of curiosity: does the above matrix have a name or any special properties? Can it be extended to higher dimensions? The matrix has a special structure, and it looks very much like the skew-symmetric matrix associated with a vector $v=[a,b,c]^T$.


Edit: It is my fault that I didn't identify the pattern before posting the question. It is not very obvious in low dimensions, but in higher dimensions the pattern becomes clearer.

Consider a vector $x=[x_1,\dots,x_5]^T$; the associated matrix is $$X=\left( \begin{array}{ccccc} 0 & x_2 & 0 & 0 & x_1 \\ x_2 & 0 & x_3 & 0 & 0 \\ 0 & x_3 & 0 & x_4 & 0 \\ 0 & 0 & x_4 & 0 & x_5 \\ x_1 & 0 & 0 & x_5 & 0 \\ \end{array} \right) $$ In this matrix, every row and every column contains exactly two non-zero entries. Does this matrix have a name? Can we claim that if $Xy=0$ and the vector $y$ has no zero entries, then $X=0$?

  • A fine way to put it is: a symmetric matrix minus its diagonal.
    – checkmath
    Commented May 24, 2012 at 5:09
  • If the row 2, column 2 entry of this matrix were $b$ instead of $0$, this would be an example of a Hankel matrix (even a circulant Hankel matrix). I can't think of any non-contrived generalization of the property you mentioned for a larger family of matrices, though.
    Commented May 24, 2012 at 5:34

3 Answers


(I intended this to be a comment, but it got slightly too long, so it is an answer.)

For any $m \times n$ matrix $A$, the following two conditions are equivalent:

  • There is an $n \times 1$ vector $x$, with all entries nonzero, for which $Ax = 0$.
  • Every column of $A$ can be written as a linear combination of the other columns of $A$.

Compare to the better-known equivalence: there is a nonzero vector $x$ for which $Ax = 0$ if and only if the columns of $A$ are linearly dependent (i.e., if and only if some column of $A$ is a linear combination of the others).

To prove the equivalence, fix an $m \times n$ matrix $A$, and denote its columns by $c_1, \dots, c_n$.

Suppose there is a column vector $x = (x_j)_{j=1}^n$ with all nonzero entries for which $Ax = 0$, and fix any $1 \leq k \leq n$. As the matrix product $Ax$ is $\sum_{j=1}^n x_j c_j$ by definition, we deduce that $\sum_{j=1}^n x_j c_j = 0$, and hence $x_k c_k = -\sum_{j \neq k} x_j c_j$, and hence $c_k = -\sum_{j \neq k} x_k^{-1} x_j c_j$ is a linear combination of the other columns of $A$. As $k$ was arbitrary, every column of $A$ is a linear combination of the other columns of $A$.

Conversely, suppose that every column of $A$ is a linear combination of the other columns of $A$. It follows that for each $1 \leq k \leq n$ there is a column vector $x^k = (x_j^k)_{j=1}^n$ with $x_k^k = 1$ and $A x^k = 0$. For each $1 \leq r \leq n$, let $H_r$ denote the set of all tuples $(\alpha_1,\dots,\alpha_n)$ in $\mathbb{R}^n$ for which the $r$th entry of the linear combination $\sum_{k=1}^n \alpha_k x^k$ is zero. The set $H_r$ is evidently a subspace of $\mathbb{R}^n$, and as $x_r^r = 1$, the $n$-tuple whose $r$th entry is $1$ and whose other entries are all $0$ is not in $H_r$. So for all $1 \leq r \leq n$, the set $H_r$ is in fact a proper subspace of $\mathbb{R}^n$. Since no vector space over an infinite field is a finite union of proper subspaces, there must be some tuple $(a_1, \dots, a_n)$ in $\mathbb{R}^n$ that is not in $H_1 \cup \dots \cup H_n$. It follows that all entries of the column vector $x = \sum_{k=1}^n a_k x^k$ are nonzero, and clearly $Ax = 0$.

End of proof of equivalence.
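As a small illustration of the converse direction (my own numerical sketch, not part of the argument above): for a matrix in which every column is a combination of the others, a random combination of a null-space basis almost surely has no zero entries, exactly as the union-of-proper-subspaces argument predicts. The example matrix and tolerances below are my own choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# A rank-deficient example in which every column is a linear combination of the others.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])           # rank 1, so its null space is 2-dimensional

# An orthonormal basis of the null space, read off from the SVD.
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[np.sum(s > 1e-12):]       # rows of V^T spanning the null space

# A random combination of the basis vectors avoids each hyperplane "entry r is zero"
# with probability 1, mirroring the union-of-proper-subspaces argument above.
x = rng.standard_normal(null_basis.shape[0]) @ null_basis

print(np.allclose(A @ x, 0))              # True: x lies in the null space
print(np.all(np.abs(x) > 1e-12))          # True (almost surely): x has no zero entries
```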

This gives us at least some alternative perspective on why your matrix has the property that it has. (A short bit of thinking shows that if every column of $A$ is a linear combination of the others, then $\det A = 2abc$ must vanish, so one of $a, b, c$ is zero, say $c = 0$ (the other cases are symmetric); but then the third column $(b,a,0)^T$ is a linear combination of the first two, $(0,0,b)^T$ and $(0,0,a)^T$, only if $a = b = 0$.)

[Added after the edit of the question]

Seeing another example of the pattern helped me a lot. I don't think that pattern has a name, but I can answer the question about that $X$ in a way that might indicate a method that works in the general case. The answer: yes, it is true that if $Xy = 0$ for some vector $y$ with all entries nonzero, then $X = 0$.

Here is a proof: suppose that $Xy = 0$ and that $y$ has no zero entries. Denoting the entries of $y$ by $y_1,\dots,y_5$ and computing the matrix product $Xy$, we see that $Xy = 0$ is equivalent to the system of equations \begin{align} x_2 y_2 + x_1 y_5 & = 0, \\ x_2 y_1 + x_3 y_3 & = 0, \\ x_3 y_2 + x_4 y_4 & = 0, \\ x_4 y_3 + x_5 y_5 & = 0, \\ x_1 y_1 + x_5 y_4 & = 0, \end{align} which is, in turn, equivalent to the matrix equation $Mx = 0$, where $M$ is the matrix $$ M = \begin{pmatrix} y_5 & y_2 & 0 & 0 & 0 \\ 0 & y_1 & y_3 & 0 & 0 \\ 0 & 0 & y_2 & y_4 & 0 \\ 0 & 0 & 0 & y_3 & y_5 \\ y_1 & 0 & 0 & 0 & y_4 \end{pmatrix} $$ and $x$ is the column vector with $x_1, \dots, x_5$ as its entries.

A short computation shows that the determinant of the matrix $M$ is $2 y_1 y_2 y_3 y_4 y_5$, which is nonzero by the hypothesis on $y$, and hence $M$ is invertible. So from $Mx = 0$ we deduce that $x = 0$, which implies that your original matrix $X$ is the zero matrix. End of proof.
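For what it's worth, here is a quick symbolic sanity check of that computation in sympy (my addition, not part of the original answer), verifying both that $Xy = Mx$ and that $\det M = 2\,y_1 y_2 y_3 y_4 y_5$:

```python
import sympy as sp

x1, x2, x3, x4, x5 = sp.symbols('x1:6')
y1, y2, y3, y4, y5 = sp.symbols('y1:6')

# The matrix X from the question and the matrix M from the rewrite above.
X = sp.Matrix([[0,  x2, 0,  0,  x1],
               [x2, 0,  x3, 0,  0 ],
               [0,  x3, 0,  x4, 0 ],
               [0,  0,  x4, 0,  x5],
               [x1, 0,  0,  x5, 0 ]])
M = sp.Matrix([[y5, y2, 0,  0,  0 ],
               [0,  y1, y3, 0,  0 ],
               [0,  0,  y2, y4, 0 ],
               [0,  0,  0,  y3, y5],
               [y1, 0,  0,  0,  y4]])

x = sp.Matrix([x1, x2, x3, x4, x5])
y = sp.Matrix([y1, y2, y3, y4, y5])

print((X * y - M * x).expand())   # the zero vector: X*y = 0 and M*x = 0 are the same system
print(sp.factor(M.det()))         # 2*y1*y2*y3*y4*y5
```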

This proof method probably generalizes to other matrices $X$ of the type you want. To outline the strategy: given a matrix equation $Xy = 0$ with all entries of $y$ nonzero, rewrite $Xy = 0$ as an equivalent matrix equation $Mx = 0$, where $x$ is the column vector with $x_1, \dots, x_n$ as its entries, and $M$ is an $n \times n$ matrix whose entries are either zeros or $y_1,\dots,y_n$ (the "pattern" of this matrix is determined by the "pattern" of the $X$ matrix in a way that I can't describe simply at the moment, but some simple description is probably possible). It seems likely to me that the "pattern" of entries in the $X$ matrix will imply, in some generality, that the corresponding "pattern" of $y_j$'s in the $M$ matrix leads to a matrix $M$ whose determinant is a multiple of $y_1 y_2 \cdots y_n$, and hence to an invertible $M$ whenever all entries of $y$ are nonzero.

That's hazy on generality, of course, but it is a concrete guess to work on, anyway.
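To make that guess a bit more concrete, here is a rough computational sketch (my addition, with my own labeling of the weights, and only for the cyclic pattern of the question): build the weighted $n$-cycle matrix $X$, obtain $M$ as the Jacobian of $Xy$ with respect to the $x_i$, and inspect $\det M$ for a few odd sizes.

```python
import sympy as sp

def cycle_det(n):
    """det(M) for the weighted n-cycle pattern, where X*y is rewritten as M*x."""
    x = sp.symbols(f'x1:{n + 1}')
    y = sp.Matrix(sp.symbols(f'y1:{n + 1}'))
    X = sp.zeros(n, n)
    for i in range(n):                 # weight x[i] on the edge between vertices i and i+1 (mod n)
        X[i, (i + 1) % n] = x[i]
        X[(i + 1) % n, i] = x[i]
    M = (X * y).jacobian(list(x))      # valid because X*y is linear in the x_i
    return sp.factor(M.det())

for n in (3, 5, 7):
    print(n, cycle_det(n))             # comes out as 2*y1*...*yn for these odd sizes
```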

  • Thanks a lot! I investigated the matrix further in higher dimensions, and I found the pattern is much clearer (see the revised question). I believe the special property of this matrix is: if $Xy=0$ and $y$ has no zero entries, then $X=0$.
    – Shiyu
    Commented May 25, 2012 at 3:51
  • @Shiyu I have added a discussion of this $X$. A general result in this area is still beyond me, but maybe my approach for this $X$ has some useful ideas in it.
    Commented May 25, 2012 at 5:46
  • The answer is very clear. Many thanks.
    – Shiyu
    Commented May 25, 2012 at 6:56

Well, you can say that your matrix satisfies

$$a_{ij}=(-1)^{\delta_{ij}}a_{ji}$$

where $\delta_{ij}=1$ if $i=j$ and $\delta_{ij}=0$ otherwise, i.e., the Kronecker delta. (Taking $i=j$ gives $a_{ii}=-a_{ii}$, so the diagonal vanishes; taking $i\neq j$ gives ordinary symmetry.)

  • I.e., it is symmetric with zeros on the diagonal.
    Commented May 24, 2012 at 5:32
  • Yes, it mixes the definitions of symmetric and anti-symmetric matrices.
    – checkmath
    Commented May 24, 2012 at 13:35
  • For a 2 by 2 matrix, it is merely a symmetric matrix with zero diagonal entries, but in higher dimensions the pattern of my matrix is clearer. Please refer to the revised question. Thanks.
    – Shiyu
    Commented May 25, 2012 at 3:49

As far as I have seen, it is not a named matrix, but it is symmetric, with all the nice things that implies.

I'm not sure I understand how you want to generalize it to higher dimensions; it is relatively easy to come up with symmetric matrices that have zeros on the diagonal. It doesn't have nice eigenvectors. Its determinant is simple, but it's not clear how to generalize that (an arbitrary symmetric matrix with zeros on the diagonal does not have a particularly simple determinant, it seems).

  • A matrix having an $LU$ decomposition with $U=I$ is lower triangular. Are you referring to an $LUP$ decomposition (using a permutation matrix)?
    Commented May 24, 2012 at 5:13
  • Um, yes, that's right.
    Commented May 24, 2012 at 5:26
  • Still, how do you get $U=I$?
    Commented May 24, 2012 at 5:43
  • Actually, you don't. I was assuming that $U$ was the result after rref; of course it's actually the result after ref. Redoing the calculations, $U$ is not particularly pleasant and it doesn't generalize. Sorry about that.
    Commented May 24, 2012 at 11:33
  • Thanks! I now see what this matrix looks like in higher dimensions. Please see the revised question.
    – Shiyu
    Commented May 25, 2012 at 3:46
