Does there exist a basis of $2\times 2$ real matrices in which every basis element is invertible?

As the title says, I'm wondering whether there exists a basis for the set of $2\times 2$ matrices (with entries from the real numbers) such that all basis elements are invertible.

I have a gut feeling that it is false, but don't know how to prove it. I know that for a matrix to be invertible, it must be row equivalent to the identity matrix and I think I may be able to use this in the proof, but I don't know how.

Thanks in advance for any help,

Jack

  • Please check if this question helps. I'm posting it in case you are familiar with symbols: math.stackexchange.com/questions/534875/… Commented Nov 5, 2016 at 15:53
  • See Pauli Matrices. Commented Nov 6, 2016 at 0:57
  • If you pick four random $2\times2$ matrices (using e.g. a normal distribution for each entry) then with probability one they will all be nonsingular, and with probability one they will be linearly independent. – bof Commented Nov 6, 2016 at 5:36

5 Answers


Even without exhibiting such a basis, you can see that the singular matrices form a hypersurface in $ M_{n \times n}(F) $, namely the zero set of the determinant map $ \det : M_{n \times n}(F) \to F $. When $ F = \mathbb R $, for instance, the set of all singular matrices is a closed subset of $ M_{n \times n}(F) $ (it is the preimage of the closed set $ \{ 0 \} $ under the continuous determinant map) which is not the whole space, so there is an open ball contained entirely in its complement, i.e. consisting only of invertible matrices. As you can prove, any open ball spans the whole space, so we can find a basis lying inside this ball, hence consisting of invertible matrices.
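Not part of the original answer, but here is a minimal NumPy sketch of that last step, using the ball of radius $\varepsilon$ around the identity: the $n^2$ matrices $I + \varepsilon E_{ij}$ all lie in that ball, are invertible, and are linearly independent.

```python
import numpy as np

# Sketch of the ball argument (my illustration, not the answerer's code):
# the n^2 matrices I + eps*E_ij lie within distance eps of the identity,
# are all invertible, and together form a basis of M_{n x n}(R).
n, eps = 2, 0.1
ball_basis = []
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0
        ball_basis.append(np.eye(n) + eps * E)

# Every element is invertible (nonzero determinant) ...
assert all(abs(np.linalg.det(M)) > 1e-12 for M in ball_basis)
# ... and flattening each matrix to a row gives a full-rank n^2 x n^2
# coordinate matrix, so they are linearly independent.
coords = np.array([M.flatten() for M in ball_basis])
assert np.linalg.matrix_rank(coords) == n * n
```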

Explicit counterexamples have been given in other answers, so I will not mention any here.

  • You can also use the results of this answer to get explicit results in the $n\times n$ case. – Batman Commented Nov 5, 2016 at 17:06

Consider the counter-example basis: $$\beta:=\left\{\begin{pmatrix} 1&0\\0&1\end{pmatrix},\begin{pmatrix} 0&1\\1&0\end{pmatrix},\begin{pmatrix} 1&0\\1&1\end{pmatrix},\begin{pmatrix} 0&1\\1&1\end{pmatrix}\right\}$$

Let $A=\begin{pmatrix} x_1&x_2\\x_3&x_4\end{pmatrix}$ and consider the equation

$$A=a\begin{pmatrix} 1&0\\0&1\end{pmatrix}+b\begin{pmatrix} 0&1\\1&0\end{pmatrix}+c\begin{pmatrix} 1&0\\1&1\end{pmatrix}+d\begin{pmatrix} 0&1\\1&1\end{pmatrix}.$$ Then $$\begin{pmatrix} 1&0&1&0\\0&1&0&1\\0&1&1&1\\1&0&1&1\end{pmatrix}\begin{pmatrix} a\\b\\c\\d\end{pmatrix}=\begin{pmatrix} x_1\\x_2\\x_3\\x_4\end{pmatrix}.$$ Check that $\det\begin{pmatrix} 1&0&1&0\\0&1&0&1\\0&1&1&1\\1&0&1&1\end{pmatrix}=1$, so there is a unique solution for every $A$. Hence $\beta$ is a basis of the space of $2\times 2$ matrices in which all basis elements are invertible.
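As a quick sanity check (mine, not part of the original answer), the same computation in NumPy:

```python
import numpy as np

# Verifying the answer's two claims numerically (my addition):
# each element of beta is invertible, and the 4x4 coordinate matrix
# from the linear system above has determinant 1.
beta = [np.array([[1, 0], [0, 1]]),
        np.array([[0, 1], [1, 0]]),
        np.array([[1, 0], [1, 1]]),
        np.array([[0, 1], [1, 1]])]

assert all(abs(np.linalg.det(M)) > 1e-12 for M in beta)  # all invertible

# Flattened basis elements as columns reproduce the matrix in the answer.
C = np.column_stack([M.flatten() for M in beta])
print(round(np.linalg.det(C)))  # 1, so coefficients a, b, c, d are unique
```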

  • Thanks a lot Nick, very helpful explanation. – Jack Commented Nov 5, 2016 at 16:02
  • The first line is a bit confusing. You're presenting an example rather than a counterexample here, aren't you? Commented Nov 5, 2016 at 17:34
  • @leftaroundabout: An example is a counterexample to the claim that no examples exist. Commented Nov 5, 2016 at 17:47

\begin{align*} \left[ \begin{matrix} 0 & 1 \\ 1 & 1 \end{matrix} \right], \left[ \begin{matrix} 1 & 0 \\ 1 & 1 \end{matrix} \right], \left[ \begin{matrix} 1 & 1 \\ 0 & 1 \end{matrix} \right], \left[ \begin{matrix} 1 & 1 \\ 1 & 0 \end{matrix} \right] \end{align*}
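This answer gives only the matrices, so here is a short check (my addition, not the answerer's) that they are invertible and linearly independent:

```python
import numpy as np

# Check (my addition): the four matrices above are invertible and
# linearly independent, hence a basis of the 4-dimensional space.
basis = [np.array([[0, 1], [1, 1]]),
         np.array([[1, 0], [1, 1]]),
         np.array([[1, 1], [0, 1]]),
         np.array([[1, 1], [1, 0]])]

assert all(abs(np.linalg.det(M)) > 1e-12 for M in basis)  # invertible
coords = np.array([M.flatten() for M in basis])
assert np.linalg.matrix_rank(coords) == 4                 # independent
```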

  • Thanks Logician6. I appreciate your help. – Jack Commented Nov 5, 2016 at 16:03

I just want to point out that, in order to prove that the answer to a question of the form

Does there exist a basis for $M_n(\mathbb{R})$ consisting of [matrices of some special form]?

is "yes", you only actually need to show that the special matrices span all of $M_n(\mathbb{R})$, since any spanning set can be reduced to a basis.

It's usually a lot easier to check that some special set of matrices is spanning than it is to explicitly describe a basis. Let's look at the case you asked about: invertible matrices. Let $A$ be any matrix. Take $\lambda$ to be a nonzero number which isn't an eigenvalue of $A$ (which can be done because $A$ can have at most $n$ eigenvalues). Then we have $$A = \lambda I + (A-\lambda I)$$ where $\lambda I$ and $A-\lambda I$ are both invertible. Since every matrix can be written as a sum of two invertible matrices, the invertible matrices span $M_n(\mathbb{R})$, and can, in principle, be reduced to a basis.
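A minimal NumPy sketch of this decomposition (my illustration; the helper name `split_into_invertibles` is made up):

```python
import numpy as np

# Write an arbitrary matrix as a sum of two invertible matrices by
# choosing lambda nonzero and not an eigenvalue of A (my sketch; the
# function name is invented for illustration).
def split_into_invertibles(A):
    eigs = np.linalg.eigvals(A)
    lam = 1.0
    while np.any(np.isclose(eigs, lam)):  # step past any eigenvalue
        lam += 1.0
    n = A.shape[0]
    return lam * np.eye(n), A - lam * np.eye(n)

A = np.array([[0.0, 1.0], [0.0, 0.0]])  # a singular example matrix
P, Q = split_into_invertibles(A)
assert abs(np.linalg.det(P)) > 1e-9 and abs(np.linalg.det(Q)) > 1e-9
assert np.allclose(P + Q, A)  # lambda*I and A - lambda*I sum to A
```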


As a more explicit version of Starfall's answer, in the $n\times n$ case, if you denote the standard basis of the matrix space by $(E_{i,j})$, you can define a basis simply by $E'_{1,1}:=I$ and $E'_{i,j}:=I+E_{i,j}$ when one of $i,j$ is not $1$. (You could also take simply $E'_{1,1}:=I+E_{1,1}$, but that would make it harder to show that the $(E'_{i,j})$ form a basis.)

It is very easy to check that each $E'_{i,j}$ is invertible (just calculate the determinant), and that the span of $(E'_{i,j})$ contains all $E_{i,j}$: this is immediate except for $E_{1,1}$, and for $E_{1,1}$ it follows from the cases $i=j>1$, since $E_{k,k}=E'_{k,k}-E'_{1,1}$ for $k>1$ and hence $E_{1,1}=E'_{1,1}-\sum_{k>1}E_{k,k}$. Since there are $n^2$ of the $(E'_{i,j})$ and the space of matrices is $n^2$-dimensional, they must form a basis.

In the $2\times 2$ case, the basis is $$ \left[ \begin{matrix} 1 & 0 \\ 0 & 1 \end{matrix} \right], \left[ \begin{matrix} 1 & 1 \\ 0 & 1 \end{matrix} \right], \left[ \begin{matrix} 1 & 0 \\ 1 & 1 \end{matrix} \right], \left[ \begin{matrix} 1 & 0 \\ 0 & 2 \end{matrix} \right] $$
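Here is the construction for general $n$ as a short NumPy sketch (my addition, assuming only the definitions above):

```python
import numpy as np

# Build E'_{1,1} = I and E'_{i,j} = I + E_{i,j} otherwise (my sketch of
# the construction above), then verify it really is a basis for n = 3.
def invertible_basis(n):
    basis = []
    for i in range(n):
        for j in range(n):
            E = np.zeros((n, n))
            E[i, j] = 1.0
            # index (0, 0) plays the role of (1, 1) in the answer
            basis.append(np.eye(n) if (i, j) == (0, 0) else np.eye(n) + E)
    return basis

B = invertible_basis(3)
assert all(abs(np.linalg.det(M)) > 1e-12 for M in B)  # all invertible
coords = np.array([M.flatten() for M in B])
assert np.linalg.matrix_rank(coords) == 9             # hence a basis
```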

(Note that, more or less by the same argument as the one cited by Starfall, if you just pick $n^2$ random $n\times n$ matrices (for any reasonable notion of "random", but you can think of normally, independently distributed coefficients, or coefficients independently uniformly distributed in a fixed interval), they will all be invertible and they will be linearly independent, and hence form a basis. This is because the set of non-invertible matrices is very small, as is the set of vectors lying in a given proper subspace of a real vector space.)
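A Monte Carlo illustration of that remark (my addition; the assertions hold with probability one, and in particular for this fixed seed):

```python
import numpy as np

# Illustrating the closing remark (my addition): n^2 random Gaussian
# matrices are, with probability one, all invertible and independent.
rng = np.random.default_rng(0)
n = 3
mats = [rng.standard_normal((n, n)) for _ in range(n * n)]

assert all(abs(np.linalg.det(M)) > 1e-12 for M in mats)  # all invertible
coords = np.array([M.flatten() for M in mats])
assert np.linalg.matrix_rank(coords) == n * n            # they form a basis
```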
