42

Let's call a real matrix of size $m \times n$ totally invertible if, for every $1 \leq k \leq \min\{m,n\}$ and every choice of $k$ rows and $k$ columns, the resulting $k \times k$ submatrix is invertible. I am curious about the following:

Is there a totally invertible matrix for all sizes $m \times n$?

By taking the transpose, we can assume $m \leq n$.

For $m=1$ we can take any vector in $\Bbb R^n$ without a zero entry.

For $m=2$ we can take $\begin{pmatrix} 1 &2 & 3 & ... &n \\ 2 & 3 & 4 & ... & n+1 \end{pmatrix}$. Indeed, $\det\begin{pmatrix} a &b \\ a+1 & b+1 \end{pmatrix}=a-b$ is nonzero when $a \neq b$. This does not generalize, since $a_{ij}=i+j-1$ fabulously fails for all submatrices of size $\geq 3$.

I also tried taking the rows to be $m$ of the cyclic shifts of the vector $(1,2, ... ,n)$. This does not work either because I found a $k=3, m=n=5$ counterexample.
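
For concreteness, here is a small brute-force check of both observations (a sketch in Python; the helper names `det` and `singular_submatrices` are my own). It confirms that the $2 \times n$ matrix above has no singular square submatrix, and it finds a singular $3 \times 3$ block inside the cyclic-shift matrix for $m=n=5$.

```python
from itertools import combinations

def det(M):
    """Exact determinant of a square integer matrix (Laplace expansion along the first row)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def singular_submatrices(A):
    """Yield (rows, cols) for every square submatrix of A whose determinant is zero."""
    m, n = len(A), len(A[0])
    for k in range(1, min(m, n) + 1):
        for rows in combinations(range(m), k):
            for cols in combinations(range(n), k):
                if det([[A[i][j] for j in cols] for i in rows]) == 0:
                    yield rows, cols

# The 2 x n example: rows (1, ..., n) and (2, ..., n+1).
n = 6
A = [list(range(1, n + 1)), list(range(2, n + 2))]
print(list(singular_submatrices(A)))   # [] -- totally invertible

# Rows = the five cyclic shifts of (1, 2, 3, 4, 5).
B = [[(i + j) % 5 + 1 for j in range(5)] for i in range(5)]
print(next(singular_submatrices(B)))   # ((0, 1, 2), (0, 1, 2)): a singular 3 x 3 submatrix
```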

I do feel that the answer should be positive, however. Is it?

8
  • 24
    Well if the entries are selected at random from $[0,1]$, the probability that any of the square submatrices will be singular is zero. So yes, there are plenty of examples.
    – Aweygan
    Commented Aug 8, 2016 at 19:06
  • 1
    Oh, your comment made me realize: can we just say that this is the intersection of finitely many Zariski-open subsets of $\Bbb R^{mn}$, hence nonempty? Well, if that works over the non-algebraically-closed field $\Bbb R$, this sounds great, since the condition that a particular submatrix be singular defines a closed algebraic set. Also, I don't know how to make what you wrote formal. (You mean the Lebesgue measure of the set of all bad matrices in $\Bbb R^{mn}$ is zero? Why is that so?)
    – Emolga
    Commented Aug 8, 2016 at 19:12
  • 1
    @Aweygan Regarding your suggestion over $\Bbb N$, I believe that this link is a counterexample: math.stackexchange.com/a/180892/116490. And the $\Bbb N$ case interests me indeed.
    – Emolga
    Commented Aug 8, 2016 at 19:20
  • 2
    The random argument probably works if you allow arbitrarily large positive integers: you only need the expected number of non-invertible submatrices to be less than $1$ to draw the conclusion.
    – Henry
    Commented Aug 8, 2016 at 19:23
  • 1
    @Henry: indeed, see this paper for a related argument.
    Commented Aug 8, 2016 at 19:30

5 Answers

47

If you only need an existence proof, just fill in the entries of an infinite matrix as follows. Set the first entry to any nonzero real number. Then fill in the remaining entries one at a time, each time choosing a number that is transcendental over the field generated by the previously filled entries (in particular, each new entry is nonzero). Every square submatrix of this infinite matrix is then invertible: expand its determinant along the entry that was filled last; by induction on the size, that entry's cofactor is nonzero, so the determinant is a degree-one polynomial in the last entry with coefficients in the field generated by the earlier entries, and a transcendental number cannot be a root of such a polynomial. Hence every finite submatrix of this infinite matrix is totally invertible.
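
Loosely speaking, entries chosen this way satisfy no polynomial relations, so every minor behaves like the generic determinant, which is not the zero polynomial. A small SymPy sketch of that last fact (illustrative only; the symbols $x_{ij}$ stand in for algebraically independent transcendental entries):

```python
from itertools import combinations
import sympy as sp

m, n = 3, 4
# A matrix of m*n independent indeterminates; algebraically independent
# transcendental numbers satisfy exactly the same (absence of) relations.
X = sp.Matrix(m, n, lambda i, j: sp.Symbol(f"x_{i}{j}"))

for k in range(1, min(m, n) + 1):
    for rows in combinations(range(m), k):
        for cols in combinations(range(n), k):
            minor = X[list(rows), list(cols)].det()
            assert sp.expand(minor) != 0  # the generic k x k determinant is a nonzero polynomial

print("every minor is a nonzero polynomial in the entries")
```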

There are also some meaningful classes of totally invertible matrices. E.g. any submatrix of a totally positive matrix is totally invertible. Some classes of totally positive matrices can be parametrised. See this set of slides entitled "The Grassmannian: total positivity" by Alexander Yong, for instance.

31

Yes, even with integer entries.

Note that if all of the entries of a $k\times k$ matrix are at most $M$ in absolute value, then its determinant is at most $k!M^k$ in absolute value. (This is because the determinant is a sum of $k!$ terms, each of which is, up to sign, a product of $k$ entries of the matrix.)

It follows that if one entry of a $k\times k$ integer matrix is greater in absolute value than $k!$ times the $k$th power of every other entry, then the determinant can't possibly be $0$: in a cofactor expansion along a row or column containing the big entry, that entry times its cofactor dominates all the remaining terms, so the matrix is invertible. Strictly speaking, this needs the big entry's cofactor to be nonzero; the construction below will have this property inductively.

Consequently, the $n\times n$ matrix $A=(a_{ij})$ defined by something like $$ a_{ij} = (n!+1)^{n^{i+j}} $$ is totally invertible: in every $k\times k$ submatrix, the lower-right entry is so much larger than the other entries that its term dominates. (And of course, the $m\times n$ case follows from the $n\times n$ case by taking any $m$ of the rows.)
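
Here is a quick exact verification of this construction for $n=3$ (a sketch; SymPy is used only for exact integer determinants, and the indices $i,j$ are $1$-based as in the formula above):

```python
from itertools import combinations
from math import factorial
import sympy as sp

n = 3
base = factorial(n) + 1  # n! + 1
# a_{ij} = (n! + 1) ** (n ** (i + j)) with 1-based i, j (hence i + j + 2 below).
A = sp.Matrix(n, n, lambda i, j: sp.Integer(base) ** (n ** (i + j + 2)))

for k in range(1, n + 1):
    for rows in combinations(range(n), k):
        for cols in combinations(range(n), k):
            assert A[list(rows), list(cols)].det() != 0

print(f"every square submatrix of the {n} x {n} matrix is invertible")
```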

2
  • Ah, so we can do it fine in $\mathbb{Z}$. New question: What if $R$ is some finite (commutative) ring, and we want to consider $m\times n$ matrices over $R$ such that every $k\times k$ sub-determinant is non-zero (or, for another (more demanding) version, that every sub-determinant is a unit in $R$)? If too difficult, let $R$ be a finite field.
    Commented Aug 9, 2016 at 12:11
  • @JeppeStigNielsen Pathological cases already pop up over finite fields, e.g. when $m,n>1$, there does not exist any $m\times n$ totally invertible matrix over $GF(2)$. But you may consider those matrices whose dimensions are smaller than the field's characteristic, and see if one can always construct a totally invertible matrix.
    – user1551
    Commented Aug 9, 2016 at 12:48
10

Given $m$ and $n$, there is only a finite number $N$ of ways to choose a positive $k\leq\min\{m,n\}$ and then $k$ rows and $k$ columns from an $m\times n$ matrix $X$ with variable entries $x_{ij}$ in order to obtain a square submatrix $[S]$. Each such choice determines a polynomial $p(x):=\det[S]$ in the $m\cdot n$ variables $x_{ij}$; this polynomial is not identically zero, so the set $\{x\in{\mathbb R}^{m\cdot n}\>|\>p(x)=0\}$ is a "variety" of dimension $mn-1$ in ${\mathbb R}^{m\cdot n}$. The union of these $N$ varieties has $m\cdot n$-dimensional Lebesgue measure $0$, which implies that "most" points $x\in{\mathbb R}^{m\cdot n}$, i.e., "most" $m\times n$ matrices $X$, satisfy your condition.
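
As an empirical illustration (not a proof; the helper `totally_invertible` below is my own, and random integers from a large range stand in for "random reals"), essentially every randomly drawn matrix turns out to be totally invertible:

```python
import random
from itertools import combinations
import sympy as sp

def totally_invertible(A):
    """True if every square submatrix of the SymPy matrix A has nonzero determinant."""
    m, n = A.shape
    return all(A[list(r), list(c)].det() != 0
               for k in range(1, min(m, n) + 1)
               for r in combinations(range(m), k)
               for c in combinations(range(n), k))

random.seed(0)
m, n, trials = 3, 4, 200
hits = sum(totally_invertible(sp.Matrix(m, n, lambda i, j: random.randint(-10**6, 10**6)))
           for _ in range(trials))
print(f"{hits} of {trials} random integer matrices were totally invertible")
```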

6

Invertible Vandermonde matrices $V(a_1,a_2, \cdots, a_n)$ (https://en.wikipedia.org/wiki/Vandermonde_matrix) are "in general" totally invertible.

This can be seen by an immediate recurrence (induction).

"In general" because there is a certain number of subtle conditions on coefficients $a_k$ that have to be fullfilled (see the long remark by Aaron Meyerowitz).

2
  • 4
    That's not true. Counterexample: the $3\times3$ Vandermonde matrix generated by $-1,0,1$ has a singular $2\times2$ submatrix. What is true is that a Vandermonde matrix generated by distinct positive numbers is totally positive, and hence totally invertible.
    – user1551
    Commented Aug 8, 2016 at 19:55
  • I agree. See the modifications I have made.
    – Jean Marie
    Commented Aug 8, 2016 at 20:07
5

The condition that any particular square submatrix has determinant zero is a polynomial equation in the entries, and the polynomial is not identically zero, so its solution set has measure zero in the space of all matrices.

There are finitely many submatrices; the set of matrices for which some submatrix has determinant zero is a finite union of measure-zero sets, and thus has measure zero.

Consequently, almost all matrices are totally invertible.

