
Let $\mathrm{M}_n(\mathbb{C})$ denote the space of $n\times n$ complex matrices, let $\mathcal{A}\subset\mathrm{M}_n(\mathbb{C})$ be any nonempty subset of matrices, and consider the set of matrices $$ \mathcal{A}^*\mathcal{A} = \{A^*B\, :\, A,B\in\mathcal{A}\}. $$ Suppose that $\mathcal{A}^*\mathcal{A}$ is a family of commuting matrices and suppose further that there exist matrices $A_1,\dots,A_N\in\mathcal{A}$ such that $A_1^*A_1+\cdots+A_N^*A_N=I$ where $I$ is the $n\times n$ identity matrix.

Question: Is it necessarily the case that $\mathrm{span}(\mathcal{A})$ contains an invertible matrix?
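For concreteness, here is a small NumPy sketch of one family that satisfies both hypotheses. The particular construction $A_i = U E_i P_i W$, with $U,W$ unitary, $E_i$ diagonal unitary, and $P_i$ diagonal positive semidefinite with $\sum_i P_i^2=I$, is only an illustrative assumption, not the general case:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 4, 3   # illustrative sizes

def random_unitary(n):
    # the Q factor of a random complex matrix is unitary
    Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return np.linalg.qr(Z)[0]

# Hypothetical building blocks: A_i = U E_i P_i W with U, W unitary,
# E_i diagonal unitary, P_i diagonal PSD and sum_i P_i^2 = I.
U, W = random_unitary(n), random_unitary(n)
E = [np.diag(np.exp(1j * rng.uniform(0, 2 * np.pi, n))) for _ in range(N)]
G = rng.uniform(0.1, 1.0, (N, n))
G /= np.sqrt((G ** 2).sum(axis=0))          # columns now satisfy sum_i G[i,k]^2 = 1
A = [U @ E[i] @ np.diag(G[i]) @ W for i in range(N)]

# Hypothesis 1: all products A^*B commute (they all equal W^* (diagonal) W).
prods = [Ai.conj().T @ Aj for Ai in A for Aj in A]
print(max(np.linalg.norm(X @ Y - Y @ X) for X in prods for Y in prods))   # ~1e-15

# Hypothesis 2: sum_i A_i^* A_i = I.
print(np.linalg.norm(sum(Ai.conj().T @ Ai for Ai in A) - np.eye(n)))      # ~1e-15
```

(In this particular family, $W^*$ already plays the role of the simultaneous diagonaliser $V$ appearing below.)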


Here are some of my thoughts:

One may suppose without loss of generality that $\mathcal{A}=\mathrm{span}(\mathcal{A})$ (i.e., $\mathcal{A}$ is a linear subspace of matrices), since $$ \mathrm{span}(\mathcal{A}^*\mathcal{A}) = \mathrm{span}\bigl((\mathrm{span}(\mathcal{A}))^*(\mathrm{span}(\mathcal{A}))\bigr). $$ (Edit: Note that each matrix in $\mathcal{A}^*\mathcal{A}$ is normal, since $A^*B\in\mathcal{A}^*\mathcal{A}$ implies $(A^*B)^*=B^*A\in\mathcal{A}^*\mathcal{A}$, and these matrices must commute.) Since $\mathcal{A}^*\mathcal{A}$ is a family of normal commuting matrices, there exists a unitary matrix $V$ such that $V^*A^*BV$ is a diagonal matrix for each $A,B\in\mathcal{A}$.

We may write each of the matrices $A_1,\dots,A_N$ in its polar decomposition as $$ A_i = U_i P_i $$ for some unitary matrices $U_1,\dots,U_N$ and positive semidefinite matrices $P_1,\dots,P_N$. Now the matrix $V^*A_i^*A_iV=V^*P_i^2V$ is diagonal for each $i$, and thus $V^*P_iV$ is diagonal for each $i$. One has that $$ (V^*P_1V)^2+ \cdots + (V^*P_NV)^2 = V^*(P_1^2+\cdots+P_N^2)V = V^*(A_1^*A_1+\cdots+A_N^*A_N)V=V^*V = I. $$ In particular, $P_1^2 + \cdots + P_N^2=I$. Since each matrix $V^*P_iV$ is diagonal with nonnegative entries and their squares sum to $I$, every diagonal position carries a strictly positive entry of at least one $V^*P_iV$, so $$ V^*\bigl(\sum_{i=1}^NP_i^2\bigr)V = I \quad\Rightarrow\quad V^*\bigl(\sum_{i=1}^NP_i\bigr)V >0, $$ hence $\sum_{i=1}^NP_i$ is positive definite and thus invertible.

But this is not quite what I want, because $\sum_{i=1}^N P_i$ need not lie in $\mathcal{A}$.
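To double-check this step numerically, here is a sketch on the same toy family as in the snippet above (repeated so that this is self-contained): the polar parts $P_i=(A_i^*A_i)^{1/2}$ are simultaneously diagonalised by $V$, satisfy $\sum_i P_i^2=I$, and their sum is positive definite. The family is again only an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 4, 3

def random_unitary(n):
    Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return np.linalg.qr(Z)[0]

def psd_sqrt(H):
    # positive semidefinite square root of a Hermitian PSD matrix via eigh
    w, Q = np.linalg.eigh(H)
    return Q @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ Q.conj().T

# Same illustrative family as before: A_i = U E_i P_i W.
U, W = random_unitary(n), random_unitary(n)
E = [np.diag(np.exp(1j * rng.uniform(0, 2 * np.pi, n))) for _ in range(N)]
G = rng.uniform(0.1, 1.0, (N, n))
G /= np.sqrt((G ** 2).sum(axis=0))
A = [U @ E[i] @ np.diag(G[i]) @ W for i in range(N)]

V = W.conj().T                                   # for this family, V^*(A_i^* A_j)V is diagonal
P = [psd_sqrt(Ai.conj().T @ Ai) for Ai in A]     # polar parts P_i

def offdiag(X):
    return np.linalg.norm(X - np.diag(np.diag(X)))

print(max(offdiag(V.conj().T @ Pi @ V) for Pi in P))           # V^* P_i V diagonal, ~1e-15
print(np.linalg.norm(sum(Pi @ Pi for Pi in P) - np.eye(n)))    # sum_i P_i^2 = I, ~1e-15
print(np.linalg.eigvalsh(sum(P)).min())                        # sum_i P_i is positive definite
```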

  • A possibly relevant result: if a subspace does not contain invertible matrices, then its dimension is at most $n^2 - n$. Commented Jan 7, 2020 at 8:05
  • A possibly useful observation: as you note, since $\mathcal{A}^*\mathcal{A}$ is a family of commuting matrices, there exists a unitary matrix $V$ such that $V^*A^*BV$ is a diagonal matrix for each $A,B\in\mathcal{A}$. So, by considering the set $$ \tilde {\mathcal A} = \mathcal A V = \{AV : A \in \mathcal A\}, $$ we can assume without loss of generality (by replacing $\mathcal A$ with $\tilde{\mathcal A}$) that $\mathcal A$ is a set for which $A^*B$ is diagonal for all matrices $A,B \in \mathcal A$. Commented Jan 7, 2020 at 8:16
  • The assumption that $\mathcal A^*\mathcal A$ is commuting is necessary. Otherwise, the space of matrices that only have non-zero entries in the first row would be a counterexample. Commented Jan 7, 2020 at 8:23
  • There is a mistake. In fact, i) there is a unitary $V$ s.t. the $V^*A^*BV$ are upper triangular. ii) There is a unitary $W$ s.t. the $W^*A^*AW$ are diagonal $\geq 0$. Beware, we may have $V\not= W$. – user91684 Commented Jan 7, 2020 at 10:48
  • @loupblanc I believe you have missed some assumptions. Because the set of matrices $\{A^*B\,:\, A,B\in\mathcal{A}\}$ is a commuting family, each matrix in this set is normal and they are simultaneously diagonalizable. Commented Jan 7, 2020 at 12:27

1 Answer


Since $A^\ast B$ commutes with $B^\ast A=(A^\ast B)^\ast$ for all $A,B\in\mathcal A$, the members of $\mathcal A^\ast\mathcal A$ are normal matrices. Therefore $\mathcal A^\ast\mathcal A$ is a family of commuting normal matrices, which can be simultaneously unitarily diagonalised, say by the unitary $V$. Replacing each $A_i$ by $A_iV$ affects neither the hypotheses nor the conclusion, so we may assume that $D_{ij}:=A_i^\ast A_j$ is a diagonal matrix for all $i,j\in\{1,2,\ldots,N\}$.

Let $\mathbf x=(x_1,\ldots,x_N),\mathbf y=(y_1,\ldots,y_N)\in\mathbb R^N$ and consider the diagonal matrix $$ S=\left(\sum_{i=1}^Nx_iA_i\right)^\ast\left(\sum_{j=1}^Ny_jA_j\right)=\sum_{i,j}x_iy_jD_{ij}. $$ Denote the real part of the $k$-th diagonal entry of $D_{ij}$ by $d_{ijk}$. The real part of the $k$-th diagonal entry of $S$ is then given by the polynomial function $f_k(\mathbf x,\mathbf y)=\sum_{i,j}x_iy_jd_{ijk}$. By assumption, $\sum_{i=1}^ND_{ii}=I$, and each $D_{ii}=A_i^\ast A_i$ has real nonnegative diagonal entries. Therefore, for each $k\in\{1,\ldots,n\}$, there is an index $i$ such that $d_{iik}>0$. Hence each $f_k$ is a non-constant polynomial with real coefficients, and so the union of the zero sets of $f_1,f_2,\ldots,f_n$ has Lebesgue measure zero in $\mathbb R^{2N}$. Thus there exists some $(\mathbf x,\mathbf y)$ such that $f_k(\mathbf x,\mathbf y)\ne0$ for every $k$. But then all diagonal entries of $S$ have nonzero real parts, meaning that the diagonal matrix $S$ is nonsingular. Hence $\sum_{i=1}^Nx_iA_i$ is nonsingular too, and it lies in $\mathrm{span}(\mathcal A)$.
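For what it's worth, a quick NumPy illustration of this construction. The family $A_i = U E_i P_i W$ below is only a hypothetical example satisfying the hypotheses (not the general case), and the random coefficients play the role of $(\mathbf x,\mathbf y)$ outside the measure-zero bad set:

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 4, 3

def random_unitary(n):
    Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return np.linalg.qr(Z)[0]

# Example family satisfying the hypotheses (an assumption for illustration).
U, W = random_unitary(n), random_unitary(n)
E = [np.diag(np.exp(1j * rng.uniform(0, 2 * np.pi, n))) for _ in range(N)]
G = rng.uniform(0.1, 1.0, (N, n))
G /= np.sqrt((G ** 2).sum(axis=0))
A = [U @ E[i] @ np.diag(G[i]) @ W for i in range(N)]

# Pass to the basis in which the D_ij = A_i^* A_j are diagonal: replace A_i by A_i V.
V = W.conj().T
B = [Ai @ V for Ai in A]

# Random real coefficients avoid the measure-zero zero sets of f_1, ..., f_n almost surely.
x, y = rng.standard_normal(N), rng.standard_normal(N)
S = sum(x[i] * B[i] for i in range(N)).conj().T @ sum(y[j] * B[j] for j in range(N))

print(np.allclose(S, np.diag(np.diag(S))))                    # S is diagonal
print(np.abs(np.real(np.diag(S))).min())                      # real parts of diag(S) are nonzero
M = sum(x[i] * A[i] for i in range(N))                        # an element of span(A)
print(np.linalg.svd(M, compute_uv=False).min())               # bounded away from 0: M is invertible
```

Since $\sum_i x_iB_i=\bigl(\sum_i x_iA_i\bigr)V$ with $V$ unitary, nonsingularity of one is equivalent to nonsingularity of the other.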

  • This is awesome! Commented Jan 7, 2020 at 14:09
