
I have the following prediction for rank-1 perturbations of diagonal matrices, but I don't know how to prove (or disprove) it.

Given $v:= [v_1,\dots,v_K] \in (0,1]^K$, we define $a:= \sum_{k=1}^K v_k = \mathbb{1}^\top v$ and $D:= \operatorname{diag}(av_1, \dots, av_K)$. It is easy to observe that the matrix $A:= D-vv^\top$ has the properties: (i) all off-diagonal entries are negative and (ii) each column sums to $0$. Furthermore, $A$ is a symmetric positive semidefinite matrix with eigendecomposition $A= U\Sigma U^\top$.
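
For concreteness, here is a minimal numerical sketch of this setup (random $v$, arbitrary seed; the variable names are mine and not essential):

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5
v = rng.uniform(0.01, 1.0, size=K)       # v_k in (0, 1]
a = v.sum()                              # a = 1^T v
A = np.diag(a * v) - np.outer(v, v)      # A = D - v v^T

off_diag = A[~np.eye(K, dtype=bool)]
print("(i)  off-diagonal entries negative:", np.all(off_diag < 0))
print("(ii) column sums are zero:", np.allclose(A.sum(axis=0), 0.0))
print("A is PSD:", np.linalg.eigvalsh(A).min() >= -1e-12)
```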

My claim: its square root, i.e. $Q:= U \Sigma^{1/2} U^\top$, also has properties (i) and (ii).

Property (ii) is not difficult to show: since $A\mathbb{1} = 0$, the vector $U^\top \mathbb{1}$ must lie in the kernel of $\Sigma$, which is equal to that of $\Sigma^{1/2}$, so $Q\mathbb{1} = 0$. But I don't know how to proceed with property (i).

Context: in my case $v_k = 1/n_k$ for some $n_k\in \mathbb{N}$, but I think all we need is $v_k \in (0,1]$. Also, what I ultimately want to show is that $P:= I-Q$ is a probability (stochastic) matrix. I have checked this claim by generating random values for the $v_k$'s, and it seems to hold.
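
For reference, this is roughly the kind of random check I ran (a minimal sketch; it only verifies properties (i) and (ii) for $Q$ and the row sums of $P$):

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.uniform(0.01, 1.0, size=5)            # random v in (0, 1]^K
A = np.diag(v.sum() * v) - np.outer(v, v)     # A = D - v v^T
w, U = np.linalg.eigh(A)                      # A = U diag(w) U^T
w = np.clip(w, 0.0, None)                     # clip tiny negative round-off
Q = U @ np.diag(np.sqrt(w)) @ U.T             # Q = U Sigma^{1/2} U^T
P = np.eye(len(v)) - Q

off_diag = Q[~np.eye(len(v), dtype=bool)]
print("(i)  off-diagonal entries of Q negative:", np.all(off_diag < 0))
print("(ii) column sums of Q are zero:", np.allclose(Q.sum(axis=0), 0.0))
print("row sums of P equal 1:", np.allclose(P.sum(axis=1), 1.0))
```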

Generalization: although I only need the statement for the square root, I think it holds more generally for any positive power, i.e. for $Q:= U \Sigma^{t} U^\top$ with any $t\geq 0$. However, to ensure that $I-Q$ is a probability matrix we perhaps need $t<1$.
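
The same check can be repeated for other powers, e.g. (a rough sweep reusing `U`, `w`, and `v` from the previous snippet):

```python
# reuses U, w, v computed in the previous snippet
for t in (0.1, 0.25, 0.5, 0.75, 1.0):          # other values of t can be tried as well
    Qt = U @ np.diag(w ** t) @ U.T             # Q = U Sigma^t U^T
    off_diag = Qt[~np.eye(len(v), dtype=bool)]
    print(f"t = {t}: (i) {np.all(off_diag < 0)}, (ii) {np.allclose(Qt.sum(axis=0), 0.0)}")
```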

  • I think what you want is that $P= I-\delta \cdot Q$ is a "probability matrix" (stochastic matrix) for $\delta \gt 0$ small enough. The reality is that $Q$ may have arbitrarily large eigenvalues, implying e.g. $\text{trace}\big(P\big)\lt 0$ if you just use $P= I- Q$, and you can't allow the diagonal of $P$ to have negative entries. Commented Apr 23 at 17:57
  • Yes, I think you are right; I "assumed" that the eigenvalues of $A$ are all smaller than $1$. In general, as you said, $I-Q$ is not a stochastic matrix; however, both properties I mentioned still hold.
    – abcxyzf
    Commented Apr 23 at 19:40

1 Answer


Some thoughts on Property (i) for $0 < t < 1$.

We use the identity (valid for $0 < t < 1$ and a symmetric positive semidefinite matrix $A$) $$A^t = \frac{\sin t\pi}{\pi} \int_0^\infty u^{t - 1} A (u I + A)^{-1}\,\mathrm{d} u.$$ (Note: this identity also appeared in a previous answer of mine; see the reference therein, or page 114 of Introduction to Matrix Analysis and Applications.)
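
(As a quick sanity check of this representation, not needed for the argument, one can compare the integral, computed entrywise by quadrature, against the eigendecomposition-based power; this is only a rough numerical sketch.)

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(1)
K, t = 4, 0.5
v = rng.uniform(0.1, 1.0, size=K)
A = np.diag(v.sum() * v) - np.outer(v, v)        # A = D - v v^T as in the question

def entry(i, j):
    f = lambda u: u ** (t - 1) * (A @ np.linalg.inv(u * np.eye(K) + A))[i, j]
    # split at u = 1 so quad treats the u^{t-1} endpoint singularity separately
    return quad(f, 0.0, 1.0)[0] + quad(f, 1.0, np.inf)[0]

A_t_int = np.sin(t * np.pi) / np.pi * np.array(
    [[entry(i, j) for j in range(K)] for i in range(K)])

w, U = np.linalg.eigh(A)
A_t_eig = U @ np.diag(np.clip(w, 0.0, None) ** t) @ U.T

print("max entrywise difference:", np.abs(A_t_int - A_t_eig).max())
```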

We can prove that $A (u I + A)^{-1}$ satisfies Property (i) for all $u > 0$. Indeed, we have \begin{align*} A (u I + A)^{-1} &= I - (I + u^{-1}A)^{-1}\\ &= I - (I + u^{-1}D - u^{-1}vv^\top)^{-1}\\ &= I - (I + u^{-1}D)^{-1} - \frac{(I + u^{-1}D)^{-1}u^{-1}vv^\top (I + u^{-1}D)^{-1}}{1 - u^{-1}v^\top (I + u^{-1}D)^{-1}v} \end{align*} where we use the Sherman–Morrison identity $$(B - xy^\top)^{-1} = B^{-1} + \frac{B^{-1}xy^\top B^{-1}}{1 - y^\top B^{-1}x}.$$ Here $(I + u^{-1}D)^{-1}$ is diagonal with positive entries, so the rank-one term has positive entries, and its denominator is positive because $$u^{-1}v^\top (I + u^{-1}D)^{-1}v = \sum_{k=1}^K \frac{v_k^2}{u + av_k} < \sum_{k=1}^K \frac{v_k^2}{av_k} = 1.$$ Hence the off-diagonal entries of $A (u I + A)^{-1}$ are negative for every $u > 0$, and since $\frac{\sin t\pi}{\pi} > 0$ for $0 < t < 1$, the integral representation above shows that $A^t$ inherits Property (i).
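
A short numerical check of this decomposition and of the sign pattern it implies (again only a sketch with arbitrary test values):

```python
import numpy as np

rng = np.random.default_rng(2)
K = 5
v = rng.uniform(0.1, 1.0, size=K)
D = np.diag(v.sum() * v)
A = D - np.outer(v, v)

for u in (0.01, 1.0, 100.0):
    M = A @ np.linalg.inv(u * np.eye(K) + A)
    Binv = np.linalg.inv(np.eye(K) + D / u)                  # (I + u^{-1} D)^{-1}
    rank1 = (Binv @ np.outer(v, v) @ Binv / u) / (1.0 - v @ Binv @ v / u)
    M_sm = np.eye(K) - Binv - rank1                          # Sherman-Morrison form
    print(f"u = {u}: matches: {np.allclose(M, M_sm)}, "
          f"off-diagonals negative: {np.all(M[~np.eye(K, dtype=bool)] < 0)}")
```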

  • Thanks for the answer! That's actually a nice identity.
    – abcxyzf
    Commented Apr 23 at 19:36
  • @abcxyzf You are welcome.
    – River Li
    Commented Apr 24 at 0:34
