
I know there are different definitions of a matrix norm, but I want to use the definition on Wolfram MathWorld; Wikipedia also gives a similar definition.

The definition reads as follows:

Given a square complex or real $n\times n$ matrix $A$, a matrix norm $\|A\|$ is a nonnegative number associated with $A$ having the properties

1. $\|A\|>0$ when $A\neq0$ and $\|A\|=0$ iff $A=0$,

2. $\|kA\|=|k|\|A\|$ for any scalar $k$,

3. $\|A+B\|\leq\|A\|+\|B\|$ for any $n \times n$ matrix $B$,

4. $\|AB\|\leq\|A\|\|B\|$.

Then, as the website states, we have $\|A\|\geq|\lambda|$, where $\lambda$ is any eigenvalue of $A$. I don't know how to prove this using just these four properties.
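For concreteness, here is a minimal numerical sanity check (assuming `numpy`; the Frobenius norm is just one familiar example of a norm satisfying all four properties):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
k = -2.5

def fro(M):
    return np.linalg.norm(M, "fro")   # Frobenius norm, one example of a matrix norm

assert fro(A) > 0                                  # 1: positivity for A != 0
assert np.isclose(fro(k * A), abs(k) * fro(A))     # 2: absolute homogeneity
assert fro(A + B) <= fro(A) + fro(B) + 1e-12       # 3: triangle inequality
assert fro(A @ B) <= fro(A) * fro(B) + 1e-12       # 4: submultiplicativity
```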

  • @mechanodroid Is it? Can you verify point 4? (The submultiplicative property of matrix norms) – Clement C., Jul 18, 2018 at 0:05
  • @ClementC. Sorry, my bad. – Jul 18, 2018 at 0:07
  • Nice question. +1 – TShiong, Feb 24, 2022 at 21:37

3 Answers


Suppose $v$ is an eigenvector of $A$ corresponding to $\lambda$. Form the "eigenmatrix" $B$ by putting $v$ in every column. Then $AB = \lambda B$. So, by properties $2$ and $4$ (and $1$, to make sure $\|B\| > 0$), $$|\lambda| \|B\| = \|\lambda B\| = \|AB\| \le \|A\| \|B\|.$$ Dividing by $\|B\| > 0$ gives $\|A\| \ge |\lambda|$ for every eigenvalue $\lambda$.
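To see the trick numerically, here is a minimal sketch (assuming `numpy`, and using the Frobenius norm as one concrete submultiplicative matrix norm):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

# Pick any eigenpair (lam, v) of A; eigenvalues/eigenvectors may be complex.
eigvals, eigvecs = np.linalg.eig(A)
lam, v = eigvals[0], eigvecs[:, 0]

# The "eigenmatrix" B: the eigenvector v repeated in every column.
B = np.tile(v.reshape(-1, 1), (1, A.shape[1]))
assert np.allclose(A @ B, lam * B)          # AB = lam * B, column by column

# Any submultiplicative norm then gives |lam| * ||B|| = ||AB|| <= ||A|| * ||B||.
fro = lambda M: np.linalg.norm(M, "fro")
assert abs(lam) * fro(B) <= fro(A) * fro(B) + 1e-10
assert abs(lam) <= fro(A) + 1e-10
```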

  • Nice trick. Thank you! – MathEric, Jul 18, 2018 at 0:12

Let $\|\cdot\|$ be a matrix norm.

It is known that the spectral radius $r(A) = \lim_{n\to\infty} \|A^n\|^{\frac1n}$ has the property $|\lambda| \le r(A)$ for all $\lambda\in \sigma(A)$.

Indeed, let $\lambda \in \mathbb{C}$ be such that $|\lambda| > r(A)$.

Then $I - \frac1{\lambda} A$ is invertible. Namely, check that the inverse is given by $\sum_{n=0}^\infty\frac1{\lambda^n}A^n$.

This series converges absolutely because $\frac1{|\lambda|}$ is less than the radius of convergence of the power series $\sum_{n=0}^\infty \|A^n\|x^n$, which is $\frac1{\limsup_{n\to\infty} \|A^n\|^{\frac1n}} = \frac1{r(A)}$.

Hence $$\lambda I - A = \lambda\left(I - \frac1{\lambda} A\right)$$

is also invertible, so $\lambda \notin \sigma(A)$.
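A minimal numerical sketch of this step (assuming `numpy`; the choice $|\lambda| = 2\,r(A)$ is just for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
I = np.eye(4)

r = np.abs(np.linalg.eigvals(A)).max()   # spectral radius r(A); > 0 for this random A
lam = 2.0 * r                            # any scalar with |lam| > r(A)
M = A / lam                              # then r(M) < 1, so the Neumann series converges

# Partial sums of sum_{n>=0} M^n, which should equal (I - M)^{-1}.
S, term = I.copy(), I.copy()
for _ in range(200):
    term = term @ M
    S = S + term

assert np.allclose(S @ (I - M), I, atol=1e-8)
# Hence I - A/lam is invertible, and so is lam*I - A = lam*(I - A/lam).
```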

Now, using submultiplicativity, we get $\|A^n\| \le \|A\|^n$, so

$$|\lambda| \le r(A) = \lim_{n\to\infty} \|A^n\|^{\frac1n} \le \lim_{n\to\infty} \|A\|^{n\cdot\frac1n} = \|A\|$$
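A quick numerical illustration of Gelfand's formula and the resulting bound (assuming `numpy`, with the Frobenius norm as an arbitrary choice of matrix norm):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
fro = lambda M: np.linalg.norm(M, "fro")        # any matrix norm would do

r = np.abs(np.linalg.eigvals(A)).max()           # spectral radius r(A)
gelfand = [fro(np.linalg.matrix_power(A, n)) ** (1 / n) for n in (1, 5, 20, 80)]

print("r(A)          :", r)
print("||A^n||^(1/n) :", gelfand)                # approaches r(A) as n grows
print("||A||         :", fro(A))                 # and ||A^n||^(1/n) <= ||A|| for each n
assert all(g <= fro(A) + 1e-10 for g in gelfand)
assert all(abs(l) <= fro(A) + 1e-10 for l in np.linalg.eigvals(A))
```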

  • I had a look at the Wikipedia page. It seems that the identity $r(A) = \lim_{n\to\infty} \|A^n\|^{\frac{1}{n}}$ holds for natural matrix norms, i.e. operator norms induced by norms on $\mathbb{R}^n$. I don't have a counterexample, but I suspect it doesn't hold in general. – Jul 18, 2018 at 0:27
  • @TheoBendit I used the abstract spectral radius defined in Banach algebras simply as $\lim_{n\to\infty} \|A^n\|^{\frac{1}{n}}$. Perhaps the name is not appropriate for matrices. Secondly, any two matrix norms are equivalent, so if you take a natural matrix norm $\|\cdot\|_1$ we have $m\|\cdot\|_1 \le \|\cdot\| \le M\|\cdot\|_1$, hence $$m^{1/n}\|A^n\|_1^{1/n} \le \|A^n\|^{1/n} \le M^{1/n}\|A^n\|_1^{1/n}.$$ Letting $n\to\infty$ gives $\lim_{n\to\infty} \|A^n\|_1^{\frac{1}{n}} = \lim_{n\to\infty} \|A^n\|^{\frac{1}{n}}$, so it should hold for every matrix norm. – Jul 18, 2018 at 0:32
  • @TheoBendit Actually it is there on the Wikipedia page under the name Gelfand's formula. I guess the only nontrivial thing in my answer is to show that the sequence $(\|A^n\|^{1/n})_n$ indeed converges. – Jul 18, 2018 at 0:36

Although this question was answered over five years ago, I will add an alternative explanation from a linear-transformation perspective.

Consider a vector-induced matrix norm $\Vert A \Vert = \sup_{\bar{x}\neq 0}\frac{\Vert A\bar{x}\Vert}{\Vert \bar{x} \Vert}$ of an $n\times n$ real matrix $A$ (this explanation also works if $A$ is complex, but I will leave that out for brevity).

Let $\bar{v}_1,\bar{v}_2,\dots,\bar{v}_n$ be the normalized eigenvectors of $A$, and $\lambda_1,\lambda_2,\dots,\lambda_n \in \mathbb R$ the corresponding eigenvalues (this assumes $A$ is diagonalizable, so that the eigenvectors span the whole space; the explanation also works if the $\lambda_i$ come in complex conjugate pairs, but I will leave that out for brevity).

Any unit vector $\bar{u}$ can then be written as $$\bar{u}=w_1 \bar{v}_1 +w_2\bar{v}_2+\dots+w_n\bar{v}_n,$$ where the $w_i\in \mathbb R$ are scaling factors (coordinates in the eigenvector basis).

Any non-zero vector $\bar{x}$ can be written as $\bar{x}=k\bar{u}$, where $k>0$ is real.

$\therefore$ $A\bar{x}=k(\lambda_1w_1\bar{v}_1 +\lambda_2w_2\bar{v}_2+\dots+\lambda_nw_n\bar{v}_n)$

Notice that $\Vert \bar{x} \Vert=k$ and $\Vert A\bar{x}\Vert=k\Vert \lambda_1w_1\bar{v}_1 +\lambda_2w_2\bar{v}_2+\dots+\lambda_nw_n\bar{v}_n \Vert$, so

$$\Vert A \Vert = \sup_{\bar{x}\neq 0}\frac{\Vert A\bar{x}\Vert}{\Vert \bar{x} \Vert}=\sup_{w_i} \Vert \lambda_1w_1\bar{v}_1 +\lambda_2w_2\bar{v}_2+\dots+\lambda_nw_n\bar{v}_n \Vert,$$

where the supremum is over all $w_1,\dots,w_n$ such that $\bar{u}$ is a unit vector.

Now, notice that we can always select a unit vector $\bar{u}$ (with suitable $w_1,w_2,\dots,w_n$) such that $$\Vert \lambda_1w_1\bar{v}_1 +\lambda_2w_2\bar{v}_2+\dots+\lambda_nw_n\bar{v}_n \Vert \ge \vert \lambda_{max} \vert,$$

where $\lambda_{max}$ is an eigenvalue of largest modulus, i.e. $\vert\lambda_{max}\vert=\max_i\vert\lambda_i\vert$: simply take $\bar{u}=\bar{v}_j$ (that is, $w_j=1$ and all other $w_i=0$) for an index $j$ with $\lambda_j=\lambda_{max}$.

$\therefore \Vert A \Vert = \sup_{\bar{x}\neq 0}\frac{\Vert A\bar{x}\Vert}{\Vert \bar{x} \Vert}=\sup_{w_i} \Vert \lambda_1w_1\bar{v}_1 +\lambda_2w_2\bar{v}_2+\dots+\lambda_nw_n\bar{v}_n \Vert \ge\vert \lambda_{max} \vert \ge \vert \lambda_i \vert$ for every $i$.
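A minimal numerical check of this argument for the induced $2$-norm (assuming `numpy`):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 5))

op_norm = np.linalg.norm(A, 2)      # induced (operator) 2-norm: sup of ||Ax|| / ||x||

# For each normalized eigenvector v, ||A v|| = |lambda| * ||v|| = |lambda|,
# so the supremum defining ||A|| is at least max_i |lambda_i|.
eigvals, eigvecs = np.linalg.eig(A)
for lam, v in zip(eigvals, eigvecs.T):
    v = v / np.linalg.norm(v)
    assert np.isclose(np.linalg.norm(A @ v), abs(lam))
    assert abs(lam) <= op_norm + 1e-10

print("max_i |lambda_i| =", np.abs(eigvals).max(), "<= ||A||_2 =", op_norm)
```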


