$\begingroup$

Let $T$ be a linear operator on the finite-dimensional space $V$. Let $\lambda_1,\dots,\lambda_k$ be the distinct characteristic values of $T$ and let $W_i$ be the space of characteristic vectors associated with the characteristic value $\lambda_i$. If $W=W_1+\dots+W_k$, then $\dim(W)=\dim(W_1)+\dots+\dim(W_k)$.
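
Not part of the theorem, but here is a quick numerical sanity check of the statement (a sketch assuming `numpy`/`scipy` are available; the matrix is just an illustrative choice of mine, not from the book): for a concrete operator, the sum of the eigenspace dimensions should equal the dimension of $W_1+\dots+W_k$.

```python
import numpy as np
from scipy.linalg import null_space

# Illustrative operator: eigenvalues 2 (twice), 3 and 5.
T = np.diag([2., 2., 3., 5.])

# Distinct characteristic values of T.
eigvals = np.unique(np.round(np.linalg.eigvals(T).real, 8))

# W_i is the null space of (T - lambda_i I); null_space returns a basis as columns.
bases = [null_space(T - lam * np.eye(4)) for lam in eigvals]

sum_of_dims = sum(B.shape[1] for B in bases)            # dim W_1 + ... + dim W_k
dim_of_sum = np.linalg.matrix_rank(np.hstack(bases))    # dim(W_1 + ... + W_k)
assert sum_of_dims == dim_of_sum == 4
```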

Approach (1): We use mathematical induction. For each $j\in J_k$, let $P(j)$ be the statement $\dim(W_1+\dots+W_j)=\dim(W_1)+\dots+\dim(W_j)$.

Base case ($j=2$): Since $W_i=N_{T-\lambda_i I}$, each $W_i$ is a subspace of $V$. By theorem 6, section 2.3, $\dim(W_1)+\dim(W_2)=\dim(W_1\cap W_2)+\dim(W_1+W_2)$. It is easy to check that $\dim(W_1\cap W_2)=0\iff W_1\cap W_2=\{0_V\}$. Let $x\in W_1\cap W_2$. Then $T(x)=\lambda_1\cdot x=\lambda_2\cdot x$, so by the distributive law $(\lambda_1-\lambda_2)\cdot x=0_V$, which implies $\lambda_1-\lambda_2=0_F$ or $x=0_V$. Since $\lambda_1\neq\lambda_2$, we have $x=0_V$. Thus $W_1\cap W_2=\{0_V\}$, so $\dim(W_1\cap W_2)=0$ and hence $\dim(W_1)+\dim(W_2)=\dim(W_1+W_2)$.

Inductive step: Let $S_j=\sum_{i=1}^j W_i$ and suppose $\dim(S_j)=\dim(W_1)+\dots+\dim(W_j)$ for some $2\leq j\lt k$. Since $S_j=\operatorname{span}(\bigcup_{i=1}^j W_i)$, $S_j$ is a subspace of $V$. By theorem 6, section 2.3, $\dim(S_j)+\dim(W_{j+1})=\dim(S_j\cap W_{j+1})+\dim(S_j+W_{j+1})$, so $\dim(W_1)+\dots+\dim(W_j)+\dim(W_{j+1})=\dim(S_j\cap W_{j+1})+\dim(S_{j+1})$. We need to show $S_j\cap W_{j+1}=\{0_V\}$.

Let $x\in S_j\cap W_{j+1}$. Then $x\in S_j$ and $x\in W_{j+1}$, so $x=x_1+\dots+x_j$ for some $x_i\in W_i$. Since $T$ is a linear map, $T(x)=T(x_1+\dots+x_j)=T(x_1)+\dots+T(x_j)=\lambda_1\cdot x_1+\dots+\lambda_j\cdot x_j=\lambda_{j+1}\cdot x$. So $\lambda_1\cdot x_1+\dots+\lambda_j\cdot x_j=\lambda_{j+1}\cdot x_1+\dots+\lambda_{j+1}\cdot x_j$, and by the distributive law $(\lambda_1-\lambda_{j+1})\cdot x_1+\dots+(\lambda_j-\lambda_{j+1})\cdot x_j=0_V$. We claim $x_i=0_V$ for all $1\leq i\leq j$. Let $A=\{i\in J_j\mid x_i=0_V\}$ and $B=\{i\in J_j\mid x_i\neq 0_V\}$. Assume towards a contradiction that there exists $p\in J_j$ such that $x_p\neq 0_V$, i.e. $p\in B$. Then $\sum_{i=1}^j(\lambda_i-\lambda_{j+1})\cdot x_i=\sum_{i\in B}(\lambda_i-\lambda_{j+1})\cdot x_i=0_V$. For every $i\in B$, $x_i$ is an eigenvector of $\lambda_i$. It is a standard result that if $\lambda_1,\dots,\lambda_m$ are distinct eigenvalues of $T$ and $v_1,\dots,v_m$ are corresponding eigenvectors, then $\{v_1,\dots,v_m\}$ is linearly independent. So $\{x_i\mid i\in B\}$ is independent, which implies $\lambda_i-\lambda_{j+1}=0_F$ for all $i\in B$; in particular $\lambda_p=\lambda_{j+1}$, contradicting that the $\lambda_i$ are distinct. Hence $x_i=0_V$ for all $1\leq i\leq j$, so $x=x_1+\dots+x_j=0_V$. Thus $S_j\cap W_{j+1}=\{0_V\}$, equivalently $\dim(S_j\cap W_{j+1})=0$, and hence $\dim(W_1)+\dots+\dim(W_j)+\dim(W_{j+1})=\dim(S_{j+1})$.

By the principle of mathematical induction, $P(j)$ holds for all $j\in J_k$. Is my proof correct?
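
As an aside, the base-case identity from theorem 6, section 2.3 can be checked numerically on a concrete example (a sketch assuming `numpy`/`scipy`; the matrix and the nullity trick for computing $\dim(W_1\cap W_2)$ are my own illustration, not from the book):

```python
import numpy as np
from scipy.linalg import null_space

# An illustrative operator: the eigenvalue 2 has geometric multiplicity 1
# here even though it appears twice on the diagonal.
T = np.array([[2., 1., 0.],
              [0., 2., 0.],
              [0., 0., 3.]])

W1 = null_space(T - 2 * np.eye(3))     # eigenspace of lambda_1 = 2
W2 = null_space(T - 3 * np.eye(3))     # eigenspace of lambda_2 = 3

# x lies in W1 ∩ W2 iff x = W1 @ a = W2 @ b, i.e. (a, b) is in the null space
# of [W1 | -W2]; since each basis matrix has independent columns, that null
# space has the same dimension as W1 ∩ W2.
dim_intersection = null_space(np.hstack([W1, -W2])).shape[1]
dim_sum = np.linalg.matrix_rank(np.hstack([W1, W2]))

assert dim_intersection == 0                                       # W1 ∩ W2 = {0_V}
assert W1.shape[1] + W2.shape[1] == dim_intersection + dim_sum     # theorem 6, section 2.3
```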

Potential Approach (2): Since $\dim(V)=n\in\Bbb{N}$, we have $\dim(W_i)=r_i\leq n$. Let $B_i=\{\alpha_{i1},\dots,\alpha_{ir_i}\}$ be a basis of $W_i$, for each $i\in J_k$. It is easy to check that $W_i\cap W_j=\{0_V\}$ if $i\neq j$, so $B_i\cap B_j=\emptyset$ if $i\neq j$ and $|\bigcup_{i=1}^k B_i|=|B_1|+\dots+|B_k|=r_1+\dots+r_k$. We claim $\bigcup_{i=1}^k B_i$ is a basis of $W=W_1+\dots+W_k$.

First we show $\operatorname{span}(\bigcup_{i=1}^k B_i)=W$. Let $x\in W$. Then there exist $x_i\in W_i$ such that $x=x_1+\dots+x_k$. Since $\operatorname{span}(B_i)=W_i$, we have $x_i=\sum_{j=1}^{r_i}b_{ij}\cdot\alpha_{ij}$ for each $i\in J_k$. So $x=\sum_{j=1}^{r_1}b_{1j}\cdot\alpha_{1j}+\dots+\sum_{j=1}^{r_k}b_{kj}\cdot\alpha_{kj}\in\operatorname{span}(\bigcup_{i=1}^k B_i)$. Thus $\operatorname{span}(\bigcup_{i=1}^k B_i)=W$.

Next we show $\bigcup_{i=1}^k B_i$ is linearly independent. Suppose $\sum_{j=1}^{r_1}c_{1j}\cdot\alpha_{1j}+\dots+\sum_{j=1}^{r_k}c_{kj}\cdot\alpha_{kj}=0_V$ for some $c_{ij}\in F$. Let $x_i=\sum_{j=1}^{r_i}c_{ij}\cdot\alpha_{ij}\in\operatorname{span}(B_i)=W_i$. Then $x_1+\dots+x_k=0_V$. We claim $x_i=0_V$ for all $i\in J_k$. Let $A=\{i\in J_k\mid x_i=0_V\}$ and $B=\{i\in J_k\mid x_i\neq 0_V\}$, so $\sum_{i=1}^k x_i=\sum_{i\in B}x_i=0_V$. Since $T$ is a linear map, $T(\sum_{i\in B}x_i)=\sum_{i\in B}T(x_i)=\sum_{i\in B}\lambda_i\cdot x_i=0_V$. For every $i\in B$, $x_i$ is an eigenvector of $\lambda_i$. Lemma: if $\lambda_1,\dots,\lambda_m$ are distinct eigenvalues of $T$ and $v_1,\dots,v_m$ are corresponding eigenvectors, then $\{v_1,\dots,v_m\}$ is linearly independent. So $\{x_i\mid i\in B\}$ is independent, which implies $\lambda_i=0_F$ for all $i\in B$. Since $\lambda_i\neq\lambda_j$ if $i\neq j$, we have $|B|=0$ or $|B|=1$. If $|B|=0$, then we're done. If $|B|=1$, then there exists $p\in J_k$ such that $x_p\neq 0_V$. How to progress from here?


My approach (2) is a partial proof; I didn't know to use the polynomial technique. Hoffman's proof: Suppose $\sum_{j=1}^{r_1}c_{1j}\cdot\alpha_{1j}+\dots+\sum_{j=1}^{r_k}c_{kj}\cdot\alpha_{kj}=0_V$ for some $c_{ij}\in F$. Let $f\in F[x]$. Since $f(T)\in L(V,V)$, we have $0_V=[f(T)](\sum_{j=1}^{r_1}c_{1j}\cdot\alpha_{1j}+\dots+\sum_{j=1}^{r_k}c_{kj}\cdot\alpha_{kj})=\sum_{j=1}^{r_1}c_{1j}\cdot[f(T)](\alpha_{1j})+\dots+\sum_{j=1}^{r_k}c_{kj}\cdot[f(T)](\alpha_{kj})$. By lemma 2, section 6.2, $[f(T)](\alpha_{ij})=f(\lambda_i)\cdot\alpha_{ij}$. So $\sum_{j=1}^{r_1}c_{1j}\cdot[f(T)](\alpha_{1j})+\dots+\sum_{j=1}^{r_k}c_{kj}\cdot[f(T)](\alpha_{kj})=\sum_{j=1}^{r_1}c_{1j}\cdot f(\lambda_1)\cdot\alpha_{1j}+\dots+\sum_{j=1}^{r_k}c_{kj}\cdot f(\lambda_k)\cdot\alpha_{kj}=0_V$ for all $f\in F[x]$.

Define $f_i=\prod_{j\in J_k-\{i\}}(x-\lambda_j)$ for each $i\in J_k$. Then $\sum_{j=1}^{r_1}c_{1j}\cdot f_i(\lambda_1)\cdot\alpha_{1j}+\dots+\sum_{j=1}^{r_k}c_{kj}\cdot f_i(\lambda_k)\cdot\alpha_{kj}=\sum_{j=1}^{r_i}c_{ij}\cdot f_i(\lambda_i)\cdot\alpha_{ij}=0_V$. Since $B_i$ is independent, we have $c_{ij}\cdot f_i(\lambda_i)=0_F$ for all $1\leq j\leq r_i$, which implies $c_{ij}=0_F$ or $f_i(\lambda_i)=0_F$. Since $f_i(\lambda_i)\neq 0_F$, we have $c_{ij}=0_F$ for all $1\leq j\leq r_i$. Since $i$ was arbitrary, $c_{ij}=0_F$ for all $i\in J_k$. Hence $\bigcup_{i=1}^k B_i$ is independent.
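
To see the polynomial trick in action, here is a small numerical illustration (a sketch assuming `numpy`; the matrix and vector are illustrative choices, not from Hoffman): $f_i(T)$ annihilates eigenvectors of $\lambda_j$ for $j\neq i$ and scales eigenvectors of $\lambda_i$ by $f_i(\lambda_i)\neq 0_F$, which is exactly why only the $i$-th block of coefficients survives.

```python
import numpy as np
from functools import reduce

lams = [2., 3., 5.]                    # distinct eigenvalues of the example below
T = np.diag([2., 2., 3., 5.])
I = np.eye(4)

def f_of_T(i):
    """Evaluate f_i(T) = prod over j != i of (T - lambda_j I) as a matrix."""
    return reduce(np.matmul, [T - lam * I for j, lam in enumerate(lams) if j != i])

alpha = np.array([1., 1., 0., 0.])     # an eigenvector for the eigenvalue 2

f0_at_2 = np.prod([2. - 3., 2. - 5.])  # f_0(2) = (2-3)(2-5) = 3 != 0
assert np.allclose(f_of_T(0) @ alpha, f0_at_2 * alpha)   # scaled by f_0(2)
assert np.allclose(f_of_T(1) @ alpha, 0)                 # annihilated: f_1 has the factor (x - 2)
```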

Edit: We can make potential approach (2) work. The claim "$x_1+\dots+x_k=0_V\implies x_i=0_V$ for all $i\in J_k$" is the same as saying the sum is direct, i.e. $W=W_1\oplus \dotsb \oplus W_k$. Here is a proof of $W_1\oplus \dotsb \oplus W_k$ using mathematical induction. My proof of $\oplus_{i=1}^k W_i$ below (in a comment) is incorrect.

$\endgroup$
  • $\begingroup$ Fun fact: my first draft of this post was "potential approach (1)" and "can we use induction to prove this theorem?". After lots of drafts (trying to make the proof work), I eventually reached the desired result. It took me a long time and I didn't give up. I made use of one lemma which is not in Hoffman's book. I would really appreciate it if you checked my approach (1) and gave feedback on approach (2). I know it is long. $\endgroup$
    – user264745
    Commented Nov 6, 2022 at 14:15
  • $\begingroup$ The spaces $W_\bullet$ are linearly disjoint so $W=\bigoplus_i W_i$ and the dimension-counting follows. $\endgroup$
    – FShrike
    Commented Nov 6, 2022 at 16:25
  • $\begingroup$ @FShrike What's the definition of linearly disjoint? I assume the $W_i$'s are subspaces of $V$ that are pairwise disjoint (apart from $0_V$), which implies $W$ is the direct sum of $W_1,\dots,W_k$. $\endgroup$
    – user264745
    Commented Nov 6, 2022 at 17:52
  • 1
    $\begingroup$ @user264745 You can also prove that $W_1+\dots+W_k$ is a direct sum by using the theorem that eigenvectors corresponding to distinct eigenvalues are linearly independent. Then the only sum in $W_1+\dots+W_k$ that equals the additive identity of $V$ is $\underbrace{0+\dots+0}_\text{k times}$. $\endgroup$
    – Seeker
    Commented Nov 6, 2022 at 20:24
  • 1
    $\begingroup$ @Seeker Yes, I'm using the "eigenvectors corresponding to distinct eigenvalues are linearly independent" theorem; I'm just calling it a lemma. $\endgroup$
    – user264745
    Commented Nov 6, 2022 at 20:33

1 Answer

$\begingroup$

Suppose $$v_1+\dots+v_k=0$$ for $v_j\in W_j$. Because eigenvectors corresponding to distinct eigenvalues are linearly independent, we have that $v_1=\cdots =v_k=0$.

Thus, $W_1+\dots+W_k$ is a direct sum.
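
For what it's worth, here is a minimal numerical illustration of what the direct sum buys us (a sketch assuming `numpy`/`scipy`, with an illustrative matrix of my own choosing): since the union of eigenspace bases is linearly independent, every vector of $W_1+\dots+W_k$ decomposes uniquely into components $v_i\in W_i$.

```python
import numpy as np
from scipy.linalg import null_space

T = np.diag([2., 2., 3., 5.])                               # illustrative operator
bases = [null_space(T - lam * np.eye(4)) for lam in (2., 3., 5.)]
B = np.hstack(bases)                     # columns: union of the eigenspace bases

v = np.array([1., 2., 3., 4.])           # here W_1 + W_2 + W_3 = V, so any v works
coeffs = np.linalg.solve(B, v)           # unique coordinates, since B is invertible

# Split the coordinates back into the components v_i in W_i.
splits = np.cumsum([b.shape[1] for b in bases])[:-1]
components = [b @ c for b, c in zip(bases, np.split(coeffs, splits))]
assert np.allclose(sum(components), v)   # v = v_1 + v_2 + v_3 with v_i in W_i
```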

$\endgroup$
  • $\begingroup$ Thank you for the answer. I have the following questions: (1) Are you sure the vectors $v_1,\dots,v_k$ also form a basis of $W_1+\dots+W_k$? I think that is not correct. (2) Let $v\in W_1+\dots+W_k$. Then $v=x_1+\dots+x_k$ for some $x_i\in W_i$. You wrote $v=a_1v_1+\dots+a_kv_k$; I assume you think $v_i$ spans $W_i$, but that is not true in general. $\endgroup$
    – user264745
    Commented Nov 23, 2022 at 9:00
  • 1
    $\begingroup$ @user264745 I just realised a much simpler proof. Have a look at that. You were right about your questions. I was wrong about those two questions. $\endgroup$
    – Seeker
    Commented Nov 23, 2022 at 9:10
  • 1
    $\begingroup$ I think that, after using the theorem, we conclude $1_F=0_F$, not $v_j=0$ for all $j$. We can't get $v_j=0$ because at the beginning of the proof we assume $v_j\neq 0$. $\endgroup$
    – user264745
    Commented Nov 23, 2022 at 9:40
  • 1
    $\begingroup$ @user264745 You make a good point. I wrote up the proof in a hurry and didn't think too much about it. $\endgroup$
    – Seeker
    Commented Nov 23, 2022 at 9:42
  • 1
    $\begingroup$ My points are not that good. I also sometimes write a proof in a hurry and later realize it contains errors. IMO writing a proof (right or wrong) is much better than doing nothing. If you read my potential approach (2), you'll find that the place where I got stuck in that proof is essentially the same as the last step of $W_1\oplus \dotsb \oplus W_k$. $\endgroup$
    – user264745
    Commented Nov 23, 2022 at 10:08
