
Let $T:V \rightarrow V$ be a linear transformation where $V$ is a finite dimensional vector space. Let $W \subset V$ be such that $T(W) \subset W$. If $v_1, v_2, \ldots ,v_n \in V$ are eigenvectors corresponding to distinct eigenvalues $\lambda_1, \lambda_2, \ldots ,\lambda_n$ of $T$ and $v_1 + v_2 + \ldots + v_n \in W$, prove that each $v_i \in W$.

I could prove that all the above eigenvectors are linearly independent and that at least one $v_i$ lies in $W$, but how do I show that every $v_i \in W$? Any hint is appreciated.

Note: Even though this has a homework tag, I should stress that this isn't my homework. It's been a few years since I last took a university course. I found this question while going through some (really) old papers in my desk!

  • Let $w = v_1 + v_2 + \dotsb + v_n$. Look at $w,\, Tw,\, T^2 w,\, \dotsc,\, T^{n-1}w$. Look at the Vandermonde matrix. Commented Feb 20, 2014 at 19:32
  • Thank you for the hint. Unfortunately, I still cannot seem to get it! Could you please elaborate? – crypton480 Commented Feb 20, 2014 at 21:32
  • Stronger hint: $T^k w = \sum_i \lambda_i^k v_i$. Consider the polynomials that are the kernel functions of the Lagrange interpolation (see the sketch below). Commented Feb 20, 2014 at 21:37
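A minimal sketch of the route the two hints suggest, assuming only the standard Lagrange basis polynomials: since $w = v_1 + \dotsb + v_n \in W$ and $T(W) \subset W$, every power of $T$ applied to $w$ stays in $W$:

$$T^k w = \sum_{i=1}^{n} \lambda_i^{k} v_i \in W, \qquad k = 0, 1, \dots, n-1.$$

The coefficient matrix $(\lambda_i^{k})_{k,i}$ is a Vandermonde matrix, invertible because the $\lambda_i$ are distinct, so each $v_i$ is a linear combination of the vectors $T^k w \in W$. Equivalently, the Lagrange basis polynomial $p_i(x) = \prod_{j \ne i} \frac{x - \lambda_j}{\lambda_i - \lambda_j}$ satisfies $p_i(\lambda_j) = \delta_{ij}$, hence

$$v_i = \sum_{j=1}^{n} p_i(\lambda_j)\, v_j = p_i(T)\, w \in W.$$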

1 Answer


This assertion can be proved quite simply by induction on the number of eigenvectors $n$. The case $n = 1$ is trivial: if $v_1 \in W$, then $v_1 \in W$.

Consider next the case of two eigenvectors $v_1, v_2$. We have $v_1 + v_2 \in W$, and since $T(W) \subset W$, also $T(v_1 + v_2) \in W$. But

$$T(v_1 + v_2) = Tv_1 + Tv_2 = \lambda_1 v_1 + \lambda_2 v_2,$$

where $\lambda_i$ is the eigenvalue corresponding to $v_i$ for $i = 1, 2$; thus $\lambda_1 v_1 + \lambda_2 v_2 \in W$. Since $W$ is a subspace, we also have $\lambda_1 v_1 + \lambda_1 v_2 = \lambda_1(v_1 + v_2) \in W$. From these observations,

$$(\lambda_2 - \lambda_1)v_2 = (\lambda_1 v_1 + \lambda_2 v_2) - \lambda_1(v_1 + v_2) \in W.$$

Since $\lambda_2 \ne \lambda_1$ and $W$ is a subspace, $v_2 \in W$, whence $v_1 = (v_1 + v_2) - v_2 \in W$ also. So $v_1, v_2 \in W$.

These arguments establish the cases $n = 1, 2$ as the base of the induction. Now suppose the result holds for any collection of $k$ eigenvectors of $T$ with distinct eigenvalues, and let $v_1, \dots, v_{k+1}$ be eigenvectors, $Tv_i = \lambda_i v_i$ with $\lambda_i \ne \lambda_j$ for $i \ne j$, such that $\sum_1^{k+1} v_i \in W$. Then

$$T\Big(\sum_1^{k+1} v_i\Big) = \sum_1^{k+1} \lambda_i v_i \in W \quad \text{and} \quad \lambda_{k+1}\sum_1^{k+1} v_i = \sum_1^{k+1} \lambda_{k+1} v_i \in W;$$

thus

$$\sum_1^{k} (\lambda_i - \lambda_{k+1}) v_i = \sum_1^{k+1} \lambda_i v_i - \sum_1^{k+1} \lambda_{k+1} v_i \in W.$$

But the vectors $(\lambda_i - \lambda_{k+1}) v_i$, $1 \le i \le k$, are nonzero (since $\lambda_i \ne \lambda_{k+1}$ and $v_i \ne 0$) and are themselves eigenvectors of $T$ with the distinct eigenvalues $\lambda_i$, because $T\big((\lambda_i - \lambda_{k+1}) v_i\big) = \lambda_i (\lambda_i - \lambda_{k+1}) v_i$. By the inductive hypothesis, $(\lambda_i - \lambda_{k+1}) v_i \in W$ for $1 \le i \le k$; since each $\lambda_i - \lambda_{k+1} \ne 0$, these $v_i \in W$, and finally $v_{k+1} = \sum_1^{k+1} v_i - \sum_1^{k} v_i \in W$ as well. The induction, and hence the proof, is complete. QED.
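As a quick numerical sanity check of the conclusion (a sketch, not part of the original answer: the random diagonalizable $T$ and the names below are purely illustrative), the following NumPy snippet verifies that each $v_i$ lies in $\operatorname{span}\{w, Tw, \dots, T^{n-1}w\}$, which is contained in any $T$-invariant subspace containing $w$:

```python
import numpy as np

# Sanity check: for a diagonalizable T with distinct eigenvalues and
# w = v_1 + ... + v_n, each v_i is a linear combination of the Krylov
# vectors w, Tw, ..., T^{n-1} w.

rng = np.random.default_rng(0)
n = 4
lam = np.array([1.0, 2.0, -3.0, 0.5])      # distinct eigenvalues
P = rng.normal(size=(n, n))                # columns are the eigenvectors v_i
T = P @ np.diag(lam) @ np.linalg.inv(P)    # so that T v_i = lam_i * v_i
w = P.sum(axis=1)                          # w = v_1 + ... + v_n

# Krylov vectors: column k is T^k w = sum_i lam_i**k * v_i
K = np.column_stack([np.linalg.matrix_power(T, k) @ w for k in range(n)])

# K = P @ M, where M[i, k] = lam_i**k is an invertible Vandermonde matrix
M = np.vander(lam, n, increasing=True)

# Hence P = K @ M^{-1}: every v_i is a combination of the T^k w
print(np.allclose(K @ np.linalg.inv(M), P))   # expect: True
```

Here $K M^{-1} = P$ recovers the eigenvector matrix, which is exactly the Vandermonde argument from the comments in matrix form.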

Hope this helps. Cheers,

and as always,

Fiat Lux!!!

  • YYYESSSSSSSSSSSS!!! Commented Feb 21, 2014 at 19:50
