In trying to understand why $\sum\limits_{k=1}^{\infty} \frac{1}{2^k}$ converges but $\sum\limits_{k=1}^{\infty} \frac{1}{2k}$ doesn't, I noticed that in infinite series of the type $\sum\limits_{n=1}^{\infty} \frac{1}{k^n}$ where $k \geq 2$, any term is greater than the sum of any finite number of subsequent terms. (For $k = 2$ the full tail $\sum_{m > n} 2^{-m}$ equals the term $2^{-n}$ exactly, so any finite piece of it is strictly smaller; for $1 < k < 2$ the property actually fails, since the tail exceeds the term.)
In the series $\sum\limits_{k=1}^{\infty} \frac{1}{2k}$, by contrast, for any term it is always possible to find finitely many subsequent terms whose sum exceeds that term.
So I'm wondering: is there anything to this? Is this property the difference between a series that converges and one that diverges?