
Cross-posted on Scientific Computing Stack Exchange

Are there real-world applications that call specifically for eigenvalues rather than singular values?

The top eigenvalue is useful for establishing convergence, but what about the rest?

I often see eigendecomposition used as a "poor man's SVD". For instance, it's used in MATLAB's Lyapunov solver, but that could be reformulated in terms of the SVD at greater cost ($22n^3$ instead of $9n^3$; Higham's big six), while gaining numerical stability. Similarly, PCA can be done using the SVD.
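
As a sketch of that last point, the principal-component variances from an eigendecomposition of the covariance matrix match the scaled squared singular values of the centered data. A minimal illustration with NumPy (the data here is random, purely for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Xc = X - X.mean(axis=0)                  # center the data

# Route 1: eigendecomposition of the covariance matrix
cov = Xc.T @ Xc / (len(X) - 1)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]   # descending

# Route 2: SVD of the centered data matrix
s = np.linalg.svd(Xc, compute_uv=False)            # descending
var_from_svd = s**2 / (len(X) - 1)

# Both routes give the same principal-component variances
print(np.allclose(eigvals, var_from_svd))  # True
```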

Picture below: the two linear transformations shown have the same eigenvalues:

[two figures omitted: linear transformations with identical eigenvalues]

Notebook

  • Can’t say I’ve *ever* heard of eigenvalue decomposition being referred to as the poor man’s SVD. I guess I need to get out more! Commented Dec 18, 2023 at 22:50
  • There are DSP applications, such as blind source separation, state-space system identification, pole placement in closed-loop feedback control, eigenmode extraction in vibrational analysis, beamforming... and the list goes on. – V.S.e.H. Commented Dec 19, 2023 at 21:42

4 Answers


One of the most important and widely used applications of eigenvalues specifically, as opposed to singular values, comes from dynamical systems. Consider a linear ODE $$\dot{x} = Ax,$$ where $A$ is a diagonalizable $n\times n$ matrix. We then write its eigendecomposition and SVD as $A = P\Lambda P^{-1}$ and $A = U\Sigma V^*$. If we define $y = P^{-1}x$, then we have $$ \begin{aligned} \dot{x} = P\dot{y},&\quad Ax = P\Lambda y \\ \implies \dot{y} &= \Lambda y. \end{aligned} $$ Since $\Lambda$ is diagonal, this is easily solvable and, furthermore, we can determine whether the solution is growing, decaying, oscillating, etc. from the complex phase of the eigenvalues. This is very difficult to determine from the singular value decomposition, as all of the phase information is contained in the unitary matrices $U$ and $V$. Notice that trying to apply the same trick with the SVD won't work, as the trick relies on $A$ being similar to the diagonal matrix $\Lambda$, whereas the SVD does not provide a similarity transformation.
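
A minimal numerical illustration of this point (the damped-oscillator matrix below is just an example): the eigenvalues reveal decay and oscillation, while the singular values, being real and nonnegative by construction, cannot:

```python
import numpy as np

# Damped oscillator x'' = -x - 0.2 x', written as a first-order system x' = Ax
A = np.array([[ 0.0,  1.0],
              [-1.0, -0.2]])

eigvals = np.linalg.eigvals(A)               # complex: -0.1 +/- 0.995j
svals = np.linalg.svd(A, compute_uv=False)   # real and positive

# Negative real parts => every solution of x' = Ax decays to zero,
# and the nonzero imaginary parts => it oscillates while decaying.
print(np.all(eigvals.real < 0))   # True

# The singular values carry no phase/sign information about growth vs. decay.
print(svals)
```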

This is used all the time in physics and engineering applications, as eigenvalues with real part greater than zero imply that something will grow exponentially, such as spatial oscillations of a bridge or the reaction rate in a nuclear reactor.

  • Interesting. You need the spectral radius to tell whether a discrete linear system is divergent, but using the full eigendecomposition seems like overkill. I'm thinking about the underlying issue of numerical instability: the spectral radius is numerically stable, but individual eigenvalues can be unstable, which means you wouldn't be able to compute them accurately. Do you need the actual numerical values of the eigenvalues, or is it enough to know their sign/realness plus the spectral radius? Commented Dec 18, 2023 at 22:56
  • It just depends on the size of your system and the application of interest. Eigenvalues are typically stable to compute as long as the matrix is diagonalizable. The eigendecomposition is nice for seeing which components will be growing/decaying/oscillating. In many physical problems, the operators you work with are approximations of infinite-dimensional operators, and the eigenfunctions correspond to spatial modes that you will actually see in your experiments. Moreover, some systems (like the Boltzmann equation in reactor physics) are most naturally posed as eigenvalue problems. – whpowell96 Commented Dec 18, 2023 at 23:19

The eigenvalues of partial differential operators describing mechanical or electromagnetic systems are related to the resonance frequencies. For example, the frequencies at which a drum or guitar or string instrument vibrates are the square roots of the eigenvalues of the Laplace operator. The frequencies at which a building or bridge sways are the square roots of the eigenvalues of the linear elasticity operator. The frequencies at which an electromagnetic cavity (say, in your microwave oven, or in the particle accelerators used for medical cancer therapy devices) oscillates are the square roots of the eigenvalues of the Maxwell operator. There are many practical applications in which knowing these resonance frequencies is important, typically because you want a device/instrument/building to have, or not to have, specific resonance frequencies.

In order to compute the eigenvalues of these operators, you "discretize" them to obtain a finite-dimensional matrix, and then you compute the eigenvalues of this matrix. In many cases, these matrices have sizes ranging in the hundreds of thousands to the hundreds of millions.
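
As a small illustrative sketch of that discretization step (not a production discretization), a second-order finite-difference approximation of the Dirichlet Laplacian $-d^2/dx^2$ on $[0,1]$ recovers the exact resonance frequencies $k\pi$:

```python
import numpy as np

# Dirichlet Laplacian -d^2/dx^2 on [0, 1], second-order finite differences
n = 200                                  # interior grid points
h = 1.0 / (n + 1)
main = np.full(n, 2.0 / h**2)
off = np.full(n - 1, -1.0 / h**2)
L = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

eigvals = np.linalg.eigvalsh(L)          # ascending
freqs = np.sqrt(eigvals)                 # resonance frequencies

# Exact eigenvalues are (k*pi)^2, so the lowest frequencies are ~ k*pi
print(freqs[:3] / np.pi)                 # close to [1, 2, 3]
```

In real applications the matrix is sparse and only a few of the smallest eigenvalues are computed iteratively, rather than forming a dense matrix as above.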

  • The resonance frequencies of a building are the eigenvalues of a quadratic eigenvalue problem $(\lambda^2 M + \lambda D + K)x = 0$. Here $M$ is the mass matrix, $K$ is the stiffness matrix, and $D$ is the damping matrix. If there is no damping $D$, then the problem reduces to $Kx = \lambda^2 M x$. Is this what you meant when you wrote "The frequencies at which a building or bridge sways are the square roots of the eigenvalues of the linear elasticity operator."? Commented Dec 19, 2023 at 16:45
  • Yes. The eigenvalue, in the way you write it, is $\lambda^2$, and its square root is the frequency we seek. It is true that what you need to solve is a generalized eigenvalue problem, but I thought that was a detail not relevant to the original question. Commented Dec 19, 2023 at 22:45
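
The undamped case mentioned in these comments, $Kx = \lambda^2 Mx$, can be sketched as a generalized symmetric eigenproblem; the toy mass-spring chain below is made up for illustration:

```python
import numpy as np
from scipy.linalg import eigh

# Undamped 3-mass spring chain: K x = w^2 M x (generalized eigenproblem)
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])      # stiffness matrix (positive definite)
M = np.diag([1.0, 2.0, 1.0])            # mass matrix

lam, modes = eigh(K, M)                 # solves K v = lam M v, lam ascending
freqs = np.sqrt(lam)                    # natural frequencies w

print(freqs)
```

Since $K$ and $M$ are symmetric positive definite, all the $\lambda^2$ values are real and positive, so the frequencies are real: the undamped structure oscillates without growing or decaying.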

The energy levels available to a system (e.g. an atom, molecule, material, etc.) are the eigenvalues of the system's Hamiltonian matrix.

The following diagram, which is presented to grade 9 (typically aged 13 to 15) students in the Ontario curriculum, labels four different energy levels, which correspond to the lowest four eigenvalues of the atom's Hamiltonian matrix:

[diagram omitted: four energy levels of an atom]

Therefore, all of spectroscopy is about eigenvalues and the differences between them.

It is how we know that there's water on Mars and CO2 on Venus, how we know the composition of stars, and how we know the composition of the universe:

[figure omitted]

We also use spectroscopy to check for pollutants in fuels, to check whether or not currency is counterfeit, and we use it in medical, geological, and atmospheric/climate applications among many, many other things.

The eigenvalues of the H atom within a non-relativistic model of the universe are known analytically, but for larger atoms and for molecules, liquids, solids, etc., and even for relativistic modeling of the H atom, we almost always obtain eigenvalues (energies) using numerical methods. For this exact reason, a chemist by the name of Ernest Davidson came up with one of the best ways to find the lowest eigenvalue of a matrix, called the Davidson method. In only about 3.5 years, the word "eigenvalue" has come up 163 times on MMSE, so you can find a lot of real-world uses of eigenvalues there.
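
The Davidson method itself is more involved; as a minimal stand-in, the sketch below uses SciPy's Lanczos-based `eigsh` (not Davidson) to find only the lowest eigenvalue of a sparse symmetric "Hamiltonian", here a toy tight-binding chain whose exact ground-state energy is known in closed form:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Toy "Hamiltonian": 1D tight-binding chain (sparse, symmetric, tridiagonal)
n = 200
H = sp.diags([-np.ones(n - 1), 2.0 * np.ones(n), -np.ones(n - 1)],
             offsets=[-1, 0, 1], format="csr")

# Iteratively compute only the smallest-algebraic eigenvalue (ground state),
# never forming a dense n x n matrix
E0 = eigsh(H, k=1, which="SA", return_eigenvectors=False)[0]

# Exact lowest eigenvalue of this chain: 2 - 2 cos(pi / (n + 1))
exact = 2 - 2 * np.cos(np.pi / (n + 1))
print(E0, exact)
```

Like Davidson's method, this kind of iterative solver only needs matrix-vector products, which is what makes the huge matrices of quantum chemistry tractable.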

  • This is also where the term "spectral theory" for eigenvalues/eigenvectors comes from. It originates with the spectrum as produced by a prism. Commented Dec 19, 2023 at 22:46
  • @WolfgangBangerth "The name spectral theory was introduced by David Hilbert in his original formulation of Hilbert space theory [...] The later discovery in quantum mechanics that spectral theory could explain features of atomic spectra was therefore fortuitous. Hilbert himself was surprised by the unexpected application of this theory, noting that "I developed my theory of infinitely many variables from purely mathematical interests, and even called it 'spectral analysis' without any presentiment that it would later find application to the actual spectrum of physics."" Commented Dec 19, 2023 at 23:21
  • Oh, that's interesting -- I had always assumed it to be (and learned that it was) connected. Commented Dec 20, 2023 at 18:00

I work primarily in the area of computer graphics, and I needed to compute an eigenvector just a few days ago!

When doing real-time cloth simulation, it might be useful to reconstruct the local coordinate system at a vertex as a rotated initial coordinate system of this vertex, which is essentially a 3D shape matching problem. If we describe the rotation as a quaternion, this reduces to finding the smallest value of a quadratic form on the unit sphere in 4 dimensions, or, equivalently, the smallest eigenvalue of a $4\times 4$ symmetric matrix.
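
The final reduction described above, minimizing a quadratic form over unit 4-vectors, can be sketched as follows; the symmetric matrix here is random rather than derived from an actual cloth mesh:

```python
import numpy as np

rng = np.random.default_rng(1)

# A symmetric 4x4 matrix defining the quadratic form q -> q^T M q
B = rng.normal(size=(4, 4))
M = B + B.T

# The minimum of the form over unit 4-vectors (quaternions) is the smallest
# eigenvalue, attained at the corresponding eigenvector
w, V = np.linalg.eigh(M)        # eigenvalues ascending
q = V[:, 0]                     # unit-norm minimizer

print(q @ M @ q, w[0])          # equal

# No random unit 4-vector does better (Rayleigh quotient bound)
u = rng.normal(size=4)
u /= np.linalg.norm(u)
print(u @ M @ u >= w[0])        # True
```

For a fixed-size $4\times 4$ symmetric problem like this, a dense symmetric eigensolver is cheap enough to run per vertex in real time.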


