
Is it possible to draw some parallels between the Wronskian and the Gram matrix? Could they be used for solving the same problem? What is the principal difference between them?

The Gram matrix of a set of vectors $v_{1},\cdots ,v_{n}$ in an inner product space is the Hermitian matrix of inner products, whose entries are given by $G_{ij}=\langle v_{i},v_{j}\rangle$. A set of vectors is linearly independent if and only if the Gram determinant (the determinant of the Gram matrix) is non-zero.
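As a quick numerical sketch (the vectors below are arbitrary illustrations): the Gram determinant vanishes exactly when the set is dependent.

```python
import numpy as np

# Hypothetical vectors in R^3; the third is a combination of the first two.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 2 * v2                      # linearly dependent on v1, v2

V = np.stack([v1, v2, v3])            # rows are the vectors
G = V @ V.T                           # Gram matrix: G[i, j] = <v_i, v_j>

print(np.linalg.det(G))               # ~0: the set {v1, v2, v3} is dependent
print(np.linalg.det(V[:2] @ V[:2].T)) # nonzero: {v1, v2} is independent
```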

For $(n-1)$-times differentiable functions $f_1, \ldots , f_n$, the Wronskian $W(f_1, \ldots , f_n)$ is the function defined by $$ {W(f_{1},\ldots ,f_{n})(x)={\begin{vmatrix}f_{1}(x)&f_{2}(x)&\cdots &f_{n}(x)\\f_{1}'(x)&f_{2}'(x)&\cdots &f_{n}'(x)\\\vdots &\vdots &\ddots &\vdots \\f_{1}^{(n-1)}(x)&f_{2}^{(n-1)}(x)&\cdots &f_{n}^{(n-1)}(x)\end{vmatrix}}} $$ If the functions $f_i$ are linearly dependent, then so are the columns of the Wronskian matrix (since differentiation is a linear operation), so the Wronskian vanishes.
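This can be checked symbolically; a minimal sketch using SymPy's `wronskian` helper, with the polynomial families chosen purely for illustration:

```python
import sympy as sp

x = sp.symbols('x')

# Independent set 1, x, x^2: the Wronskian matrix is upper triangular
# with diagonal 1, 1, 2, so the determinant is the nonzero constant 2.
W = sp.wronskian([1, x, x**2], x)
print(sp.simplify(W))  # 2

# Dependent set x, 2x, x^2: two proportional columns, so W vanishes.
print(sp.simplify(sp.wronskian([x, 2*x, x**2], x)))  # 0
```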


1 Answer


Both constructions are of the form $\det(D)$ for

$$ D = \begin{pmatrix} \varphi_1(v_1) & \varphi_1(v_2) & \dots & \varphi_1(v_n) \\ \vdots & \vdots & \ddots & \vdots \\ \varphi_n(v_1) & \varphi_n(v_2) & \dots & \varphi_n(v_n) \end{pmatrix}, $$

where $v_1,\dots,v_n \in V$ are vectors and $\varphi_i \in V^{*}$ are linear functionals on $V$. For the first case, we have $\varphi_i(v) = \left< v, v_i \right>$ while in the second case we have $\varphi_i(f) = f^{(i-1)}(x)$. Another example of this construction is the (transpose of the) Vandermonde determinant for which $v_1 = 1, v_2 = x, \dots, v_n = x^{n-1}$ and $\varphi_i(f) = f(\alpha_i)$.
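For concreteness, a small sketch of the matrix $D$ in the Vandermonde case, with $v_j = x^{j-1}$ and $\varphi_i(f) = f(\alpha_i)$ (the nodes $\alpha_i$ here are arbitrary choices):

```python
import numpy as np

# D[i, j] = phi_i(v_j) = alpha_i ** j for v_j = x^j, phi_i = evaluation at alpha_i
alphas = np.array([0.0, 1.0, 2.0])
n = len(alphas)
D = np.array([[a ** j for j in range(n)] for a in alphas])

print(D)
# det D = prod over i < j of (alpha_j - alpha_i) = (1-0)(2-0)(2-1) = 2
print(np.linalg.det(D))
```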

Set $U = \operatorname{span} \{ v_1, \dots, v_n \}$. This construction is useful for various reasons:

  1. If $v_1, \dots, v_n$ are linearly dependent, then $\det D = 0$. This gives us a sufficient (but in general not necessary) condition for checking linear independence between vectors. The reason is that if we have a non-trivial relation $\sum_{i=1}^n a_i v_i = 0$ we can apply the functionals $\varphi_j$ to it and get $n$ linear relations $$ \varphi_j \left( \sum_{i=1}^n a_i v_i \right) = \sum_{i=1}^n a_i \varphi_j(v_i) = 0 $$ for all $1 \leq j \leq n$. In matrix notation, we have $$ a_1 \begin{pmatrix} \varphi_1(v_1) \\ \vdots \\ \varphi_n(v_1) \end{pmatrix} + \dots + a_n \begin{pmatrix} \varphi_1(v_n) \\ \vdots \\ \varphi_n(v_n) \end{pmatrix} = \begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix} $$ which implies that the columns of $D$ are linearly dependent, so $\det D = 0$. Hence, if $\det D \neq 0$, the vectors $v_1,\dots,v_n$ are linearly independent. For example, one can use this and the calculation of the Vandermonde determinant or the Wronskian to show that the functions $1,x,\dots,x^{n-1}$ are linearly independent.
  2. If $\det D \neq 0$, the vector space $U$ is $n$-dimensional and given $y_1,\dots,y_n \in \mathbb{F}$ we can find a unique $v \in U$ such that $\varphi_i(v) = y_i$. For the Vandermonde determinant, this proves that if $\alpha_i \neq \alpha_j$ for $i \neq j$ we can find a unique polynomial of degree $\leq n-1$ that passes through the points $(\alpha_1, y_1), \dots, (\alpha_n, y_n)$. For the Wronskian, this means we can find a unique linear combination $f$ of $f_1,\dots,f_n$ satisfying $f(x) = y_1, f'(x) = y_2, \dots, f^{(n-1)}(x) = y_n$, which is important for setting initial conditions for a linear ODE. For the Gram matrix, this means that we can specify a vector in $U$ by specifying its projections on $v_1,\dots,v_n$.
  3. By duality, if $\det D \neq 0$ then not only are the $v_i$ linearly independent, but so are the linear functionals $\varphi_i$. For the Vandermonde matrix, this proves that the evaluation functionals at distinct points are linearly independent. This also proves that $\varphi_1|_{U}, \dots, \varphi_n|_{U}$ form a basis for $U^{*}$. This is used for example in constructing numerical integration schemes which are exact for polynomials up to a certain degree.
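Point 2 can be sketched numerically for the Vandermonde case: with $\det D \neq 0$, solving $Da = y$ recovers the unique interpolating polynomial (the nodes and target values below are arbitrary illustrations):

```python
import numpy as np

# Unique polynomial p of degree <= n-1 with p(alpha_i) = y_i,
# obtained by solving D a = y for the coefficients of 1, x, ..., x^(n-1).
alphas = np.array([0.0, 1.0, 2.0])
ys = np.array([1.0, 3.0, 7.0])          # hypothetical target values

D = np.vander(alphas, increasing=True)  # D[i, j] = alpha_i ** j
coeffs = np.linalg.solve(D, ys)         # here p(x) = 1 + x + x^2

# Verify the interpolation conditions phi_i(p) = y_i
print(np.polyval(coeffs[::-1], alphas))
```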

Beyond this similarity, there are important differences between the various cases:

  1. For the Gram matrix, we have $\det D = 0$ if and only if the vectors $v_1,\dots,v_n$ are linearly dependent. The matrix $D$ is always Hermitian (even positive semidefinite).
  2. For the Wronskian, we might have $\det D = 0$ and yet $v_1,\dots,v_n$ are linearly independent. In addition, the matrix $D$ is not in general Hermitian.
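The second point can be seen in a classical example (due to Peano): $f_1(x) = x^2$ and $f_2(x) = x|x|$ are linearly independent on $\mathbb{R}$, yet their Wronskian vanishes identically. A small numerical check:

```python
import numpy as np

# f1 = x^2 and f2 = x|x| are independent on R (evaluate a*f1 + b*f2 at
# x = 1 and x = -1 to force a = b = 0), but their Wronskian is zero:
# W = f1*f2' - f2*f1' = x^2 * 2|x| - x|x| * 2x = 0 for every x.
x = np.linspace(-2, 2, 9)
f1, df1 = x**2, 2*x
f2, df2 = x*np.abs(x), 2*np.abs(x)   # derivative of x|x| is 2|x|

W = f1*df2 - f2*df1
print(np.max(np.abs(W)))             # 0 at every sample point
```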
