Both constructions are of the form $\det(D)$ for
$$ D = \begin{pmatrix} \varphi_1(v_1) & \varphi_1(v_2) & \dots & \varphi_1(v_n) \\
\vdots & \vdots & \ddots & \vdots \\
\varphi_n(v_1) & \varphi_n(v_2) & \dots & \varphi_n(v_n) \end{pmatrix}, $$
where $v_1,\dots,v_n \in V$ are vectors and $\varphi_i \in V^{*}$ are linear functionals on $V$. For the Gram matrix we have $\varphi_i(v) = \left< v, v_i \right>$, while for the Wronskian we have $\varphi_i(f) = f^{(i-1)}(x)$. Another example of this construction is the (transpose of the) Vandermonde determinant, for which $v_1 = 1, v_2 = x, \dots, v_n = x^{n-1}$ and $\varphi_i(f) = f(\alpha_i)$.
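To make the shared structure concrete, here is a minimal sketch in Python (the vectors and points below are arbitrary choices) building $D$ in the Gram and Vandermonde cases; in the latter, the determinant is checked against the product formula $\prod_{i<j}(\alpha_j - \alpha_i)$:

```python
import numpy as np

# Gram case: phi_i(v) = <v, v_i>, so D[i, j] = <v_j, v_i>.
rng = np.random.default_rng(0)
V = rng.standard_normal((3, 5))          # rows are v_1, v_2, v_3 in R^5
gram = V @ V.T                           # real case: D is symmetric PSD
print(np.linalg.det(gram) != 0)          # True (almost surely): the v_i are independent

# Vandermonde case: v_j = x^{j-1} and phi_i(f) = f(alpha_i), so D[i, j] = alpha_i^{j-1}.
alpha = np.array([0.0, 1.0, 2.0, 4.0])
D = np.vander(alpha, increasing=True)
det_formula = np.prod([alpha[j] - alpha[i]
                       for j in range(len(alpha))
                       for i in range(j)])
print(np.linalg.det(D), det_formula)     # both approx. 48 = prod_{i<j} (alpha_j - alpha_i)
```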
Set $U = \operatorname{span} \{ v_1, \dots, v_n \}$. This construction is useful for various reasons:
- If $v_1, \dots, v_n$ are linearly dependent, then $\det D = 0$. This gives a sufficient (but in general not necessary) condition for linear independence of the vectors. The reason is that if we have a non-trivial relation $\sum_{i=1}^n a_i v_i = 0$, we can apply the functionals $\varphi_j$ to it and get $n$ linear relations
$$ \varphi_j \left( \sum_{i=1}^n a_i v_i \right) = \sum_{i=1}^n a_i \varphi_j(v_i) = 0 $$
for all $1 \leq j \leq n$. In matrix notation, we have
$$ a_1 \begin{pmatrix} \varphi_1(v_1) \\ \vdots \\ \varphi_n(v_1) \end{pmatrix} + \dots + a_n \begin{pmatrix} \varphi_1(v_n) \\ \vdots \\ \varphi_n(v_n) \end{pmatrix} = \begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix} $$
which implies that the columns of $D$ are linearly dependent, so $\det D = 0$. Hence, if $\det D \neq 0$, the vectors $v_1,\dots,v_n$ are linearly independent. For example, one can combine this with the calculation of the Vandermonde determinant or the Wronskian to show that the functions $1,x,\dots,x^{n-1}$ are linearly independent (this is carried out in the first sketch after this list).
- If $\det D \neq 0$, the vector space $U$ is $n$-dimensional, and given $y_1,\dots,y_n \in \mathbb{F}$ we can find a unique $v \in U$ such that $\varphi_i(v) = y_i$. For the Vandermonde determinant, this proves that if $\alpha_i \neq \alpha_j$ for $i \neq j$, there is a unique polynomial of degree $\leq n-1$ passing through the points $(\alpha_1, y_1), \dots, (\alpha_n, y_n)$ (see the second sketch after this list). For the Wronskian, it means we can find a unique linear combination $f$ of $f_1,\dots,f_n$ satisfying $f(x) = y_1, f'(x) = y_2, \dots, f^{(n-1)}(x) = y_n$, which is important when imposing initial conditions on a linear ODE. For the Gram matrix, it means we can specify a vector in $U$ by prescribing its inner products with $v_1,\dots,v_n$.
- By duality, if $\det D \neq 0$ then not only are the vectors $v_i$ linearly independent, but so are the linear functionals $\varphi_i$. For the Vandermonde matrix, this proves that the evaluation functionals at distinct points are linearly independent. It also shows that $\varphi_1|_{U}, \dots, \varphi_n|_{U}$ form a basis for $U^{*}$. This is used, for example, in constructing numerical integration schemes which are exact for polynomials up to a certain degree (see the third sketch after this list).
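The first sketch, referenced in the first bullet, uses SymPy to verify that the Wronskian matrix of $1, x, \dots, x^{n-1}$ is upper triangular with diagonal $0!, 1!, \dots, (n-1)!$, so its determinant is the nonzero constant $\prod_{k=0}^{n-1} k!$ and the monomials are linearly independent (shown here for the arbitrary choice $n = 4$):

```python
import sympy as sp

x = sp.symbols('x')
n = 4
fs = [x**k for k in range(n)]                            # 1, x, x^2, x^3
# Wronskian matrix: D[i, j] = f_j^{(i)}(x), the i-th derivative of f_j.
W = sp.Matrix(n, n, lambda i, j: sp.diff(fs[j], x, i))
print(W.det())                                           # 12 = 0! * 1! * 2! * 3!
```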
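The second sketch, for the second bullet: solving $D a = y$ yields the coefficients of the unique $v = \sum_j a_j v_j \in U$ with $\varphi_i(v) = y_i$. In the Vandermonde case this is exactly polynomial interpolation (the nodes and values below are arbitrary choices):

```python
import numpy as np

alpha = np.array([0.0, 1.0, 2.0, 4.0])   # distinct nodes
y = np.array([1.0, 3.0, 2.0, 5.0])       # prescribed values y_i
D = np.vander(alpha, increasing=True)     # D[i, j] = phi_i(v_j) = alpha_i**j
a = np.linalg.solve(D, y)                 # solvable since det D != 0
p = np.polynomial.Polynomial(a)           # p(t) = a_0 + a_1 t + a_2 t^2 + a_3 t^3
print(np.allclose(p(alpha), y))           # True: p interpolates all four points
```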
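The third sketch, for the third bullet: since $\varphi_1|_{U}, \dots, \varphi_n|_{U}$ form a basis of $U^{*}$, any functional on $U$, such as $f \mapsto \int_0^1 f(t)\,dt$ on polynomials of degree $\leq n-1$, is a unique combination $\sum_i w_i \varphi_i$. Solving the transposed system $D^{T} w = m$ with $m_j = \int_0^1 t^{j}\,dt$ produces quadrature weights that are exact on all such polynomials; with the equally spaced nodes chosen below this recovers Simpson's $3/8$ rule:

```python
import numpy as np

alpha = np.array([0.0, 1/3, 2/3, 1.0])    # equally spaced nodes in [0, 1]
n = len(alpha)
D = np.vander(alpha, increasing=True)      # D[i, j] = alpha_i**j
m = 1.0 / np.arange(1, n + 1)              # m[j] = integral_0^1 t**j dt = 1/(j+1)
w = np.linalg.solve(D.T, m)                # weights: sum_i w_i * alpha_i**j = m[j]
print(w)                                   # [1/8, 3/8, 3/8, 1/8]: Simpson's 3/8 rule

# The rule integrates every polynomial of degree <= 3 exactly:
f = lambda t: t**3 - t
print(np.isclose(w @ f(alpha), 1/4 - 1/2))  # True
```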
Beyond this similarity, there are important differences between the various cases:
- For the Gram matrix, we have $\det D = 0$ if and only if the vectors $v_1,\dots,v_n$ are linearly dependent. The matrix $D$ is always Hermitian (even positive semidefinite).
- For the Wronskian, we might have $\det D = 0$ even though $v_1,\dots,v_n$ are linearly independent; a classical counterexample is sketched below. In addition, the matrix $D$ is not Hermitian in general.
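For the last point, a classical counterexample (often attributed to Peano) is $f_1(x) = x^2$ and $f_2(x) = x\,|x|$: they are linearly independent on $\mathbb{R}$, yet their Wronskian vanishes identically. A quick numerical check in Python:

```python
import numpy as np

xs = np.linspace(-2.0, 2.0, 9)             # sample points, including 0
f1, df1 = xs**2, 2 * xs                    # f1(x) = x^2
f2, df2 = xs * np.abs(xs), 2 * np.abs(xs)  # f2(x) = x|x|, f2'(x) = 2|x|

# The Wronskian W = f1 * f2' - f2 * f1' vanishes at every sample point.
print(np.allclose(f1 * df2 - f2 * df1, 0))                 # True

# Yet a*f1 + b*f2 = 0 forces a = b = 0: evaluating at x = 1 and x = -1
# gives an invertible 2x2 system.
print(np.linalg.det(np.array([[1.0, 1.0], [1.0, -1.0]])))  # -2.0, nonzero
```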