
I am taking a course on ODEs, and I got a homework question in which I am required to:

  • Calculate the Wronskian of two function vectors (specifically $(t, 1)$ and $(t^{2}, 2t)$).
  • Determine in what intervals they are linearly independent.

There are more parts to this question, but I figured I would deal with them once I understand the basic concepts better. So these are my questions:

  • I know how to calculate the Wronskian of $n$ functions $f_{1}, \dots, f_{n}$: $$\begin{vmatrix} f_{1} & \cdots & f_{n} \\ \vdots & & \vdots \\ f_{1}^{(n-1)} & \cdots & f_{n}^{(n-1)} \end{vmatrix}\;.$$ I assume that when I'm asked to calculate the Wronskian of a function vector, my $n$ functions would be the vector's components?
  • I know that if the Wronskian of $n$ functions is not $0$ for some $t$, I can deduce that they are linearly independent. How can I use this information to find the intervals in which two vectors are independent?

I would love to read a good explanation of why these methods work (sadly, I cannot understand a thing from my notebook, and the library is closed on the weekend), so if you could explain it or direct me to a good online resource, preferably not Wikipedia, I would be glad.

And finally, I apologize in advance if I'm not very clear; I am not a native English speaker.

Thanks!

  • The set of solutions (scalar functions) of a linear differential equation is a vector space of dimension $n$, where $n$ is the order of your differential equation. So scalar functions are vectors in this vector space. The theorem that links the Wronskian determinant and linear independence refers to vectors in this sense.
    – Emilio
    Commented Jan 6, 2012 at 23:41
  • @Emilio: If the question is reproduced correctly, that can't be all there is to it -- the part "Determine in what intervals they are linearly independent" wouldn't make sense under that interpretation. I think it's really asking for which values of $t$ the two vectors $(t,1)$ and $(t^2,2t)$ are linearly independent.
    – joriki
    Commented Jan 7, 2012 at 0:11
  • @Hila: I just realized that the "function vectors" you gave are such that the second component is the derivative of the first. So it seems that these are "function vectors" in the sense that a higher-order ODE can be viewed as a first-order ODE for a vector containing the function and its derivatives. If that's what's meant, then you have $f_1=t$, $f_1'=1$, $f_2=t^2$, $f_2'=2t$, and the Wronskian is simply $$\left|\begin{array}{cc}t&t^2\\1&2t\end{array}\right|=t^2\;.$$
    – joriki
    Commented Jan 7, 2012 at 0:40
  • @joriki I see what you mean. I am familiar with this result: if $f_1,...,f_n$ are solutions to a linear differential equation of order $n$, defined in $(a,b)$, then: (1) $\exists x_0 \in (a,b): W(x_0)=0$ iff $f_1,...,f_n$ are linearly dependent; (2) $\exists x_0 \in (a,b): W(x_0) \neq 0$ iff $f_1,...,f_n$ are linearly independent. Maybe this is true for all functions in general (not necessarily solutions to linear differential equations), but I'm not sure.
    Commented Jan 7, 2012 at 0:46
  • Also: for any system of $n$ first order linear ODEs there is a single linear ODE of order $n$ to which it is equivalent.
    Commented Jan 7, 2012 at 0:52

3 Answers


Let me address why the Wronskian works. To begin, let's work with vectors of functions (not necessarily solutions of some ODE).

For convenience, I'll just work with $3 \times 3$ systems.

Let $$ {\bf f}_1(t) = \begin{bmatrix} f_{11}(t) \\ f_{21}(t) \\ f_{31}(t) \end{bmatrix}, \qquad {\bf f}_2(t) = \begin{bmatrix} f_{12}(t) \\ f_{22}(t) \\ f_{32}(t) \end{bmatrix}, \qquad \mathrm{and} \qquad {\bf f}_3(t) = \begin{bmatrix} f_{13}(t) \\ f_{23}(t) \\ f_{33}(t) \end{bmatrix} $$ be vectors of functions (i.e. functions from $\mathbb{R}$ to $\mathbb{R}^3$).

We say the set $\{ {\bf f}_1(t), {\bf f}_2(t), {\bf f}_3(t) \}$ is linearly dependent on $I \subseteq \mathbb{R}$ (some set of real numbers) if there exist $c_1,c_2,c_3 \in \mathbb{R}$ (not all zero) such that $c_1{\bf f}_1(t)+c_2{\bf f}_2(t)+c_3{\bf f}_3(t)={\bf 0}$ for all $t \in I$. [Be careful here: this equation must hold for all $t$ in $I$ simultaneously, with the same constants.]

This equation can be recast in terms of matrices. We have linear dependence if and only if there exists some constant vector ${\bf c} \not= {\bf 0}$ such that ${\bf F}(t){\bf c}={\bf 0}$ for all $t \in I$, where ${\bf F}(t) = [{\bf f}_1(t) \;{\bf f}_2(t) \;{\bf f}_3(t)]$. Writing it out in expanded form:

$$ {\bf F}(t){\bf c} = \begin{bmatrix} f_{11}(t) & f_{12}(t) & f_{13}(t) \\ f_{21}(t) & f_{22}(t) & f_{23}(t) \\ f_{31}(t) & f_{32}(t) & f_{33}(t) \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$

Now the determinant of ${\bf F}(t)$ is known as the Wronskian of the functions ${\bf f}_1,{\bf f}_2,{\bf f}_3$. That is, $W(t) = \det({\bf F}(t))$.
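For instance, with the two vectors from the question (matching the computation in the comments above), ${\bf F}(t)$ is $2 \times 2$ and $$ W(t) = \det \begin{bmatrix} t & t^2 \\ 1 & 2t \end{bmatrix} = t \cdot 2t - t^2 \cdot 1 = t^2\;. $$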

Now we call on basic linear algebra: the columns of an $n \times n$ matrix $A$ are linearly dependent if and only if there is a non-trivial (i.e. non-zero) solution of $A{\bf x}={\bf 0}$. This is true if and only if $\det(A)=0$.

But be very careful: this fact is about a matrix of constants (not functions). To show the columns of ${\bf F}(t)$ are linearly dependent on $I$, we need a single non-zero vector ${\bf c}$ that works for all $t$ in $I$.

So only the following can be said: IF the columns of ${\bf F}(t)$ are linearly dependent on $I$, THEN there is a non-zero solution of ${\bf F}(t){\bf c}={\bf 0}$ which works for all $t$ in $I$, and thus $W(t)=\det({\bf F}(t))=0$ for all $t$ in $I$.

The converse does not hold in general.
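Here is a quick illustration (my own example, not from the question): take $$ {\bf f}_1(t) = \begin{bmatrix} 1 \\ t \end{bmatrix} \qquad \mathrm{and} \qquad {\bf f}_2(t) = \begin{bmatrix} t \\ t^2 \end{bmatrix}\;. $$ At each fixed $t$ the two vectors are parallel (${\bf f}_2(t) = t\,{\bf f}_1(t)$), so $W(t) = 1 \cdot t^2 - t \cdot t = 0$ for all $t$. Yet if $c_1 {\bf f}_1(t) + c_2 {\bf f}_2(t) = {\bf 0}$ for all $t$ in an interval, the first components give $c_1 + c_2 t = 0$ for all such $t$, forcing $c_1 = c_2 = 0$. So the Wronskian vanishes identically even though the set is linearly independent.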

However, for sets of solutions of linear systems of ODEs, Abel's Identity shows that the Wronskian is a constant multiple of an exponential function. Thus if it's zero somewhere, it's zero everywhere (well, everywhere these are solutions anyway). So in intro DE classes, professors will commonly state that we have linear dependence if and only if the Wronskian is zero, and then proceed to use examples for which this theorem isn't necessarily true! The implication does not go both ways in general.
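For reference, here is one standard form of Abel's Identity (for columns of ${\bf F}$ that solve a linear system ${\bf x}' = A(t)\,{\bf x}$ on an interval $I$): $$ W(t) = W(t_0)\,\exp\!\left( \int_{t_0}^{t} \mathrm{tr}\,A(s)\,ds \right) \qquad \text{for } t_0, t \in I\;. $$ Since the exponential never vanishes, $W$ is either identically zero on $I$ or never zero on $I$.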

Now finally, how to connect this back to regular functions? Well, consider functions $f,g,h$. Then they are linearly dependent on some set of real numbers $I$ if we can find $a,b,c \in \mathbb{R}$ (not all zero) such that $af(t)+bg(t)+ch(t)=0$ for all $t$ in $I$. If we differentiate this equation again and again, we get more equations:

$$ \begin{array}{ccc} af(t)+bg(t)+ch(t) & = & 0 \\ af'(t)+bg'(t)+ch'(t) & = & 0 \\ af''(t)+bg''(t)+ch''(t) & = & 0 \end{array} $$

Now we're back to the discussion about linear independence in reference to the set: $$ \left\{ \begin{bmatrix} f(t) \\ f'(t) \\ f''(t) \end{bmatrix}, \begin{bmatrix} g(t) \\ g'(t) \\ g''(t) \end{bmatrix}, \begin{bmatrix} h(t) \\ h'(t) \\ h''(t) \end{bmatrix} \right\} $$

So the Wronskian you're using is a special case of the Wronskian for systems of ODEs.

I hope this clears things up a little!

  • This is an excellent explanation, thank you!
    – Hila
    Commented Jan 7, 2012 at 11:57
  • This is a lot clearer and more thorough than all the Wikipedia and MathWorld articles on the subject put together :-)
    – joriki
    Commented Jan 7, 2012 at 12:20
  • Maybe I ought to go update those...
    – Bill Cook
    Commented Jan 7, 2012 at 16:20
  • @BillCook Millions of desperate Math students will thank you...
    – Hila
    Commented Jan 7, 2012 at 20:38
  • Is this reflecting the fact that we can pass from an $n$th order equation to a system of $n$ first order equations? Doing this, we get the derivatives in the vector functions!
    – user123124
    Commented Aug 10, 2018 at 10:37

The problem seems to have been solved in the discussion in the comments: The "function vectors" are vectors containing the (zeroth and first) derivatives of a function, and thus the Wronskian is the determinant of the matrix that has those vectors as its columns.


I don't have rep yet, but I want to thank Bill! I was wondering this myself and (eventually) reasoned out the same thing that he wrote. The forward implication from linear algebra makes sense for linear independence, and what stalled me at the converse was this counterexample:

$$ \left\{ \begin{bmatrix} t \\ t^3 \end{bmatrix}, \begin{bmatrix} t^2 \\ t^4 \end{bmatrix} \right\} $$

If we take the determinant of the matrix with these vectors as its columns, we get $t \cdot t^4 - t^2 \cdot t^3 = t^5 - t^5 = 0$ for all times $t$. But the set is linearly independent (sticking with the definition: only the trivial combination gives zero for all $t$).
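To check that independence directly (a short verification in the spirit of the definition above): if $$ c_1 \begin{bmatrix} t \\ t^3 \end{bmatrix} + c_2 \begin{bmatrix} t^2 \\ t^4 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \quad \text{for all } t \text{ in some interval}\;, $$ then the first components give $c_1 t + c_2 t^2 = 0$ on that interval; a polynomial that vanishes on an interval has all coefficients zero, so $c_1 = c_2 = 0$.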

