32
$\begingroup$

Do we ever put functions as entries of a matrix? If so, are these matrices used in linear algebra or do they have some other special use?

There have been some minor disagreements (not conflicts per se) about the nature of this question, so I am adding a short statement below to clear things up.

I have noticed that "function" has been interpreted two ways within the answers.

  1. An actual function as an abstract object, such as merely writing "$f$". Such a concept is beyond my current understanding, but it is interesting nonetheless.

  2. A function evaluated at a point, returning a value such as $f(x)$. This is primarily what I meant in my post.

Either interpretation is valid. In fact, I think the broadness of this question means people will naturally interpret it differently. The two are similar enough from my standpoint that answers addressing either one are good answers (or good examples, if not standalone answers).

$\endgroup$
8
  • 13
    $\begingroup$ en.wikipedia.org/wiki/Wronskian $\endgroup$
    – vadim123
    Commented Jun 23, 2016 at 19:10
  • 1
    $\begingroup$ Not to mention, when you have complex eigenvalues (depending on how many complex eigenvalues you have), you will have trig functions in your eigenvectors. $\endgroup$
    – user322313
    Commented Jun 23, 2016 at 19:21
  • 1
    $\begingroup$ @vadim, and then there's also the Casoratian. $\endgroup$ Commented Jun 23, 2016 at 20:56
  • $\begingroup$ Another example I can think of is a Slater Matrix, very often used in Quantum Mechanics of small/medium size systems. $\endgroup$
    – TheVal
    Commented Jun 24, 2016 at 21:48
  • 2
    $\begingroup$ Looks like a quick read of the function page on wikipedia might help clarify things... $\endgroup$
    – user121330
    Commented Jun 27, 2016 at 17:59

8 Answers

43
$\begingroup$

You can define a matrix with elements in any commutative ring, since the only requirement is to be able to perform addition and multiplication with the usual properties.

You may even consider the following $2\times 2$ matrices, whose elements do not all belong to the same set. Such matrices describe the endomorphisms of the direct sum $\;E=U\oplus V$ of two vector spaces $U$ and $V$ $$M=\begin{bmatrix} f_1&f_2\\g_1&g_2\end{bmatrix},\quad\text{where}\quad\begin{array}{|ll} f_1\in \mathcal L(U,U),& f_2\in \mathcal L(U,V),\\ g_1\in \mathcal L(V,U),& g_2\in \mathcal L(V,V). \end{array}$$ One can check that two such matrices can be multiplied, with multiplication of elements given by composition of linear maps.
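
For concreteness, here is a minimal numerical sketch. The dimensions ($\dim U = 2$, $\dim V = 3$) and the standard block convention ($A\colon U\to U$, $B\colon V\to U$, $C\colon U\to V$, $D\colon V\to V$) are illustrative choices of mine, not prescribed by the answer. It checks that multiplying the assembled block matrices reproduces blockwise composition of the linear maps.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_block_endomorphism():
    """A random endomorphism of E = U + V, with dim U = 2 and dim V = 3."""
    A = rng.standard_normal((2, 2))  # U -> U
    B = rng.standard_normal((2, 3))  # V -> U
    C = rng.standard_normal((3, 2))  # U -> V
    D = rng.standard_normal((3, 3))  # V -> V
    return A, B, C, D, np.block([[A, B], [C, D]])

A1, B1, C1, D1, M1 = random_block_endomorphism()
A2, B2, C2, D2, M2 = random_block_endomorphism()

# The top-left block of the product is A1 A2 + B1 C2: a sum of
# compositions of linear maps, exactly as block multiplication predicts.
assert np.allclose((M1 @ M2)[:2, :2], A1 @ A2 + B1 @ C2)
```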

$\endgroup$
13
  • 7
    $\begingroup$ +1 I like your answer because it is the only one that actually answers the question; a matrix with functions as entries is not the same thing as a function into some matrix space! Maybe it would be noteworthy to mention that in the answer for future readers. $\endgroup$
    – user335721
    Commented Jun 23, 2016 at 21:56
    $\begingroup$ Yep. In fact, the ring can be a semiring (with $0$ and $1$), and it doesn't need to be commutative. For instance, my guess is that non-commutative $\mathbb{B}$-algebras have widespread use in decision theory / game theory, in relation to the question: "Is it possible to do the right moves in the right order to get from one state to another?" Obviously, the order in which the player(s) perform moves usually matters, so this is one example where matrices over a non-commutative semiring are actually very natural. I think it's a bit lame that most linear algebra books don't do this kind of thing. $\endgroup$ Commented Jun 24, 2016 at 1:12
  • 1
    $\begingroup$ I like the point about matrices with elements in any ring -- which I think is the only correct answer to the question -- but the heterogeneous matrix in the second paragraph seems more confusing than helpful to me. I suppose that's how morphisms between product objects in an abelian category look, but how much of ordinary linear algebra generalizes to that setting is unclear. $\endgroup$ Commented Jun 24, 2016 at 9:41
  • 1
    $\begingroup$ @TheGreatDuck [sorry, missing dollar] ...of an arbitrary element, which is what $f(x)$ really is $\endgroup$
    – user335721
    Commented Jun 27, 2016 at 1:29
  • 2
    $\begingroup$ @TheGreatDuck Now, the Jacobian $Jf=(\partial_i f_j)_{1\leq i,j\leq n}$ is not a matrix whose entries are functions, because they are not evaluated separately. Given a point $x$, the Jacobian of $f$ returns an $n\times n$ matrix, so it really is a function from $\mathbb{R}^n$ into the matrix space $\mathcal{M}(\mathbb{R})$. Its entries are not functions, just the images of functions. $\endgroup$
    – user335721
    Commented Jun 27, 2016 at 1:42
28
$\begingroup$

One common use of functions in a matrix is the Hessian matrix in multivariable calculus. This is the matrix of second partial derivatives of a scalar function $f$ with respect to $x_1, x_2, \ldots$.

$$ M = \pmatrix{ \frac{\partial^2 f}{\partial x_1^2} & \frac{\partial^2 f}{\partial x_1 \partial x_2} & \cdots \\ \frac{\partial^2 f}{\partial x_2 \partial x_1} & \frac{\partial^2 f}{\partial x_2^2} & \cdots \\ \vdots & \vdots & \ddots }$$

The eigenvalues of this matrix say a lot about the nature of the function at a critical point (minimum, maximum, saddle point) and about other properties such as stability.
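
As a small illustration (the function $f(x,y)=x^3-3x+y^2$ is my own example, not from the answer), sympy can build the Hessian as a matrix whose entries are functions of $x$ and $y$, then evaluate it at each critical point:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 - 3*x + y**2

H = sp.hessian(f, (x, y))  # entries are expressions in x and y
# Critical points of f: grad f = 0 at (1, 0) and (-1, 0).
for x0 in (1, -1):
    print(x0, H.subs({x: x0, y: 0}).eigenvals())
# At (1, 0) both eigenvalues are positive (local minimum);
# at (-1, 0) they have mixed signs (saddle point).
```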

$\endgroup$
4
  • 5
    $\begingroup$ The Hessian matrix is not a matrix of functions; it is a matrix whose entries are defined by functions, but every matrix can be thought of that way by understanding constants as functions. $\endgroup$
    – user335721
    Commented Jun 23, 2016 at 21:49
  • 1
    $\begingroup$ @John no, it is a matrix of functions: the partial derivatives are functions, and the Hessian matrix will vary from point to point accordingly $\endgroup$ Commented Jun 24, 2016 at 14:40
  • 2
    $\begingroup$ @user2520938: well, one might better say the Hessian is a single matrix-valued function. The distinction between such things as product-of-functions and function-whose-result-is-a-product is often treated a bit sloppily, even by mathematicians. In the case of the Hessian matrix it's fortunately equivalent, although it can make a big difference in practice, especially when doing numerics – the computational expense of actually calculating each Hessian entry individually as a function is considerable. $\endgroup$ Commented Jun 24, 2016 at 15:13
  • $\begingroup$ Actually both a matrix of functions and a matrix-valued function are just different ways of currying a function $M\times N\times D\to C$ where $M,N$ are the row/column index sets, $D$ is the function domain and $C$ is the codomain. $\endgroup$
    – celtschk
    Commented Jul 18, 2016 at 9:58
10
$\begingroup$

Do we ever put functions as entries of a matrix?

Yes. There have been quite a few fantastic examples given, but I'm not sure they get to the heart of your question.

If so, are these matrices used in linear algebra or do they have some other special use?

The key is not that matrices with functions (or functionals or operators or vectors or matrices etc...) as entries are used in Linear Algebra; it's that we may use Linear Algebra whenever the collection of mathematical objects satisfies certain requirements.
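
A tiny illustration of that point (the example is mine, not the answerer's): polynomials can be treated as vectors, and questions about them reduce to ordinary linear algebra on their coefficient vectors.

```python
import sympy as sp

x = sp.symbols('x')
polys = [sp.Poly(1, x), sp.Poly(x, x), sp.Poly(1 + 2*x, x)]

# Coefficient vectors in the monomial basis {1, x}:
M = sp.Matrix([[p.coeff_monomial(m) for m in (1, x)] for p in polys])
print(M.rank())  # 2 < 3, so the three polynomials are linearly dependent
```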

In linear algebra class, they use numbers as entries in the matrices and vectors, but endeavor to view those matrices and vectors as abstract objects. Using numbers as entries provides context and builds intuition, much like learning arithmetic before algebra.

As for the special uses, I can't think of any common thread between all of the different uses for matrices and vectors with functions as entries. There are quite a few.

$\endgroup$
4
  • 2
    $\begingroup$ I'm not sure what you mean by 'used in linear algebra'. Linear algebra is a tool used to solve problems, and you're kind of asking if we ever use a particular thing on the tool, when you should be asking whether one can use the tool to solve problems, or whether this particular thing can be used with the tool to solve problems. $\endgroup$
    – user121330
    Commented Jun 27, 2016 at 15:05
    $\begingroup$ How can you not understand "used in linear algebra"? I'm saying that putting functions in as entries could be a technique within linear algebra itself (for instance, in forming a function that returns a matrix). How is that confusing? $\endgroup$
    – user64742
    Commented Jun 27, 2016 at 15:23
  • $\begingroup$ Thank you so much for trying to clarify. Perhaps I'd understand better if you wrote down an example with math instead of words? $\endgroup$
    – user121330
    Commented Jun 27, 2016 at 17:52
  • 3
    $\begingroup$ @TheGreatDuck: 'linear algebra' isn't a thing that people study independently. Your question is met with confusion because it's like asking if there are 'tools you can use on a hammer'. I mean, sure, building a hammer is easier with tools, but the question just sounds backwards. $\endgroup$ Commented Jun 29, 2016 at 16:31
10
$\begingroup$

Even just the regular first derivative of a function $$ f:\mathbb{R}^n\rightarrow \mathbb{R}^m$$ is a matrix of functions. If you define $f$ by its $m$ component scalar functions from $\mathbb{R}^n$ to $\mathbb{R}$, as $$f(\vec{x})=(f_1(\vec{x}),\ldots,f_m(\vec{x})) $$ with $$ \vec{x}=(x_1,\ldots,x_n),$$ then $$ Df(\vec{x})=\begin{bmatrix}\frac{\partial f_1}{\partial x_1}&\cdots&\frac{\partial f_1}{\partial x_n}\\ \frac{\partial f_2}{\partial x_1}&\cdots&\frac{\partial f_2}{\partial x_n}\\ \vdots&&\vdots\\ \frac{\partial f_m}{\partial x_1}&\cdots&\frac{\partial f_m}{\partial x_n} \end{bmatrix} $$ Each of the entries above, the partials of the component functions of $f$, is itself a bona fide function. Or, if you've done some multivariable calculus: the rows are the gradients of the component functions.

This is what makes linear algebra crucial to studying multivariable calculus. If you want to talk about linear approximations of vector-valued functions in anything more than the abstract, you have to be comfortable with matrices.
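
A brief sympy sketch (the map $f(x_1,x_2)=(x_1x_2,\ \sin x_1,\ x_1+x_2^2)$ is my own example): the derivative really is a matrix whose entries are functions, and it can be evaluated at any point to get the linear approximation there.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
F = sp.Matrix([x1 * x2, sp.sin(x1), x1 + x2**2])  # f: R^2 -> R^3

Df = F.jacobian([x1, x2])        # 3x2 matrix whose entries are functions
print(Df)
print(Df.subs({x1: 0, x2: 1}))   # the derivative evaluated at (0, 1)
```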

$\endgroup$
9
$\begingroup$

A very common example is the matrix of a plane rotation with angle $\theta$ around the origin:

$$\begin{bmatrix}\cos \theta & -\sin \theta\\ \sin \theta & \cos \theta \end{bmatrix}$$
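A quick symbolic check (my own sketch, using sympy): the entries are genuine functions of the angle, and the familiar identity $R(a)R(b)=R(a+b)$ falls out of entrywise trigonometry.

```python
import sympy as sp

def R(t):
    """Rotation matrix with symbolic angle t; its entries are functions."""
    return sp.Matrix([[sp.cos(t), -sp.sin(t)],
                      [sp.sin(t),  sp.cos(t)]])

a, b = sp.symbols('a b')
assert (R(a) * R(b) - R(a + b)).applyfunc(sp.simplify) == sp.zeros(2, 2)
print(R(a).subs(a, sp.pi / 2))   # substituting recovers a numeric matrix
```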

$\endgroup$
8
  • $\begingroup$ In the context of numerical linear algebra, rotation matrices are sometimes referred to as Givens matrices. $\endgroup$ Commented Jun 23, 2016 at 21:04
  • 3
    $\begingroup$ That is a matrix of numbers, not a matrix of functions. The whole thing can be considered as a function of $\theta$, of course, but that doesn't mean the matrix entries are functions. $\endgroup$ Commented Jun 24, 2016 at 15:28
  • 1
    $\begingroup$ @leftaroundabout - if the whole thing is considered as a function of $\theta$, then it exactly means that it is a matrix of functions. A matrix of functions and a function with matrix values are just two ways of viewing the exact same object. $\endgroup$ Commented Jun 24, 2016 at 17:36
  • $\begingroup$ @PaulSinclair: it so happens that you can construct an isomorphism between matrix-valued functions and function-containing matrices (it's even canonical), but that doesn't mean they're the same thing. $\endgroup$ Commented Jun 24, 2016 at 17:50
  • 1
    $\begingroup$ @leftaroundabout - The interpretation of this example as a function into matrices instead of a matrix of functions is one of your own making. It is something you yourself imposed on this post, then felt the need to berate JeanMarie for, even though it is not in any way required or even suggested by the post itself. And the only difference between the two concepts is the context into which they have been placed. The actual behavior is the same. This is what that "canonical" ("natural" is a more appropriate term) map is telling you. The only difference is semantics. $\endgroup$ Commented Jun 24, 2016 at 18:22
8
$\begingroup$

They exist. For example, you could have this linear transformation:

$L: \mathbb{P}_2\rightarrow M_{2\times 2}, f\rightarrow L(f):=\begin{pmatrix} f'(0) & f(1) \\ \int_{-1}^1f(s)\,ds& 0 \\ \end{pmatrix}$

Functional analysis studies this kind of matrix, I think. Here $\mathbb{P}_2$ represents the set of all polynomial functions from $\mathbb{R}$ to $\mathbb{R}$ of degree at most 2. The output matrix $L(f)$ is made of functions applied to some values.
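
A hedged sympy sketch of this exact map (the helper name `L` and the test polynomial are mine):

```python
import sympy as sp

x, s = sp.symbols('x s')

def L(f):
    """L(f) = [[f'(0), f(1)], [integral of f over [-1, 1], 0]]."""
    return sp.Matrix([
        [sp.diff(f, x).subs(x, 0), f.subs(x, 1)],
        [sp.integrate(f.subs(x, s), (s, -1, 1)), 0],
    ])

print(L(1 + 2*x + 3*x**2))   # -> Matrix([[2, 6], [4, 0]])
```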

Another example could be the Jacobian matrix, or the gradient vector (a vector, but it can be seen as a matrix as well).

$\endgroup$
6
$\begingroup$

There is the Maurer-Cartan form, which is $\mathfrak g$-valued. As an example, if we consider the Lie subgroup $G = SO(2) \subset GL(2,\mathbb R)$, we may parametrize $SO(2)$ by $$g(\theta)= \begin{pmatrix}\cos \theta & -\sin\theta \\\sin \theta & \cos \theta\end{pmatrix},\quad \theta \in \mathbb R$$

then the matrix of $1$-forms (which are themselves functions, namely linear functions on tangent vectors) is given by

$$\omega_G = g^{-1}dg = \begin{pmatrix}0 & -d\theta \\ d \theta & 0\end{pmatrix}$$
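
One can verify this symbolically; here is a minimal sympy sketch (my own check, working with the coefficient of $d\theta$ rather than the form itself):

```python
import sympy as sp

theta = sp.symbols('theta')
g = sp.Matrix([[sp.cos(theta), -sp.sin(theta)],
               [sp.sin(theta),  sp.cos(theta)]])

# dg = g'(theta) d(theta), so g^{-1} dg has coefficient matrix g^{-1} g'.
omega = (g.inv() * g.diff(theta)).applyfunc(sp.simplify)
print(omega)   # -> Matrix([[0, -1], [1, 0]]); multiplied by d(theta),
               #    this is exactly the Maurer-Cartan matrix above.
```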

$\endgroup$
0
$\begingroup$

Here's an example from the study of differential equations:

The Wronskian is the determinant of a matrix whose $i$-th row contains the $(i-1)$-th derivatives of $n$ functions (the first row is the functions themselves). If the determinant is non-zero on a given interval, the functions are linearly independent on that interval. This is a useful property to know for numerous reasons.

For example, say you have a second order, linear, homogeneous differential equation, and furthermore say you have found two solutions, $y_1$ and $y_2$, that satisfy the equation. Now let's say you want to find a specific solution given initial conditions. If you can demonstrate that $y_1$ and $y_2$ are linearly independent, then for any initial conditions $y(t_0) = y_0$ and $y'(t_0) = y_0'$ there exist $c_1$ and $c_2$ such that $y(t) = c_1 y_1(t) + c_2 y_2(t)$ satisfies them. This is because, when you solve for $c_1$ and $c_2$, the Wronskian ends up in the denominator. If $y_1$ and $y_2$ are linearly dependent (getting back to your question about using linear algebra on matrices with functions), the Wronskian will be zero, and $c_1$ or $c_2$ will not be defined. Thus, linear independence of solutions to a second order, linear, homogeneous differential equation helps demonstrate that a specific solution can be found for a given set of initial conditions.
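
For concreteness, here is a short sympy sketch (the ODE $y''-y=0$ and the solutions $e^t$, $e^{-t}$ are a standard textbook example of mine, not taken from the post):

```python
import sympy as sp

t = sp.symbols('t')
y1, y2 = sp.exp(t), sp.exp(-t)   # two solutions of y'' - y = 0

# The Wronskian matrix has functions as entries; its determinant is the
# Wronskian.
W = sp.Matrix([[y1, y2],
               [sp.diff(y1, t), sp.diff(y2, t)]]).det()
print(sp.simplify(W))   # -> -2, nonzero, so y1 and y2 are independent
```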

See here.

$\endgroup$
1
  • $\begingroup$ Welcome to MSE. Please edit and use MathJax to properly format math expressions. $\endgroup$ Commented Feb 6, 2020 at 19:58
