145
$\begingroup$

What is an intuitive meaning of the null space of a matrix? Why is it useful?

I'm not looking for textbook definitions. My textbook gives me the definition, but I just don't "get" it.

E.g.: I think of the rank $r$ of a matrix as the minimum number of dimensions that a linear combination of its columns would have; it tells me that, if I combined the vectors in its columns in some order, I'd get a set of coordinates for an $r$-dimensional space, where $r$ is minimum (please correct me if I'm wrong). So that means I can relate rank (and also dimension) to actual coordinate systems, and so it makes sense to me. But I can't think of any physical meaning for a null space... could someone explain what its meaning would be, for example, in a coordinate system?

Thanks!

$\endgroup$
2
  • 9
    $\begingroup$ Your statement "the rank R of a matrix as the minimum number of dimensions that a linear combination of its columns would have..." should be "the rank R of a matrix as the maximum number of dimensions that a linear combination of its columns would have...". The rank tells you the dimension of a space spanned by the columns. $\endgroup$
    – Tpofofn
    Commented Feb 11, 2011 at 2:01
  • 2
    $\begingroup$ @SalvadorDali, I have taken a look at your activity on MSE and it seems that it consists mostly of minor, insignificant edits. Please stop doing this, this is a kind of behaviour that we try to discourage here, on MSE. Thank you. $\endgroup$
    – Alex M.
    Commented Oct 8, 2016 at 20:31

11 Answers

140
$\begingroup$

If $A$ is your matrix, the null-space is, simply put, the set of all vectors $v$ such that $A \cdot v = 0$. It's good to think of the matrix as a linear transformation; if you let $h(v) = A \cdot v$, then the null-space is again the set of all vectors that are sent to the zero vector by $h$. Think of this as the set of vectors that lose their identity as $h$ is applied to them.

Note that the null-space is equivalently the set of solutions to the homogeneous equation $A \cdot v = 0$.

Nullity is complementary to the rank of a matrix: by the rank-nullity theorem, the rank plus the nullity equals the number of columns. They are both really important; here is a similar question on the rank of a matrix, where you can find some nice answers as to why.
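To make the definition concrete, here is a small numerical sketch; the example matrix and the NumPy/SciPy calls are my own illustration, not part of the original answer:

```python
# Compute an orthonormal basis for the null space {v : A v = 0}.
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])    # rank 1, so the nullity is 3 - 1 = 2

N = null_space(A)                  # columns span the null space
print(N.shape)                     # (3, 2)
print(np.allclose(A @ N, 0))       # True: these vectors are all sent to the zero vector
```

The two columns of `N` are exactly the directions that "lose their identity" under $A$.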

$\endgroup$
6
  • 37
    $\begingroup$ Ohhhhhhhhhhhhhhhhhhhhhhhh the "loses their identity" part made so much sense! So that is why, when we reduce the dimensions of an m * n matrix, the number of vectors that don't lose their identity (the number of pivot columns) + the number of vectors that do (which is dim Null A) is just the total number of columns, n... thanks! It makes so much more sense now! :) $\endgroup$
    – user541686
    Commented Feb 9, 2011 at 7:41
  • 3
$\begingroup$ I guess it is hard for the zero vector to lose its identity, especially under linear maps. However, it is always in the null space... $\endgroup$ Commented Dec 18, 2014 at 17:59
  • 4
    $\begingroup$ In a sense it does lose its identity, as it becomes equivalent to non-zero vectors, and so cannot be distinguished from them. $\endgroup$
    – milcak
    Commented Jun 18, 2015 at 17:24
  • 7
$\begingroup$ I usually use the analogy of "getting squashed" by the transformation: the kernel (null-space) of a transformation consists of those vectors that are squashed into the other space, while the rank represents only those vectors that moved. One can also derive the fact that if you have a linear map between two vector spaces of different dimensions (domain>codomain), some must be squashed, there just isn't enough space for them all. $\endgroup$
    – Andy
    Commented Dec 9, 2016 at 9:00
  • 6
$\begingroup$ Take the projection onto the first dimension, $A = [1, 0]$. Both vectors $(0, 19)$ and $(0, 333)$ are $0$s in the target. How can you "undo" this zero to get the 19 and 333 back? Once mapped by the transformation, they become the same - lose their identity. $\endgroup$
    – milcak
    Commented Oct 12, 2019 at 5:33
53
$\begingroup$

This is an answer I got from my own question; it's pretty awesome!

Let's suppose that the matrix A represents a physical system. As an example, let's assume our system is a rocket, and A is a matrix representing the directions we can go based on our thrusters. So what do the null space and the column space represent?

Well let's suppose we have a direction that we're interested in. Is it in our column space? If so, then we can move in that direction. The column space is the set of directions that we can achieve based on our thrusters. Let's suppose that we have three thrusters equally spaced around our rocket. If they're all perfectly functional then we can move in any direction. In this case our column space is the entire range. But what happens when a thruster breaks? Now we've only got two thrusters. Our linear system will have changed (the matrix A will be different), and our column space will be reduced.

What's the null space? The null space is the set of thruster instructions that completely waste fuel: the instructions under which the thrusters fire, but the rocket's motion doesn't change at all.

Another example: perhaps $A$ represents a rate of return on investments. The range is all the rates of return that are achievable. The null space is all the investments that could be made without changing the rate of return at all.

Another example: room illumination. The range of $A$ represents the area of the room that can be illuminated. The null space of $A$ represents the power we could apply to the lamps without changing the illumination in the room at all.

-- NicNic8
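Here is a rough numerical sketch of the thruster picture (the angles, the 2-D simplification, and the SciPy call are my own assumptions, not part of the quoted answer): the columns of $A$ are the force directions of the three thrusters, and the null space is the set of throttle commands that produce no net force.

```python
import numpy as np
from scipy.linalg import null_space

# Three thrusters spaced 120 degrees apart; column i is the force direction of thruster i.
angles = np.deg2rad([90.0, 210.0, 330.0])
A = np.vstack([np.cos(angles), np.sin(angles)])   # 2 x 3 "thruster" matrix

wasted = null_space(A)                # throttle commands with zero net force
print(wasted.round(3))                # roughly a multiple of [1, 1, 1]^T
print(np.allclose(A @ wasted, 0))     # True: firing all three equally just burns fuel
```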

$\endgroup$
17
$\begingroup$

Imagine a set of map directions at the entrance to a forest. You can apply the directions to different combinations of trails. Some trail combinations will lead you back to the entrance. They are the null space of the map directions.

$\endgroup$
2
  • 1
    $\begingroup$ Could you describe what the A matrix would look like in this case? $\endgroup$
    – J. Doe
    Commented Dec 8, 2020 at 21:58
  • $\begingroup$ @J.Doe The columns of the associated matrix A would be the map of directions. The combinations of trails would be the vector v. $\endgroup$
    – chevestong
    Commented Nov 5, 2023 at 18:19
12
$\begingroup$

I find the easiest way to visualise the null space is to consider the matrix mapping that sends a vector to its shadow on $y=0$, cast by a fixed light source that is far away.

The null space of this mapping is spanned by a vector pointing directly towards the light source, because the vector representing its shadow on $y=0$ will be $0$. Generally, we can see from this example that for some mappings there will exist vectors which will always be mapped to $0$.

We can also see how the other vectors undergo a non-invertible mapping. That is, from the shadow, the best you could do is reconstruct the original vector up to an ambiguity in the direction of the null space.

(This is also good because you can do it on a table with a pen)

[Diagram: a vector, its shadow on $y=0$, and parallel rays from the distant light source]
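A minimal sketch of this picture, assuming the light is directly overhead so that the shadow map is the linear projection onto $y = 0$ (the matrix and the code are my own illustration):

```python
import numpy as np

P = np.array([[1.0, 0.0],
              [0.0, 0.0]])             # sends (x, y) to its shadow (x, 0)

print(P @ np.array([0.0, 1.0]))        # [0. 0.]: a vector pointing at the light casts no shadow
print(P @ np.array([3.0, 1.0]))        # [3. 0.]
print(P @ np.array([3.0, -7.0]))       # [3. 0.]: same shadow, so the map can't be inverted
```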

$\endgroup$
2
$\begingroup$ The only drawback here is that it is not a linear map. It becomes linear if we move the light source (the lamp) infinitely far away so that the rays become parallel, and consider the Sun instead of the lamp. The parallel rays make it very illustrative that each shadow is described by a line that is parallel to the null space, that is, the null space characterizes the map in this sense. $\endgroup$
    – A.Γ.
    Commented Feb 23, 2021 at 17:56
  • 1
    $\begingroup$ Good point - I've updated the diagram :) $\endgroup$
    – BHC
    Commented Feb 23, 2021 at 19:01
9
$\begingroup$

The rank $r$ of a matrix $A \in \mathbb{R}^{m \times n}$ is, as you have said, the dimension of the column space ($r$ is also the dimension of the row space), i.e. the dimension of the space spanned by linear combinations of the columns of $A$, equivalently the range of $A$. (The word "minimum" in the question is unnecessary.) However, each column vector has $m$ components, so the vectors in the range of $A$ have $m$ components but span only an $r$ $(\leq m)$-dimensional subspace rather than the whole $m$-dimensional space. So we are missing the remaining $(m-r)$-dimensional part of the $m$-dimensional space.

The left null-space now plays the role of spanning the remaining $(m-r)$-dimensional subspace. This is why the left null-space is orthogonal to the column space. So the left null-space together with the column space spans the entire $m$-dimensional space, i.e. if $C = \{y \in \mathbb{R}^{m \times 1}: y = Ax\text{ for some }x \in \mathbb{R}^{n \times 1} \}$ and $Z_L = \{z \in \mathbb{R}^{m \times 1}:z^T A = 0 \}$,

then $Z_L \oplus C = \mathbb{R}^{m}$ and $Z_L \perp C$.

The right null-space plays the analogous role for the rows. The rows span only an $r$-dimensional subspace of the $n$-dimensional space. The right null-space now plays the role of spanning the remaining $(n-r)$-dimensional subspace. This is why the right null-space is orthogonal to the row space. So the right null-space together with the row space spans the entire $n$-dimensional space, i.e. if $R = \{y \in \mathbb{R}^{n \times 1}: y = A^Tx\text{ for some }x \in \mathbb{R}^{m \times 1} \}$ and $Z_R = \{z \in \mathbb{R}^{n \times 1}: Az = 0 \}$,

then $Z_R \oplus R = \mathbb{R}^{n}$ and $Z_R \perp R$.
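A quick numerical check of these statements (the example matrix and the SciPy calls are mine, purely to illustrate the dimensions):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 0.0],
              [2.0, 4.0, 0.0]])          # m = 2, n = 3, rank r = 1

Z_R = null_space(A)          # right null space, dimension n - r = 2
Z_L = null_space(A.T)        # left null space,  dimension m - r = 1

print(Z_R.shape, Z_L.shape)              # (3, 2) (2, 1)
print(np.allclose(Z_L.T @ A, 0))         # True: Z_L is orthogonal to the column space
print(np.allclose(A @ Z_R, 0))           # True: Z_R is orthogonal to the row space
```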

$\endgroup$
1
  • 2
    $\begingroup$ This is a good mathematical explanation, but it's not really intuitive for me. (It's probably just me, not your explanation.) So far, I've tended to think of linear algebra as a tool for figuring out the number of independent variables (slash, coordinates) in an equation (or matrix), so putting it in terms of that would be more intuitive to me than just a purely mathematical definition of rows and columns. It makes me intuitively see answers without worrying about vocabulary. But +1, nice explanation anyhow. :) $\endgroup$
    – user541686
    Commented Feb 9, 2011 at 7:47
0
$\begingroup$

Think of an observer and $n$ speakers at different distances and in different directions. Now write a matrix of equations for the sound from each speaker, based on the contributions of their amplitudes, frequencies, and phases. The null space is formed by all the possible combinations you can set such that the total, superimposed sound at the observer's location is zero. That means the observer will not hear anything even though the speakers are playing.
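A toy numerical version of this, with one observer and three speakers, ignoring frequency and phase for simplicity (the attenuation numbers and the SciPy call are my own illustration):

```python
import numpy as np
from scipy.linalg import null_space

# One observer (one row); entry j is the contribution of speaker j's amplitude
# at the observer's location.
A = np.array([[0.5, 1.0, 0.25]])

silent = null_space(A)                 # amplitude settings the observer cannot hear
print(silent.shape)                    # (3, 2): a 2-dimensional family of "silent" settings
print(np.allclose(A @ silent, 0))      # True: the superimposed sound at the observer is zero
```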

$\endgroup$
0
$\begingroup$

It's the solution space of the matrix equation $AX=0$. It includes the trivial solution, the zero vector $0$. If $A$ is row equivalent to the identity matrix, then the zero vector is the only element of the solution space. If it is not, i.e. when the column space of $A$ has dimension less than the number of columns of $A$, then the equation $AX=0$ has nontrivial solutions, which form a vector space whose dimension is termed the nullity.

$\endgroup$
0
$\begingroup$

If your matrix is $A$ (it doesn't have to be square; say it is $m \times n$) and has rank $r < n$, then the null space is spanned by $n-r$ orthogonal vectors and is the subspace of $\mathbb{R}^{n}$ orthogonal to the row space of $A$ (i.e., the linear combinations of basis vectors of $\mathbb{R}^{n}$ that are orthogonal to the $r$ basis vectors of the row space). See the rank-nullity theorem. In the simplest example, if $A=\left[\begin{array}{cc} 1&0\\ 0&0 \end{array}\right] $, then $\operatorname{span}(A)=\alpha\left[\begin{array}{c} 1\\ 0 \end{array}\right], \alpha \in \mathbb{R}$, and $\operatorname{null}(A)=\beta\left[\begin{array}{c} 0\\ 1 \end{array}\right], \beta \in \mathbb{R}$.

Play around with "null" in base Matlab, or with the SVD in Python like in this answer, where it can be seen that zero singular values correspond to right singular vectors (rows of $V^T$) that span the null space (also see here).
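A sketch of that SVD route in Python (the example matrix is my own; `scipy.linalg.null_space` wraps the same idea in one call):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])

U, s, Vt = np.linalg.svd(A)
null_mask = s <= 1e-12                 # (numerically) zero singular values
null_basis = Vt[null_mask].T           # corresponding right singular vectors
# (for a rectangular A, any extra rows of Vt beyond len(s) also lie in the null space)
print(null_basis)                      # a multiple of [0, 1]^T, matching the example above
print(np.allclose(A @ null_basis, 0))  # True
```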

$\endgroup$
0
$\begingroup$

In mechanical engineering, an example of $AX=B$ is found in the finite element method, where $KU=F$: $K$ is the stiffness matrix, $U$ is the vector of nodal displacements, and $F$ represents the lumped forces at the same nodes. When $KU=0$ with $U$ not equal to the zero vector, it means that the structure can move without developing any internal forces, i.e. the stiffness does not resist the displacement pattern defined by $U$ (for an unconstrained structure these are, for example, rigid-body motions).
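A tiny illustration of this, assuming a single unconstrained spring element with two nodes (the numbers and the SciPy call are my own, not from the answer):

```python
import numpy as np
from scipy.linalg import null_space

k = 5.0                                 # spring stiffness
K = np.array([[ k, -k],
              [-k,  k]])                # 2-node spring element, no supports

U0 = null_space(K)
print(U0.round(3))                      # a multiple of [1, 1]^T: both nodes translate together
print(np.allclose(K @ U0, 0))           # True: this displacement develops no forces
```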

$\endgroup$
0
$\begingroup$

There is another perspective on the null space of a matrix $A$, coming from the equation $AX = 0$:

  • We can think of the vector $X$ as being orthogonal to every row vector of $A$, since $AX = 0$ says that each row of $A$ dotted with $X$ is zero
  • The null space of $A$ can then be defined as the set of all vectors orthogonal to the rows of $A$, i.e. the orthogonal complement of the row space (a quick numerical check is sketched below)
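A quick check of that orthogonality reading (the example matrix and the code are my own):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])

N = null_space(A)                     # nullity = 3 - 2 = 1, so one basis column
for row in A:
    print(np.allclose(row @ N, 0))    # True: each row of A is orthogonal to the null space
```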
$\endgroup$
0
$\begingroup$

Let an $m \times n$ matrix represent a set of movie-person preference data: the $n$ columns represent $n$ different movies, the $m$ rows represent $m$ different persons, and each entry shows a particular person's preference value for a particular movie.

Column space: a set of different movies, each with a unique fandom, that is, a distinct preference profile. Row space: a world of movie viewers, each person with a different (possibly independent) taste.

Then the null space (kernel) can be thought of as a possible movie that is not (and cannot be) liked by any person. The left null space would be an eccentric person with a weird taste that cannot be satisfied by any existing movie, or by any future movie built from combinations of the existing movies. Those movies and persons, however, do exist in the entire movie-person space.

$\endgroup$
