83
$\begingroup$

In a recent question, it was discussed how LA is a foundation to other branches of mathematics, be they pure or applied. One answer argued that linear problems are fully understood, and hence a natural target to reduce pretty much anything to.

Now, it's evident enough that such a linearisation, if possible, tends to make hard things easier. Find a Hilbert space hidden in your domain, obtain an orthonormal basis, and bam, any point/state can be described as a mere sequence of numbers and any mapping boils down to a matrix; we have good theorems for the existence of inverses / eigenvectors / exponential objects / etc.

So, LA sure is convenient.

OTOH, it seems unlikely that any nontrivial mathematical system could ever be said to be thoroughly understood. Can't we always find new questions within any such framework that haven't been answered yet? I'm not firm enough with Gödel's incompleteness theorems to judge whether they are relevant here. The first incompleteness theorem says that any consistent, effectively axiomatised theory containing basic arithmetic, like number theory, must be incomplete. Surely this applies all the more to e.g. topology.

Is LA for some reason exempt from such arguments, or does it for some other reason deserve to be called the best understood branch of mathematics?

$\endgroup$
5
  • 4
    $\begingroup$ I would say a better way to describe this is to say that Linear Algebra has been studied more thoroughly than other regions of mathematics. $\endgroup$ Commented Mar 16, 2016 at 21:08
  • 1
    $\begingroup$ As I understood the question: what math can be reduced to LA and what has to be constructed without it (as a basis for LA)? Then at least mathematical logic (type/proof theory) currently has to be built first. Generally, it's a typical foundations question, "whose theory is better", just a holy war; but I never heard that somebody proposed LA as the foundation. $\endgroup$
    – Andrew
    Commented Mar 16, 2016 at 21:17
  • $\begingroup$ There's not really anything left to do in linear algebra-- that is, the study of finite-dimensional vector spaces and maps between them. The generalization of it, which is very much a current active area of research, would be studying modules over general rings. $\endgroup$
    – anomaly
    Commented Mar 16, 2016 at 21:36
  • 1
    $\begingroup$ One of the remaining questions in the linear algebra of finite dimensional vector spaces is to determine the greatest rank of a tensor in a given tensor product. en.wikipedia.org/wiki/Tensor_rank_decomposition $\endgroup$ Commented Mar 17, 2016 at 13:08
  • 5
    $\begingroup$ Not by me it isn't. $\endgroup$
    – Tahlor
    Commented Mar 18, 2016 at 13:14

7 Answers

89
$\begingroup$

It's closer to true that all the questions in finite-dimensional linear algebra that can be asked in an introductory course can be answered in an introductory course. This is wildly far from true in most other areas. In number theory, algebraic topology, geometric topology, set theory, and theoretical computer science, for instance, here are some questions you could ask within a week of beginning the subject: How many pairs of primes differ by two? How many homotopically distinct maps are there between two given spaces? How can we tell apart two four-dimensional manifolds? Are there sets strictly between a given set and its powerset in cardinality? Are various natural complexity classes actually distinct?

None of these questions are very close to completely answered, only one is really anywhere close, in fact one is known to be unanswerable in general, and partial answers to each of these have earned massive accolades from the mathematical community. No such phenomena can be observed in finite dimensional linear algebra, where we can classify in a matter of a few lectures all possible objects, give explicit algorithms to determine when two examples are isomorphic, determine precisely the structure of the spaces of maps between two vector spaces, and understand in great detail how to represent morphisms in various ways amenable to computation. Thus linear algebra becomes both the archetypal example of a completely successful mathematical field, and a powerful tool when other mathematical fields can be reduced to it.
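To make the "explicit algorithms amenable to computation" point concrete: a question like "what is the rank of this matrix?" is settled by a short, exact procedure that provably terminates with the right answer. A minimal sketch in pure Python (my own illustration), using Gaussian elimination over the rationals:

```python
from fractions import Fraction

def matrix_rank(rows):
    """Rank of a rational matrix via exact Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    rank = 0
    for col in range(len(m[0]) if m else 0):
        # find a pivot row with a nonzero entry in this column
        pivot = next((r for r in range(rank, len(m)) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        # eliminate the entries below the pivot
        for r in range(rank + 1, len(m)):
            factor = m[r][col] / m[rank][col]
            m[r] = [a - factor * b for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

print(matrix_rank([[1, 2, 3], [2, 4, 6], [1, 0, 1]]))  # 2 (row 2 is twice row 1)
```

Because `Fraction` arithmetic is exact, there is no numerical ambiguity here; compare that certainty with the open-ended questions above.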

This is an extended explanation of the claim that linear algebra is "thoroughly understood." That doesn't mean "totally understood," as you claim!

$\endgroup$
4
  • 2
    $\begingroup$ The Mortal Matrix problem (see my comment on D_S's answer) could be asked within a week of starting a Linear Algebra course, I think. $\endgroup$ Commented Mar 17, 2016 at 0:11
  • $\begingroup$ @RobertIsrael it's a good point. I was going for problems that are somehow intuitively very central to their field. Of course, the undecidability of homeomorphism between high-dimensional manifolds reduces to the word problem for groups, which is quite similar to the problem you cite. But perhaps the problem itself is still more natural and central...I don't know. $\endgroup$ Commented Mar 17, 2016 at 6:36
  • $\begingroup$ Is it really that linear algebra is more "well understood"? I feel that it's inherently "easier"? Like, I feel like it's possible to understand things that are difficult very well, and it's also possible to not understand things that are easy... no? $\endgroup$
    – user541686
    Commented Mar 19, 2016 at 7:20
  • 2
    $\begingroup$ It's not quite right to say that the isomorphism problem for objects is easy or solved in linear algebra. As the n-dimensional analogue of plane geometry, in linear algebra one would at least want to classify configurations of $k$ subspaces up to linear change of coordinates. I don't think I have seen the case $k=3$ written out. For $k=4$ things are definitely complicated, with continuous moduli. Relative position of two linear operators is another natural question that leads to notorious "wild" isomorphism problems. $\endgroup$
    – zyx
    Commented Mar 19, 2016 at 17:33
42
$\begingroup$

The hard parts of linear algebra have been given new and different names, such as representation theory, invariant theory, quantum mechanics, functional analysis, Markov chains, C*-algebras, numerical methods, commutative algebra, and K-theory. Those are full of mysteries and open problems.

What is left over as the "linear algebra" taught to students is a much smaller subject that was mostly finished a hundred years ago.

$\endgroup$
5
  • 2
    $\begingroup$ Also, Lie groups? $\endgroup$ Commented Mar 18, 2016 at 17:31
  • 2
    $\begingroup$ Certainly, but the intersection of Lie/algebraic group theory with linear algebra is usually called representation theory. $\endgroup$
    – zyx
    Commented Mar 18, 2016 at 23:21
  • $\begingroup$ @KevinCarlson, did you mean "is comm-alg really part of linear algebra"? Since collections of operators can be analyzed most effectively in the case when they commute, and the noncommutative theory has other names, I'd say yes. $\endgroup$
    – zyx
    Commented Mar 22, 2016 at 1:44
  • 1
    $\begingroup$ @zyx Yeah, that's right, thanks. But that only seems relevant to the case of commutative algebras of operators on a vector space. Commutative algebra studies arbitrary commutative rings, no? Or do you just mean that part of commutative algebra is a difficult area of linear algebra? $\endgroup$ Commented Mar 22, 2016 at 3:28
  • $\begingroup$ I meant that the commutative algebra immediately relevant to linear algebra would sometimes look like baby functional analysis, but would at other times be most naturally studied in the full generality of commutative rings and modules, categories of those, homological algebra in those categories, the whole machine. @KevinCarlson $\endgroup$
    – zyx
    Commented Mar 22, 2016 at 3:43
25
$\begingroup$

Journals such as "Linear Algebra and its Applications" still publish papers, so certainly not everything about Linear Algebra is known. It's definitely not exempt from Gödel. However, it is relatively well-understood compared to most other subject areas, and a "typical" (in some sense) not-overly-complicated question has, empirically, a fairly high probability of being answerable without too much difficulty.

$\endgroup$
17
$\begingroup$

This also depends a lot on what you actually need. As other answerers have pointed out, you first need to specify what linear algebra is about. I define linear algebra as the study of finite-dimensional vector spaces and their linear transformations; in particular, it contains most of matrix theory.

As has been pointed out, many natural questions in linear algebra have been solved for many years. Classifications are often easy and can be explained in a few lectures - in particular, the Jordan normal form, diagonalisability and perturbation theory are more or less completely understood. Furthermore, it is easy to construct an example and examine it with a computer (much of what computers are good at is linear algebra). In this sense, the field has been "solved": if we have a specific linear map, we can answer just about any question we have about it. The problems begin once we have infinitely many matrices.

However, when you want to apply linear algebra to other fields, the picture is very different. Most of the questions I encounter are hard and their solutions are difficult. Given a certain set of matrices that crops up in some physics problem, can I bound the second largest singular value? This might be very hard. Many fields in applied mathematics suffer from this problem - one very prominent example is signal reconstruction (compressed sensing was only discovered recently; many questions in this field are answered using random matrix theory, but from what I gather that is only because linear algebra is not developed enough to deal with them).
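Note the asymmetry: for any single small matrix, computing singular values is routine; the hard part is bounding them uniformly over a whole family. To illustrate the easy direction, a pure-Python sketch for the $2\times 2$ case (example matrix made up), using the fact that the singular values are the square roots of the eigenvalues of $A^T A$:

```python
import math

def singular_values_2x2(A):
    """Singular values of a real 2x2 matrix: square roots of the
    eigenvalues of the Gram matrix A^T A."""
    (p, q), (r, s) = A
    a = p * p + r * r          # (A^T A)[0][0]
    b = p * q + r * s          # (A^T A)[0][1] == (A^T A)[1][0]
    c = q * q + s * s          # (A^T A)[1][1]
    disc = math.sqrt((a - c) ** 2 + 4 * b * b)
    hi, lo = (a + c + disc) / 2, (a + c - disc) / 2
    return math.sqrt(hi), math.sqrt(max(lo, 0.0))

sigma1, sigma2 = singular_values_2x2([[3, 0], [0, 4]])  # 4.0, 3.0
```

For families of matrices, no such closed form exists in general, which is exactly where the hard applied questions start.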

A bit more concretely:

Consider for example the following very natural question:

Given two Hermitian matrices $A$ and $B$ with given spectra, what are the possibilities for the spectrum of $A+B$?

This is known as Horn's problem and was only solved in this century (see also this MO question). It also took some of the smartest mathematicians to finally solve the problem.
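Even before the full solution, some necessary conditions are elementary: the eigenvalues of $A+B$ must sum to $\operatorname{tr} A + \operatorname{tr} B$, and Weyl's inequality gives $\lambda_{\max}(A+B) \le \lambda_{\max}(A) + \lambda_{\max}(B)$. A quick numerical check on made-up real symmetric $2\times 2$ examples:

```python
import math

def eigvals_sym_2x2(M):
    """Eigenvalues (descending) of a real symmetric 2x2 matrix."""
    (a, b), (_, c) = M
    disc = math.sqrt((a - c) ** 2 + 4 * b * b)
    return (a + c + disc) / 2, (a + c - disc) / 2

A = [[2, 1], [1, 0]]
B = [[1, -1], [-1, 3]]
S = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

lA, lB, lS = eigvals_sym_2x2(A), eigvals_sym_2x2(B), eigvals_sym_2x2(S)
# Weyl's inequality: lambda_max(A + B) <= lambda_max(A) + lambda_max(B)
print(lS[0] <= lA[0] + lB[0] + 1e-12)            # True
# trace identity: eigenvalues of A + B sum to tr(A) + tr(B)
print(abs(sum(lS) - sum(lA) - sum(lB)) < 1e-12)  # True
```

These are only the simplest of the inequalities; the hard part of Horn's problem was showing that a certain recursively defined family of such inequalities is complete.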

Also consider the following problem:

Define a Hadamard matrix as an $n\times n$ matrix with entries in $\{\pm 1\}$ and mutually orthogonal rows. In what dimensions does such a matrix exist?

Hadamard matrices have been studied for well over a century - yet this question is still not completely solved. The problem may seem artificial at first, but it has many interesting applications in information theory.
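For powers of two the answer has long been known: Sylvester's doubling step $H \mapsto \begin{pmatrix} H & H \\ H & -H \end{pmatrix}$ preserves orthogonality. A small pure-Python sketch (my own illustration):

```python
def sylvester_hadamard(k):
    """Hadamard matrix of order 2**k via Sylvester's doubling construction."""
    H = [[1]]
    for _ in range(k):
        # H -> [[H, H], [H, -H]] doubles the order and keeps the rows orthogonal
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

def is_hadamard(H):
    """Check +-1 entries and pairwise orthogonal rows."""
    n = len(H)
    entries_ok = all(x in (1, -1) for row in H for x in row)
    rows_ok = all(
        sum(H[i][t] * H[j][t] for t in range(n)) == (n if i == j else 0)
        for i in range(n) for j in range(n)
    )
    return entries_ok and rows_ok

print(is_hadamard(sylvester_hadamard(3)))  # True: order 8 works
```

The open question (the Hadamard conjecture) concerns the remaining multiples of 4, where no such uniform construction is known.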

Here is a list of many open problems in matrix theory - many of which a lot of smart people have worked on: Open problems in matrix theory

It gets even worse if you also consider tensor products. While you could say that this is in principle "multilinear algebra", it can also be considered part of matrix theory and would ordinarily be taught in linear algebra courses. Tensor products and questions such as the tensor rank are notoriously difficult - even computation only gets you so far, as many problems are NP-hard or even undecidable in general (but might be very much decidable for interesting subclasses).

Another branch of mathematics that I would consider linear algebra concerns linear maps between matrices. Since $n\times n$ matrices are themselves a vector space, such maps can also be represented as matrices. This class is particularly important for quantum mechanics and contains a lot of hard questions.
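A concrete way to see this representation: with column-major vectorisation, the map $X \mapsto AXB$ corresponds to the Kronecker product $B^{T} \otimes A$, via the standard identity $\operatorname{vec}(AXB) = (B^{T} \otimes A)\operatorname{vec}(X)$. A pure-Python check on made-up $2\times 2$ examples:

```python
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def kron(A, B):
    """Kronecker product A (x) B."""
    return [[A[i][j] * B[p][q] for j in range(len(A[0])) for q in range(len(B[0]))]
            for i in range(len(A)) for p in range(len(B))]

def vec(X):
    """Column-major vectorisation of X as a flat list."""
    return [X[i][j] for j in range(len(X[0])) for i in range(len(X))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [2, 1]]
X = [[2, 0], [1, 3]]

BT = [[B[j][i] for j in range(2)] for i in range(2)]
M = kron(BT, A)  # the 4x4 matrix representing X -> A X B
Mv = [sum(M[i][j] * vec(X)[j] for j in range(4)) for i in range(4)]
print(Mv == vec(mat_mul(mat_mul(A, X), B)))  # True
```

So questions about such "superoperators" (e.g. positivity properties in quantum mechanics) become honest, and often hard, matrix questions.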

Nevertheless, I do agree that we probably understand much more about linear algebra, and have a much wider variety of tools available, than in other areas of mathematics.

$\endgroup$
2
  • 1
    $\begingroup$ You left out an important part of the definition of Hadamard matrix: the rows are mutually orthogonal. $\endgroup$ Commented Mar 18, 2016 at 19:38
  • 1
    $\begingroup$ @RobertIsrael: Of course! Thanks - otherwise the existence would of course be trivial... $\endgroup$
    – Martin
    Commented Mar 18, 2016 at 19:41
11
$\begingroup$

I'm the one who answered saying that, "roughly speaking" (I'm quoting myself now), all we fully understand in mathematics is linear, and that it is common practice to reduce mathematical objects and problems to linear ones in order to solve them (I was thinking of linear representations, as well as the use of differentials, tangent and cotangent spaces, etc. in differential geometry).

I know I'm not saying anything new here. I was speaking generally, in a broad sense, pointing out the reason why linear algebra is so important and so involved in so many fields of mathematics, from linear representations to functional analysis, etc.

Anyway, since leftaroundabout wants to be specific and absolute, I will also point out that the relation "$x=a$", which everyone fully understands, is a linear relation and in most cases represents the "solution" of a problem. Being the solution means that no further explanation is needed, i.e. it is "fully understood". This is to say that what we fully understand is indeed linear.

P.S.: by the way, to say that what we fully understand is linear is not the same as saying that we fully understand all of linear algebra, so I don't see the need to bring Gödel's theorems into this at all (?) ...

$\endgroup$
8
$\begingroup$

When people are talking about linear algebra being a foundation for other areas of math, they don't mean a logical foundation. They mean the intuition, basic facts, and methods of proof constantly show up again and again in areas like functional analysis, field theory, algebraic groups, and a bajillion other places.

Gödel's incompleteness theorem is irrelevant to most mathematicians' work. Undecidability is not something they encounter very much. Yes, one can probably find some statements in linear algebra which are undecidable in ZFC. No, they are probably not going to be very interesting statements. For most mathematical disciplines, from applied math to representation theory to algebraic geometry, most of the natural questions you ask are going to have an answer that is provable in ZFC.

$\endgroup$
4
  • 13
    $\begingroup$ There are actually quite a few rather natural undecidable problems. Here's one in Linear Algebra. The mortal matrix problem: given a finite set of $n \times n$ matrices with integer entries, determine whether they can be multiplied in some order, possibly with repetition, to yield the zero matrix. This is known to be undecidable for a set of six or more $3 \times 3$ matrices, or a set of two $15 \times 15$ matrices. $\endgroup$ Commented Mar 16, 2016 at 23:56
  • 3
    $\begingroup$ Well this is not decidability in the sense that I was using it. You meant there is no algorithm which can produce a definitive yes or no answer. I meant statements which cannot be proved or disproved in ZFC. $\endgroup$
    – D_S
    Commented Mar 17, 2016 at 0:34
  • 2
    $\begingroup$ @RobertIsrael I'm pretty sure D_S means independent of ZFC, not non-computable/recursive. $\endgroup$
    – badcook
    Commented Mar 17, 2016 at 0:37
  • 17
    $\begingroup$ The two are connected. If for every instance the answer is provable in ZFC (and ZFC is $\omega$-consistent), then there would be an algorithm: search for a valid proof. The fact that there is no algorithm means either there is some instance where the answer is "no" but that fact is independent of ZFC, or there is some instance where the answer is "no" but there is a proof in ZFC that the answer is "yes" (note that if the answer is "yes", ZFC can verify that the answer is "yes"). $\endgroup$ Commented Mar 17, 2016 at 5:23
8
$\begingroup$

Linear algebra is basically the study of putting things on a grid. This has been the big driver for computers, starting with the Berlin Airlift linear optimization problems from 1948 (the first large-scale use of computers), to the ability to do pivots in VisiCalc in 1979 (the first killer app), to the many special effects in games and movies, and recently the AlphaGo AI beating a top-ranked player at Go. Linear algebra applications have been a chief driver of increasing computer power since the 1940s and have been responsible for trillions of dollars in profit. This has also allowed linear algebra to be studied about, oh, $10^{20}$ times more intensively than branches of mathematics that don't mesh well with computers.

There are many unsolved problems within linear algebra, but they tend to be the problems with no obvious profit angle.

$\endgroup$
