This also depends a lot on what you actually need. As other answerers have pointed out, you first need to specify what linear algebra is. I define it as the study of finite-dimensional vector spaces and their linear transformations; in particular, it contains most of matrix theory.
As has been pointed out, many natural questions in linear algebra have long been settled. Classifications are often easy and can be explained in a few lectures - in particular, the Jordan normal form, diagonalisability and perturbation theory are more or less completely understood. Furthermore, it is easy to construct an example and examine it with a computer (most of what computers are good at is linear algebra). In this sense the field has been "solved": given a specific linear map, we can answer just about any question we have about it. The problems begin once we have to deal with infinitely many matrices at once.
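To illustrate how routine the single-matrix case is, here is a minimal sketch using NumPy (the matrix is an arbitrary example I chose): eigenvalues, rank and diagonalisability of a concrete map are each a one-line computation.

```python
import numpy as np

# A concrete linear map: for one specific matrix, the standard questions
# (eigenvalues, rank, diagonalisability) are answered instantly.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)
rank = np.linalg.matrix_rank(A)

# A is diagonalisable iff its eigenvectors form a basis, i.e. the
# matrix whose columns are the eigenvectors has full rank.
diagonalisable = np.linalg.matrix_rank(eigvecs) == A.shape[0]
print(sorted(eigvals.real), rank, diagonalisable)
```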
However, when you want to apply linear algebra to other fields, the picture is very different. Most of the questions I encounter there are hard, and their solutions are difficult. Given a certain set of matrices that crops up in some physics problem, can I bound the second-largest singular value? This might be very hard. Many fields in applied mathematics suffer from this problem - one very prominent example is signal reconstruction (compressed sensing was only discovered recently; many questions in this field are answered using random matrix theory - but from what I gather, that is only because linear algebra is not developed enough to deal with them directly).
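The contrast is worth seeing: for any *single* matrix the singular values are trivial to compute, and the difficulty only appears when you need a bound uniform over a whole family. A quick sketch with NumPy (the random matrix is purely illustrative):

```python
import numpy as np

# For one matrix, the second-largest singular value is a one-liner;
# the hard problem is bounding it over an infinite family of matrices.
rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))

s = np.linalg.svd(M, compute_uv=False)  # singular values, descending order
second_largest = s[1]
assert s[0] >= s[1] >= s[-1] >= 0
```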
A bit more concretely:
Consider for example the following very natural question:
Given two Hermitian matrices $A$ and $B$ with given spectra, what are the possibilities for the spectrum of $A+B$?
This is known as Horn's problem and was only solved in this century (see also this MO question). It also took some of the smartest mathematicians to finally solve the problem.
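Two of the *easy* necessary conditions - trace additivity and Weyl's inequality $\lambda_1(A+B) \le \lambda_1(A) + \lambda_1(B)$ - can at least be checked numerically; the full answer is a complicated system of such inequalities. A sketch using NumPy on random Hermitian matrices of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hermitian(n):
    # Symmetrise a random complex matrix to get a Hermitian one.
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (X + X.conj().T) / 2

n = 4
A = random_hermitian(n)
B = random_hermitian(n)

a = np.sort(np.linalg.eigvalsh(A))[::-1]      # eigenvalues, descending
b = np.sort(np.linalg.eigvalsh(B))[::-1]
c = np.sort(np.linalg.eigvalsh(A + B))[::-1]

# Two necessary conditions from Horn's problem:
assert abs(c.sum() - (a.sum() + b.sum())) < 1e-9   # trace additivity
assert c[0] <= a[0] + b[0] + 1e-9                  # Weyl's inequality
```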
Also consider the following problem:
Define a Hadamard matrix as an $n\times n$ matrix with entries in $\{\pm 1\}$ and mutually orthogonal rows. In what dimensions does such a matrix exist?
Hadamard matrices have been studied since the 19th century - yet this question is still not completely solved. It is known that the dimension must be $1$, $2$, or a multiple of $4$, and conjecturally every multiple of $4$ occurs. The problem may seem artificial at first, but it has many interesting applications in information theory.
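The positive side of the story is classical: Sylvester's doubling construction produces a Hadamard matrix in every power-of-2 dimension, which is easy to verify numerically. A minimal sketch:

```python
import numpy as np

def sylvester_hadamard(k):
    # Sylvester's construction: if H is an n x n Hadamard matrix, then
    # [[H, H], [H, -H]] is a 2n x 2n one.  Starting from [[1]] this
    # yields every power of 2; other dimensions are the hard part.
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

H = sylvester_hadamard(3)   # 8 x 8
n = H.shape[0]
# Mutually orthogonal rows with +-1 entries means H H^T = n I.
assert np.array_equal(H @ H.T, n * np.eye(n, dtype=int))
```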
Here is a list of many open problems in matrix theory - many of which a lot of smart people have worked on: Open problems in matrix theory
It gets even worse if you also consider tensor products. While you could say that this is in principle "multilinear algebra", it can also be considered a subset of matrix theory and would ordinarily be taught in linear algebra courses. Tensor products and questions such as the tensor rank are notoriously difficult - even computation only gets you so far, as many problems are NP-complete or even undecidable in general (but might be very much decidable for interesting subclasses).
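For matrices the situation is still benign: rank is multiplicative under the Kronecker (tensor) product, $\operatorname{rank}(A\otimes B) = \operatorname{rank}(A)\operatorname{rank}(B)$, and is trivially computable. It is for higher-order tensors that rank computation becomes NP-hard. A sketch of the easy matrix case (the matrices are arbitrary examples):

```python
import numpy as np

# For matrices, rank behaves perfectly under the Kronecker product:
# rank(A (x) B) = rank(A) * rank(B).  No such tractable theory exists
# for the rank of order-3 (and higher) tensors.
A = np.array([[1, 2],
              [2, 4]])          # rank 1
B = np.array([[1, 0],
              [0, 1]])          # rank 2
K = np.kron(A, B)               # 4 x 4 Kronecker product

assert np.linalg.matrix_rank(K) == \
       np.linalg.matrix_rank(A) * np.linalg.matrix_rank(B)
```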
Another branch of mathematics that I would consider linear algebra concerns linear maps between matrices. Since $n\times n$ matrices are themselves a vector space, such maps can also be represented as matrices. This class is particularly important for quantum mechanics and contains a lot of hard questions.
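Concretely, the standard way to represent such a "superoperator" as a matrix is vectorisation: with column-stacking $\operatorname{vec}$, the linear map $X \mapsto AXB$ has matrix $B^{\mathsf T} \otimes A$. A minimal numerical check with NumPy (random matrices, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A, B, X = (rng.standard_normal((n, n)) for _ in range(3))

def vec(M):
    # Column-stacking vectorisation: stack the columns of M into one vector.
    return M.flatten('F')

# The map X -> A X B is linear in n^2 variables; its matrix with respect
# to vec is the Kronecker product B^T (x) A.
L = np.kron(B.T, A)             # n^2 x n^2 matrix acting on vec(X)
assert np.allclose(L @ vec(X), vec(A @ X @ B))
```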
Nevertheless, I do agree that we probably understand much more about linear algebra, and have a much wider variety of tools available, than in most other areas of mathematics.