(I intended this to be a comment, but it grew slightly too long, so it is an answer.)
For any $m \times n$ matrix $A$, the following two conditions are equivalent:
- There is an $n \times 1$ vector $x$, with all entries nonzero, for which $Ax = 0$.
- Every column of $A$ can be written as a linear combination of the other columns of $A$.
Compare this to the better-known equivalence: there is a nonzero vector $x$ for which $Ax = 0$ if and only if the columns of $A$ are linearly dependent (i.e., if and only if some column of $A$ is a linear combination of the others).
To prove the equivalence, fix an $m \times n$ matrix $A$, and denote its columns by $c_1, \dots, c_n$.
Suppose there is a column vector $x = (x_j)_{j=1}^n$ with all nonzero entries for which $Ax = 0$, and fix any $1 \leq k \leq n$. As the matrix product $Ax$ is $\sum_{j=1}^n x_j c_j$ by definition, we deduce that $\sum_{j=1}^n x_j c_j = 0$, and hence $x_k c_k = -\sum_{j \neq k} x_j c_j$, and hence $c_k = -\sum_{j \neq k} x_k^{-1} x_j c_j$ is a linear combination of the other columns of $A$. As $k$ was arbitrary, every column of $A$ is a linear combination of the other columns of $A$.
Conversely, suppose that every column of $A$ is a linear combination of the other columns of $A$. Then for each $1 \leq k \leq n$ we can write $c_k = \sum_{j \neq k} \beta_j c_j$, and setting $x_k^k = 1$ and $x_j^k = -\beta_j$ for $j \neq k$ yields a column vector $x^k = (x_j^k)_{j=1}^n$ with $x_k^k = 1$ and $A x^k = 0$.

For each $1 \leq r \leq n$, let $H_r$ denote the set of all tuples $(\alpha_1,\dots,\alpha_n)$ in $\mathbb{R}^n$ for which the $r$th entry of the linear combination $\sum_{k=1}^n \alpha_k x^k$ is zero. The set $H_r$ is evidently a subspace of $\mathbb{R}^n$, and since $x_r^r = 1$, the $n$-tuple whose $r$th entry is $1$ and whose other entries are all $0$ is not in $H_r$. So for all $1 \leq r \leq n$, the set $H_r$ is in fact a proper subspace of $\mathbb{R}^n$. Since no vector space over an infinite field is a finite union of proper subspaces, there must be some tuple $(a_1, \dots, a_n)$ in $\mathbb{R}^n$ that is not in $H_1 \cup \dots \cup H_n$. By the definition of the $H_r$, all entries of the column vector $x = \sum_{k=1}^n a_k x^k$ are nonzero, and clearly $Ax = 0$.
End of proof of equivalence.
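To make the equivalence concrete, here is a small numerical check in Python. The matrix `A` below is a toy example of my own choosing (not the matrix from the question): every column is a linear combination of the other two, and correspondingly a vector with all entries nonzero lies in the null space.

```python
import numpy as np

# A toy 2x3 example (my own, for illustration): every column is a
# linear combination of the other two.
A = np.array([[1.0, 1.0, -2.0],
              [2.0, 2.0, -4.0]])

# A vector with all entries nonzero that A sends to zero.
x = np.array([1.0, 1.0, 1.0])
print(A @ x)  # -> [0. 0.]

# Each column satisfies c_k = -sum_{j != k} (x_j / x_k) c_j,
# exactly as in the first half of the proof.
for k in range(3):
    others = [j for j in range(3) if j != k]
    reconstruction = -sum(x[j] / x[k] * A[:, j] for j in others)
    assert np.allclose(A[:, k], reconstruction)
print("every column is a combination of the others")
```
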
This gives us at least some alternative perspective on why your matrix has the property that it has. (A bit of thought shows that if its first column is a linear combination of the other two, then $c = 0$ is forced; but in that situation the third column is not a linear combination of the other two unless $a = b = 0$.)
[Added after the edit of the question]
Seeing another example of the pattern helped me a lot. I don't think the pattern has a name, but I can answer the question about that $X$ in a way that might indicate a method that works in the general case. The answer: it is true that if there is a vector $y$ with all entries nonzero for which $Xy = 0$, then $X = 0$.
Here is a proof: suppose that $Xy = 0$ and that $y$ has no zero entries. Denoting the entries of $y$ by $y_1,\dots,y_5$, and computing the matrix product $Xy$, we see that $Xy = 0$ is equivalent to the system of equations
\begin{align}
x_2 y_2 + x_1 y_5 & = 0, \\
x_2 y_1 + x_3 y_3 & = 0, \\
x_3 y_2 + x_4 y_4 & = 0, \\
x_4 y_3 + x_5 y_5 & = 0, \\
x_1 y_1 + x_5 y_4 & = 0,
\end{align}
which is, in turn, equivalent to the matrix equation $Mx = 0$, where $M$ is the matrix
$$
M = \begin{pmatrix} y_5 & y_2 & 0 & 0 & 0 \\ 0 & y_1 & y_3 & 0 & 0 \\ 0 & 0 & y_2 & y_4 & 0 \\ 0 & 0 & 0 & y_3 & y_5 \\ y_1 & 0 & 0 & 0 & y_4 \end{pmatrix}
$$
and $x$ is the column vector with $x_1, \dots, x_5$ as its entries.
A short computation shows that the determinant of the matrix $M$ is $2 y_1 y_2 y_3 y_4 y_5$, which is nonzero by the hypothesis on $y$, and hence $M$ is invertible. So from $Mx = 0$ we deduce that $x = 0$, which implies that your original matrix $X$ is the zero matrix. End of proof.
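If you would like to double-check that determinant without doing the cofactor expansion by hand, a short computer-algebra sketch (using sympy, my choice of tool) confirms it:

```python
import sympy as sp

y1, y2, y3, y4, y5 = sp.symbols('y1:6')

# The matrix M from the system of equations above.
M = sp.Matrix([
    [y5, y2, 0,  0,  0 ],
    [0,  y1, y3, 0,  0 ],
    [0,  0,  y2, y4, 0 ],
    [0,  0,  0,  y3, y5],
    [y1, 0,  0,  0,  y4],
])

print(sp.factor(M.det()))  # -> 2*y1*y2*y3*y4*y5
```

The two terms $y_1 y_2 y_3 y_4 y_5$ come from the diagonal and from the cyclic "superdiagonal plus corner" product, which is why the factor of $2$ appears.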
This proof method probably generalizes to other matrices $X$ of the type you want. To outline the strategy: given a matrix equation $Xy = 0$ with all entries of $y$ nonzero, rewrite $Xy = 0$ as an equivalent matrix equation $Mx = 0$, where $x$ is the column vector with $x_1, \dots, x_n$ as its entries, and $M$ is an $n \times n$ matrix whose entries are either zeros or $y_1,\dots,y_n$ (the "pattern" of this matrix is determined by the "pattern" of the $X$ matrix in a way that I can't describe simply at the moment, though some simple description is probably possible). It seems likely to me that, in some generality, the "pattern" of entries in the $X$ matrix will force the corresponding "pattern" of $y_j$'s in the $M$ matrix to produce a matrix $M$ whose determinant is a nonzero multiple of $y_1 y_2 \cdots y_n$, and hence an invertible $M$ whenever all entries of $y$ are nonzero.
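As a sanity check on this strategy, the rewrite from $Xy = 0$ to $Mx = 0$ can be done mechanically. Here is a sympy sketch; the explicit $X$ below is reconstructed from the system of equations earlier in the answer (row $i$ of $X$, dotted with $y$, gives the $i$th equation), so treat it as an assumption about what your $X$ looks like.

```python
import sympy as sp

xs = sp.symbols('x1:6')
ys = sp.symbols('y1:6')
x1, x2, x3, x4, x5 = xs
y1, y2, y3, y4, y5 = ys

# X reconstructed from the system above: row i of X, dotted with y,
# reproduces the i-th equation.
X = sp.Matrix([
    [0,  x2, 0,  0,  x1],
    [x2, 0,  x3, 0,  0 ],
    [0,  x3, 0,  x4, 0 ],
    [0,  0,  x4, 0,  x5],
    [x1, 0,  0,  x5, 0 ],
])

# Treat the entries of X*y as linear equations in x1..x5 and let
# sympy extract the coefficient matrix M mechanically.
equations = list(X * sp.Matrix(ys))
M, rhs = sp.linear_eq_to_matrix(equations, xs)

# The determinant is a nonzero multiple of y1*y2*y3*y4*y5, so M is
# invertible whenever all entries of y are nonzero.
print(sp.factor(M.det()))
```

The same three lines of extraction (`X * sp.Matrix(ys)` followed by `linear_eq_to_matrix`) would work for any candidate "pattern" matrix $X$, which makes it easy to test the determinant guess on other examples.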
That's hazy on generality, of course, but it is a concrete guess to work on, anyway.