Mathematics is not the study of bits of ink on paper (or, indeed, pixels on a screen); it is the study of concepts and abstract ideas. Hence, when you look at some ink on a piece of paper, you first have to decide "does this correspond to an abstract idea?" before asking "what mathematical meaning does that idea carry?". Before you ask whether $\frac{1}{0}=1$ is true or false, you need to ask what those symbols mean. Well, usually you don't need to ask, because it's obvious, but when you're unsure you ought to remember that just because you wrote something down doesn't mean there's anything behind it.
So I would argue that $\frac{1}{0}=1$ is neither true nor false (unless you give it a meaning, and there is no "obvious" meaning in this case), because truth or falsity is a property of abstract mathematical concepts, and this pattern of pixels does not map to any such concept.
In programming terminology, I would describe it as a compile error, or a parse failure :)
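To make that analogy concrete, here is a minimal sketch in Haskell (the language is just an illustrative choice, and the example is mine, not part of the original argument): `readMaybe` plays the role of the "does this correspond to an idea?" step, and only strings that pass it yield a value whose truth we can even discuss.

```haskell
import Text.Read (readMaybe)

-- Toy version of "check that the ink means something before asking what it means":
-- readMaybe attempts to parse a string as a Bool; only a successful parse produces
-- a value that can meaningfully be called true or false.
main :: IO ()
main = do
  print (readMaybe "True"    :: Maybe Bool)  -- Just True  : parses, and happens to be true
  print (readMaybe "False"   :: Maybe Bool)  -- Just False : parses, and happens to be false
  print (readMaybe "1/0 = 1" :: Maybe Bool)  -- Nothing    : no parse, so neither true nor false
```

The third case is the point: the question "true or false?" never gets asked, because the string was never assigned a meaning in the first place.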