A few answers point out digital vs. analog, which is correct, but that doesn't answer the *why*. A few mentioned translation layers, and that's sort of true too: a mainstream A/D conversion can cause a loss in fidelity, but you'd have to measure it, because the difference is hard to see with the naked eye. With a cheap converter, all bets are off.
So why is digital better than analog?
An analog RGB signal (such as VGA) encodes brightness in the amplitude of the signal (0.7 volts peak-to-peak in the case of VGA). Like any signal it picks up noise along the way, and if that noise is large enough, the levels get translated incorrectly at the receiving end.
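To make that concrete, here's a minimal sketch (the receiver model, full-scale voltage mapping, and noise amount are my own illustrative assumptions, not anything from a VGA spec): a receiver that maps a 0–0.7 V level to an 8-bit intensity will decode a slightly different pixel value whenever noise rides on the line, because the *amplitude itself* carries the information.

```python
import random

def decode_analog(voltage, full_scale=0.7, levels=256):
    """Hypothetical receiver: map a 0..0.7 V analog level to an 8-bit intensity."""
    v = min(max(voltage, 0.0), full_scale)   # clamp to the valid range
    return round(v / full_scale * (levels - 1))

random.seed(1)
clean = 0.35                              # mid-grey: half of the 0.7 V swing
noisy = clean + random.gauss(0, 0.01)     # ~10 mV of noise coupled onto the wire
print(decode_analog(clean), decode_analog(noisy))
```

Any noise shifts the decoded intensity directly; there is no threshold below which it is simply ignored, which is exactly why analog video degrades gradually with cable quality.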
Reflections in the cable (from impedance mismatches) are actually the biggest downfall of an analog video signal. They introduce additional noise and get worse with longer (or cheaper) cables; higher video resolutions make it worse still, because the pixel clock rises and the signal-to-noise ratio drops. Interestingly, at 800x600 you shouldn't be able to see any difference at all unless the VGA cable is too long.
How does a digital signal avoid those pitfalls? Well, for one, the exact level is no longer relevant: the receiver only has to tell a one from a zero. DVI-D/HDMI also uses differential signalling (TMDS), so noise picked up equally by both wires of a pair cancels out, and the encoding is designed so the ones and zeros are recovered faithfully. There's also additional conditioning applied to a digital signal that is not practical to add to an analog video signal.
Sorry for the soap box, guys, but them's the facts.