
I have a netbook that I want to connect an external monitor to. The netbook only has a VGA output. How much quality is lost by using a VGA interface, as opposed to a DVI one? Will the display be noticeably more "blurry"?

If so, can someone explain why this is? A pixel is still a pixel, so is the color information getting lost in the D->A then A->D conversion?


2 Answers


The resulting image quality depends a lot on the quality of the D-to-A converter (typically called a RAMDAC) in your video card. You can also lose a bit in the A-to-D conversion if you're plugging in a flat panel, but that will probably just be a slight reduction in clarity.

I haven't seen any really bad VGA output in a number of years, but 10 years ago the results were often terrible. Companies would concentrate on making the 'fastest' card and the resulting image quality would suffer: set up a good monitor, aim for the highest resolution and refresh rate, and you couldn't read text on the screen. Install a Matrox card and the image would be absolutely crisp.

On the D-to-A side, your image is a bunch of numbers, and sharp lines correspond to neighbouring numbers that differ by quite a bit. So black text on a white page is a very sudden transition from 0,0,0 to 255,255,255 (RGB values) and back again. Now try to convert this to analog: if the DAC and associated circuitry isn't well designed, you won't quite get up to 255,255,255 on the pixel where you want it, and then won't get quite back down to 0,0,0 for a few more. The result is a few extra shades of gray rather than the crisp black and white you were looking for.
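To make that concrete, here is a minimal Python sketch (my own illustration, with an arbitrary 3-tap smoothing filter standing in for a bandwidth-limited DAC and cable) of how a crisp black/white edge picks up gray fringes:

    # A row of pixels with sharp black/white edges (one colour channel, 0-255).
    row = [0, 0, 0, 255, 255, 255, 0, 0, 0]

    def analog_path(samples, blend=0.25):
        # Smear each sample slightly into its neighbours; 'blend' is an arbitrary
        # stand-in for how much the DAC and cabling limit bandwidth.
        out = []
        for i, s in enumerate(samples):
            prev = samples[i - 1] if i > 0 else s
            nxt = samples[i + 1] if i < len(samples) - 1 else s
            out.append(round((1 - 2 * blend) * s + blend * prev + blend * nxt))
        return out

    print(analog_path(row))
    # [0, 0, 64, 191, 255, 191, 64, 0, 0] -- gray fringes instead of crisp edges

The in-between values (64, 191) are exactly the extra shades of gray described above.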

  • The degradation is also resolution dependent. The quality hit from going analog at 1600x1200 on my NEC 2090 is minimal. When I first connected my 3090 at 2560x1600 it defaulted to using the analog wires on the DVI cable, and it looked so bad (giant smears of distorted color, similar to a CRT that has had a magnet waved over the screen) that my initial reaction was that the monitor was defective; switching to digital fixed everything. I was using a GTX260, so the card itself shouldn't have been the issue either. Commented Feb 24, 2010 at 15:49

It's not so much the two conversions (I'm assuming the target monitor is not a CRT) as it is the wires. Analog signals are subject to interference: the wires can pick up signals from one another, passing mobile phones, radio signals, domestic 50 Hz mains electricity, distant lightning, etc. There is also a certain amount of natural distortion which can never be entirely eliminated.

So a signal which goes in as "0.375V green" may come out as "0.419V green". The same is true of digital signals, but "0.123" is rounded to 0 and "0.859" is rounded to 1. A digital signal can recover entirely from small amounts of noise.
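As a rough sketch of that difference (my own illustration, using small Gaussian noise as a stand-in for interference), note how thresholding lets the digital bits survive untouched while the analog level simply stays wrong:

    import random

    def noisy(value, sigma=0.05):
        # Add a little Gaussian noise -- a stand-in for interference on the wire.
        return value + random.gauss(0, sigma)

    # Analog: whatever error the wire adds becomes part of the picture.
    print(noisy(0.375))                # e.g. 0.41 instead of 0.375

    # Digital: each received level is thresholded back to 0 or 1,
    # so small amounts of noise are rejected completely.
    sent_bits = [0, 1, 1, 0, 1]
    received_bits = [1 if noisy(b) > 0.5 else 0 for b in sent_bits]
    print(received_bits == sent_bits)  # True -- noise this small essentially never flips a bit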

It's also the case that as the picture gets larger, the frequency of the analog signal gets higher. At that point you're even more dependent on the electrical properties of the cable and connectors not to lose or distort your signal.
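For a rough sense of the frequencies involved, here is a back-of-the-envelope Python estimate (my own figures, using an approximate 25% blanking overhead rather than exact VESA timings) of the pixel clock the analog signal has to carry:

    # Approximate pixel clock: visible pixels * refresh rate * blanking overhead.
    # The 1.25 overhead factor is a rough approximation, not an exact VESA timing.
    def approx_pixel_clock_mhz(width, height, refresh_hz, overhead=1.25):
        return width * height * refresh_hz * overhead / 1e6

    print(approx_pixel_clock_mhz(1024, 768, 60))   # ~59 MHz
    print(approx_pixel_clock_mhz(1600, 1200, 60))  # ~144 MHz
    print(approx_pixel_clock_mhz(2560, 1600, 60))  # ~307 MHz

Pushing a roughly 300 MHz analog signal cleanly through an ordinary cable is much harder than pushing 60 MHz, which fits the 2560x1600 experience described in the comment above.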
