
I use a Samsung SyncMaster 2494 monitor at my desk. It has a VGA port, and I've been using an adapter to connect it to my laptop's HDMI output. However, I took a look at the back of the monitor and found a DVI-D port. I'm thinking about getting an HDMI-DVI cable so that the signal stays fully digital, instead of being converted to analogue and back to digital.

Would using the digital port of my monitor improve the image quality?

2 Answers


Yes, you should see an increase in display quality. DVI is newer and offers a better, sharper picture than VGA. Make sure the cable you get is good quality.

Quote from the article linked below: "DVI offers a higher quality signal compared to VGA. The difference is especially noticeable at higher resolutions. The video quality is a factor of the mechanism of operation and the length and quality of the cable."

https://www.diffen.com/difference/DVI_vs_VGA

  • Whilst this may theoretically answer the question, it would be preferable to include the essential parts of the answer here, and provide the link for reference.
    – Mokubai
    Commented Nov 29, 2019 at 8:27
  • I edited the post to accommodate your suggestion
    – anon
    Commented Nov 29, 2019 at 12:56

You should always prefer a digital connection over an analogue one.

Using VGA with a modern monitor means the signal is converted from digital to analogue so that it can pass from your computer down the VGA cable, and then converted back from analogue to digital for the LCD panel.

Both these conversions result in a form of quantization noise.

On the D-to-A side, the discrete digital signal is approximated to an analogue level, which may be affected by the preceding and following levels depending on how fast the signal has to change. This can result in overshoot or undershoot of the output signal and degraded quality. Accurate and fast D-to-A converters tend to be relatively expensive.

The A-to-D (display) side has the opposite problem: a continuous analogue signal must be approximated to discrete digital values in order to interpret the colour. This stage also inherits all the errors generated in the previous one.
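For a rough sense of scale (a sketch assuming an 8-bit colour channel and VGA's nominal 0.7 V video swing, which are typical figures rather than anything measured on this particular monitor), the analogue step between two adjacent colour values is tiny:

$$\Delta = \frac{V_{\max}}{2^{8}-1} = \frac{0.7\,\text{V}}{255} \approx 2.7\,\text{mV}, \qquad |e_q| \le \frac{\Delta}{2} \approx 1.4\,\text{mV}$$

so an error of little more than a millivolt anywhere along the chain is already enough to land a pixel in the wrong level.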

What basically happens to the signal can be seen in this graph from Wikipedia: there is some "best guess" approximation involved in working out which digital value an analogue level actually represents.

(Graph from Wikipedia: an analogue signal being sampled and quantised into discrete digital levels.)
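To make the round trip concrete, here is a minimal Python sketch of the whole D-to-A-to-D chain. All the figures in it are illustrative assumptions (an 8-bit channel, a 0.7 V swing, a couple of millivolts of disturbance standing in for converter error and cable loss), not measurements of any real hardware:

```python
import random

LEVELS = 256   # 8-bit colour channel: 256 discrete levels per colour
V_MAX = 0.7    # nominal 0-0.7 V analogue video swing (typical figure, for illustration)

def dac(level):
    """Digital-to-analogue: map a discrete level onto a voltage."""
    return (level / (LEVELS - 1)) * V_MAX

def adc(voltage):
    """Analogue-to-digital: round the voltage back to the nearest level."""
    level = round((voltage / V_MAX) * (LEVELS - 1))
    return min(max(level, 0), LEVELS - 1)   # clamp to the valid range

random.seed(1)
changed = 0
for original in range(LEVELS):
    analogue = dac(original)
    # A couple of millivolts of disturbance standing in for overshoot,
    # cable loss and interference -- a made-up figure, purely illustrative.
    analogue += random.gauss(0, 0.002)
    if adc(analogue) != original:
        changed += 1

print(f"{changed} of {LEVELS} pixel values changed after the analogue round trip")
```

Even that small amount of noise flips a large fraction of the values, because each colour step is only a few millivolts wide.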

By changing to a completely digital connection you remove these two conversion stages and, as a result, get a cleaner and more accurate signal. The computer and monitor now speak the same language without having to convert to and from another format.

VGA cables also do not always shield all the cores from each other, so crosstalk can occur: a signal in one wire generates a small magnetic field, which can induce a signal in a neighbouring wire and alter what travels down it.

Crosstalk is a major problem in analogue circuits, and it can mean the signal you put in at one end of the cable is not quite the same as what comes out; VGA is particularly prone to it in my experience. Digital connections are not immune to crosstalk, but the nature of the digital signals on the cable means that unless the interference is huge, crosstalk is essentially irrelevant.
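Here is a deliberately simplified Python sketch of why that is. The induced voltage is an invented figure, and real DVI/HDMI links use differential TMDS signalling rather than a single threshold, but the idea is the same: the same interference that visibly shifts an analogue colour level leaves a thresholded digital bit untouched.

```python
# Interference induced from a neighbouring wire (made-up value, in volts).
induced = 0.02

# Analogue link (VGA-style): the received voltage *is* the colour value.
sent_voltage = 0.35                       # mid-grey on a 0-0.7 V scale
received_voltage = sent_voltage + induced
print(f"analogue: sent {sent_voltage:.3f} V, got {received_voltage:.3f} V "
      "-> the colour itself has shifted")

# Digital link (simplified): the receiver only has to tell high from low.
HIGH, LOW, THRESHOLD = 1.0, 0.0, 0.5
sent_bit = 1
received_level = (HIGH if sent_bit else LOW) + induced
received_bit = 1 if received_level > THRESHOLD else 0
print(f"digital: sent bit {sent_bit}, got bit {received_bit} "
      "-> the value is unchanged despite the same interference")
```

Only interference approaching half the signal swing would flip the digital bit, which is what "unless the interference is huge" means in practice.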

As a result, your display will receive a much cleaner signal over a digital connection; what arrives is essentially identical to what the computer sent.
