I recently decided to connect a second monitor to my PC. My graphics card has a VGA and a DVI output, and since both my monitors are equipped with VGA cables, I got myself a VGA to DVI adapter (VGA to DVI-I Dual Link, to be specific). However, while the monitor connected to the DVI port through the adapter is being detected, it does not display anything and acts as if there is no input signal.
Secondly, I am actually unsure whether my graphics card has a DVI-I (Dual Link) or a DVI-D (Dual Link) port, because the holes that would distinguish DVI-I from DVI-D look as if the adapter's analog pins were forced in there (can be seen in the attached pictures).
Am I doing something wrong, or could this be a software problem?
And if my port turns out to be DVI-D, can a DVI-I plug work in a DVI-D port at all?
What I HAVE tried:
- Updating Graphics Card drivers
- Restarting my PC
- Resetting both monitors to factory settings
- Swapping the monitors between the two ports (same result: the monitor on the plain VGA port worked, while the one connected through the VGA to DVI adapter was detected [even by name] but showed no image)
PC and Monitor specifications:
Operating system = Windows 7 Home Premium - Service Pack 1
Motherboard = MSI 760GM-P21(FX) (MS-7641) 3.0
Graphics = NVIDIA GeForce GTX 650 - GDDR5 - 1024 MB
Monitor 1 = HP L1925
Monitor 2 = PHILIPS 170S4