My PC has an Nvidia 1060 with a single HDMI cable plugged in. I've recently added an HDMI switcher, and the cable now goes directly into it. The switcher is quite simple: it has two outputs and a button that selects whether the signal goes to monitor 1 or monitor 2.

The problem is that my default monitor is ultrawide (2560 x 1080) and the second monitor (a Sony TV) is standard 1080p. When I switch to the TV, it shows a message stating that the signal is unsupported. Unplugging the cable from the TV and plugging it back in fixes the issue. I'm fairly confident this happens either because the PC/GPU is still outputting 2560 x 1080 or because a fresh HDCP handshake has not taken place.

The same happens in reverse: when switching back to the ultrawide monitor, the resolution stays at 1080p, and in the display settings (both the standard Windows settings and the Nvidia Control Panel) the Sony TV is still shown as the connected display.

Is there a way to force Windows to re-establish the connection? I've tried the built-in "Detect" function in the display settings, but it has no effect. Is there any software that can make the GPU think the cable has been unplugged and plugged back in, so I don't have to physically do it each time?
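
To make it concrete, here is the kind of scripted workaround I'm imagining: force the desktop down to 1920 x 1080 before pressing the button on the switcher, so the TV is handed a mode it supports. This is an untested sketch that uses the Win32 EnumDisplaySettings/ChangeDisplaySettings calls through Python's ctypes; a third-party tool such as QRes or NirCmd should be able to do the same job.

```python
# Untested sketch: switch the primary display to 1920x1080 for this session
# only; run it again with 2560x1080 after switching back to the ultrawide.
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

ENUM_CURRENT_SETTINGS = -1   # ask for the currently active mode
DISP_CHANGE_SUCCESSFUL = 0   # ChangeDisplaySettings return value on success

class DEVMODE(ctypes.Structure):
    # Field layout of the Win32 DEVMODEW structure (display branch of the union).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

def set_resolution(width, height):
    """Change the primary display to width x height for this session only."""
    mode = DEVMODE()
    mode.dmSize = ctypes.sizeof(DEVMODE)
    # Start from whatever mode is currently active, then override the size.
    user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(mode))
    mode.dmPelsWidth = width
    mode.dmPelsHeight = height
    # Flags = 0 applies the change without writing it to the registry.
    return user32.ChangeDisplaySettingsW(ctypes.byref(mode), 0) == DISP_CHANGE_SUCCESSFUL

if __name__ == "__main__":
    # Run this, then press the button on the HDMI switcher to go to the TV.
    set_resolution(1920, 1080)
```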

Thank you in advance for any help or advice.

  • Cheap switchers are only good for outputs with the same resolution and refresh rate.
    Commented Mar 14, 2023 at 14:53
  • Please edit your question to include the make and model of the Display and the make and model of the TV.
    – Blindspots
    Commented Mar 14, 2023 at 15:54
  • Try and set the monitor to 'disconnected' in the display settings first, switch the outputs, then reconnect/extend the display. If that works, the steps can be scripted (see the sketch after these comments).
    – Cpt.Whale
    Commented Mar 14, 2023 at 18:34
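
For reference, a minimal sketch of what Cpt.Whale's suggested steps could look like once scripted, assuming the built-in DisplaySwitch.exe utility is enough to drop and re-add the output. It is untested with an HDMI switcher in the path, and the pause length and the /extend vs. /external choice are assumptions.

```python
# Rough sketch of the disconnect -> switch -> reconnect sequence.
import subprocess
import time

# 1. Show the desktop on the primary display only, which disconnects the
#    second output in the display settings.
subprocess.run(["DisplaySwitch.exe", "/internal"], check=True)

# 2. Pause long enough to press the button on the HDMI switcher.
time.sleep(10)

# 3. Extend again (or use /external to show only the second output) so
#    Windows renegotiates with whatever is now on the end of the cable.
subprocess.run(["DisplaySwitch.exe", "/extend"], check=True)
```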
