
This is a bit of an odd one, as most problems are normally "Windows cannot detect my monitor" — but in this case Windows is detecting a monitor when I don't think it should.

So I have two monitors, a PC, and a laptop. Connection-wise it's:

Monitor 1
| - PC DisplayPort 1

Monitor 2
| - PC DisplayPort 1
| - Laptop HDMI 1

If the laptop is out of the picture, both monitors work correctly as Monitor 1 and Monitor 2 on the PC. The problem appears when both the laptop and the PC are on at the same time: the laptop correctly takes Monitor 2, but the PC incorrectly thinks it is driving both monitors.

I can see why it's doing this: technically both monitors are powered on and the PC has a connection to both. However, Monitor 2 has a source button that lets me choose which input it displays, so when the laptop is on I just switch it to HDMI 1. I was hoping Windows would be clever enough to somehow know that the monitor is not actually displaying the signal the PC sends it.

So is there a way to get Windows to factor in the monitor's selected source?
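For what it's worth, some monitors do report their currently selected input over DDC/CI (the MCCS "Input Select" feature, VCP code 0x60), so a script could in principle detect when the monitor has been switched to another source, even though Windows itself ignores it. Below is a rough, untested sketch using the Windows Monitor Configuration API (`dxva2.dll`) via `ctypes`; it assumes the monitor supports DDC/CI on the PC-connected cable, and the value-to-name mapping covers only the common MCCS-defined inputs:

```python
# Hedged sketch: read each physical monitor's active input via DDC/CI
# (Windows Monitor Configuration API in dxva2.dll). VCP code 0x60 is the
# MCCS "Input Select" feature. Whether this works at all depends on the
# monitor supporting DDC/CI over the cable connected to the PC.
import ctypes
import sys

VCP_INPUT_SELECT = 0x60

# Common MCCS Input Select values; vendors may also use custom values.
INPUT_SOURCE_NAMES = {
    0x0F: "DisplayPort 1",
    0x10: "DisplayPort 2",
    0x11: "HDMI 1",
    0x12: "HDMI 2",
}

def input_source_name(code):
    """Map a raw VCP 0x60 value to a readable name."""
    return INPUT_SOURCE_NAMES.get(code, "unknown (0x%02X)" % code)

def query_input_sources():
    """Return [(monitor description, current input name)] — Windows only."""
    user32 = ctypes.windll.user32
    dxva2 = ctypes.windll.dxva2

    class PHYSICAL_MONITOR(ctypes.Structure):
        _fields_ = [("hPhysicalMonitor", ctypes.c_void_p),
                    ("szPhysicalMonitorDescription", ctypes.c_wchar * 128)]

    MONITORENUMPROC = ctypes.WINFUNCTYPE(
        ctypes.c_int, ctypes.c_void_p, ctypes.c_void_p,
        ctypes.c_void_p, ctypes.c_ssize_t)

    results = []

    def _enum(hmon, hdc, rect, lparam):
        count = ctypes.c_uint32(0)
        if (dxva2.GetNumberOfPhysicalMonitorsFromHMONITOR(
                hmon, ctypes.byref(count)) and count.value):
            monitors = (PHYSICAL_MONITOR * count.value)()
            if dxva2.GetPhysicalMonitorsFromHMONITOR(hmon, count.value, monitors):
                for m in monitors:
                    current = ctypes.c_uint32(0)
                    maximum = ctypes.c_uint32(0)
                    # pvct (the VCP code type pointer) may be NULL.
                    if dxva2.GetVCPFeatureAndVCPFeatureReply(
                            m.hPhysicalMonitor, VCP_INPUT_SELECT, None,
                            ctypes.byref(current), ctypes.byref(maximum)):
                        results.append((m.szPhysicalMonitorDescription,
                                        input_source_name(current.value)))
                dxva2.DestroyPhysicalMonitors(count.value, monitors)
        return 1  # continue enumeration

    user32.EnumDisplayMonitors(None, None, MONITORENUMPROC(_enum), 0)
    return results

if __name__ == "__main__" and sys.platform == "win32":
    for desc, src in query_input_sources():
        print(desc, "->", src)
```

Even if the query works, Windows would still treat the monitor as connected; at best this could drive a script that disables the display (e.g. via a tool like MultiMonitorTool) when the reported input isn't the PC's.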

  • No, Windows isn't that smart. And no, there's no workaround.
    – user931000
    Commented Dec 9, 2019 at 13:15
  • It's working as designed, in fact. I use a similar system here, and if it ever glitches and fails to see the 2nd monitor it totally messes up my workflow :\
    – Tetsujin
    Commented Dec 9, 2019 at 13:21
  • So I can give the answer a reward: can someone just post that Windows works this way and there is no workaround, and I will accept it — @GabrielaGarcia, you had it first. Currently I'm just telling the PC to use Monitor 1 manually, but I was hoping there would be some form of workaround :(
    – Grofit
    Commented Dec 9, 2019 at 13:43
