
They are reversed. The monitor is connected through VGA and the TV through HDMI. The computer runs Windows 10 with an Intel graphics driver.

When I press Win+P these options are shown:

  • PC screen only
  • Duplicate
  • Extend
  • Second screen only

Duplicate and Extend work fine, and so do PC screen only and Second screen only, but those last two are swapped: PC screen only goes to the TV, and Second screen only goes to the monitor.

In Display Settings, the PC screen is shown as "2" and the TV as "1" even though the PC screen is set to be my primary monitor.

I don't understand this and couldn't find anything to fix the issue. Please help!
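
(For what it's worth, the numbering can also be checked from code: below is a minimal C sketch using the documented Win32 EnumDisplayDevices call, which prints the outputs in the order Windows enumerates them and flags the one it treats as primary. The \\.\DISPLAYn names roughly track the numbers shown in Display Settings.)

    /* Build as an ANSI app, e.g.: cl listdisplays.c user32.lib */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DISPLAY_DEVICE dd;
        dd.cb = sizeof(dd);

        /* Walk the adapter outputs in the order Windows enumerates them. */
        for (DWORD i = 0; EnumDisplayDevices(NULL, i, &dd, 0); i++) {
            if (!(dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP))
                continue;  /* skip outputs with nothing on them */
            printf("%s (%s)%s\n",
                   dd.DeviceName,    /* e.g. \\.\DISPLAY1 */
                   dd.DeviceString,  /* adapter description */
                   (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
                       ? "  <- primary" : "");
        }
        return 0;
    }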

3 Answers


Mac and Windows computers have two display modes when connected to an external display: Mirror and Extend. In Mirror mode, the computer duplicates its screen on the external display, so you see the same picture on both the desktop and the projector. In Extend mode, the external display is treated as a separate screen, so you can have different windows open on the projector and the desktop. It is easy to switch between the two settings.
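
For reference, the four Win+P choices map onto the topology flags of the documented Win32 SetDisplayConfig call (Windows 7 and later). Here is a minimal C sketch of that mapping (my rough illustration, not a polished tool):

    /* Apply one of the four Win+P topologies; defaults to Extend.
     * Build with: cl winp.c user32.lib
     * Note: which output counts as "internal" vs "external" is decided
     * by the video driver. On a desktop with no built-in panel that
     * guess can be wrong, which is presumably why the asker's last two
     * Win+P options end up swapped. */
    #include <windows.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        UINT32 topology = SDC_TOPOLOGY_EXTEND;      /* Extend             */
        if (argc > 1 && strcmp(argv[1], "clone") == 0)
            topology = SDC_TOPOLOGY_CLONE;          /* Duplicate          */
        else if (argc > 1 && strcmp(argv[1], "internal") == 0)
            topology = SDC_TOPOLOGY_INTERNAL;       /* PC screen only     */
        else if (argc > 1 && strcmp(argv[1], "external") == 0)
            topology = SDC_TOPOLOGY_EXTERNAL;       /* Second screen only */

        return SetDisplayConfig(0, NULL, 0, NULL, SDC_APPLY | topology)
               == ERROR_SUCCESS ? 0 : 1;
    }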


When you're connected to your TV, if you right-click your desktop and click Display settings, do you see something like this:

[screenshot: Display settings]

with "Make this my main display" selected? That should be enough to set your PC as your first display.

(This is assuming that your PC screen is #1 and it's highlighted in blue when you set your main display. To test this, just click Identify, and your PC screen should show a giant 1 for a few seconds. If it shows a 2, just click the 2 box in Display settings and set that one to be your main display.)
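
(If the checkbox doesn't stick, the same change can in principle be scripted through the documented ChangeDisplaySettingsEx call. Here is a rough C sketch under that assumption; "\\.\DISPLAY2" is just a placeholder device name, so take the real one from the Identify step above.)

    /* Build as an ANSI app, e.g.: cl setprimary.c user32.lib */
    #include <windows.h>
    #include <stdio.h>

    /* Make the named display the primary one. A complete tool would also
     * shift the other display's dmPosition by the same offset; this
     * sketch covers only the simple two-display case. */
    static BOOL make_primary(const char *deviceName)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);
        if (!EnumDisplaySettings(deviceName, ENUM_CURRENT_SETTINGS, &dm))
            return FALSE;

        /* The primary display must sit at the desktop origin (0,0). */
        dm.dmPosition.x = 0;
        dm.dmPosition.y = 0;
        dm.dmFields |= DM_POSITION;

        return ChangeDisplaySettingsEx(deviceName, &dm, NULL,
                   CDS_SET_PRIMARY | CDS_UPDATEREGISTRY, NULL)
               == DISP_CHANGE_SUCCESSFUL;
    }

    int main(void)
    {
        /* Assumed device name; confirm it with Identify first. */
        if (make_primary("\\\\.\\DISPLAY2"))
            printf("Primary display updated.\n");
        return 0;
    }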

  • In my setup I have a monitor connected through HDMI and a TV through DVI, both from my Nvidia graphics card, and I have the exact same problem as the OP, even though my monitor is set as the main display as described, and as the primary monitor in the Nvidia settings as well.
    – greatbard
    Commented Mar 27, 2019 at 22:36

The video driver and Windows itself are hardcoded to assume that the monitor is connected to the primary, best-quality output of the video card (HDMI) and that a TV would only be connected as a secondary device to another output.
This assumption is quite old and goes back to the days when TVs didn't have HD panels and were usually connected through VGA.

Of course, modern TVs are Full HD or better and HDMI inputs on TVs are the norm now, but the old logic is still hardcoded in Windows and/or the video drivers.

There is no solution to this problem, except physically swapping the connections (monitor on HDMI, TV on VGA) so that real life matches the assumptions built into Windows.
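
Incidentally, you can see the connector information this logic keys on: the driver reports each output's physical connector through the documented QueryDisplayConfig / DisplayConfigGetDeviceInfo calls. A short C sketch (my illustration, not a fix); on the asker's machine it should print VGA for the monitor and HDMI for the TV:

    /* Build with: cl connectors.c user32.lib */
    #include <windows.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        UINT32 numPaths = 0, numModes = 0;
        if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS,
                                        &numPaths, &numModes) != ERROR_SUCCESS)
            return 1;

        DISPLAYCONFIG_PATH_INFO *paths = calloc(numPaths, sizeof(*paths));
        DISPLAYCONFIG_MODE_INFO *modes = calloc(numModes, sizeof(*modes));
        if (!paths || !modes ||
            QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &numPaths, paths,
                               &numModes, modes, NULL) != ERROR_SUCCESS)
            return 1;

        for (UINT32 i = 0; i < numPaths; i++) {
            /* Ask the driver for the friendly name of this output target. */
            DISPLAYCONFIG_TARGET_DEVICE_NAME name;
            ZeroMemory(&name, sizeof(name));
            name.header.type      = DISPLAYCONFIG_DEVICE_INFO_GET_TARGET_NAME;
            name.header.size      = sizeof(name);
            name.header.adapterId = paths[i].targetInfo.adapterId;
            name.header.id        = paths[i].targetInfo.id;
            if (DisplayConfigGetDeviceInfo(&name.header) != ERROR_SUCCESS)
                continue;

            /* The connector type lives in the path's target info. */
            const char *conn = "other";
            switch (paths[i].targetInfo.outputTechnology) {
            case DISPLAYCONFIG_OUTPUT_TECHNOLOGY_HD15: conn = "VGA";  break;
            case DISPLAYCONFIG_OUTPUT_TECHNOLOGY_HDMI: conn = "HDMI"; break;
            case DISPLAYCONFIG_OUTPUT_TECHNOLOGY_DVI:  conn = "DVI";  break;
            default: break;
            }
            wprintf(L"%ls -> %hs\n", name.monitorFriendlyDeviceName, conn);
        }
        free(paths);
        free(modes);
        return 0;
    }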

