
A 4K screen set to native resolution (3840x2160) with 100% font scaling has nearly unreadable fonts.

On my 4K monitor, if I set font scaling to 100%, the font and everything else becomes small, as expected. However, the font also becomes nearly unreadable. It looks like the resolution being sent doesn't match the native resolution of the screen. Yet if I screenshot the screen content and zoom in, I see perfect fonts.

I tried setting Nvidia display scaling to "No scaling", but that made no difference. The screen is connected via HDMI, running 3840x2160 at 30 Hz or 60 Hz as a forced setting (the "Customize..." button in the Nvidia Control Panel). I use 30 Hz because apparently DisplayPort is required for 60 Hz, though sometimes 60 Hz seems to work too.

I have an Asus PB287Q 4K monitor, but I had the same problem on a Philips Brilliance 288P 4K monitor. The graphics card is an Nvidia GTX 560. I also have a second 1920x1080 monitor connected, but it's the same result without it connected.

Anyone know how to fix this?

Screenshot of a sample area (fonts look perfect):
[screenshot]
Photo of the same area on the physical screen (ignore distortions from resizing the image of the pixel grid):
[photo]

  • That screenshot (not the photo) doesn't look like High DPI at all. Are you sure the software side is properly set up?
    – Daniel B
    Commented Jul 7, 2016 at 12:13
  • Ah, nevermind my earlier comment, just noticed you set it to 100% scaling intentionally. Your graphics card may not be able to keep up. You could try with a cheap modern graphics card to check if it can drive the display properly.
    – Daniel B
    Commented Jul 7, 2016 at 18:08
    The 560's specs say it isn't even going to do 4K for one monitor. You could set the res to 1/4 of the monitor's resolution and get better interpolation, and that likely also means only one monitor.
    – Psycogeek
    Commented Jul 8, 2016 at 22:26
  • Upgraded the graphics card and connected via DisplayPort, and everything works fine. Commented Jul 9, 2016 at 11:10

2 Answers


I upgraded to a graphics card that supports 4K and connected via DisplayPort. Everything works fine now.


Don't use a DVI-to-HDMI adapter at resolutions above 1920x1200.

The DVI port on your video card will attempt to use dual-link signalling, which is incompatible with HDMI.¹ Dual-link DVI carries pairs of pixels on different pins, but is designed to degrade "gracefully" in case of e.g. a wrong cable, resulting in half of those pixels being lost. (In contrast, HDMI achieves these resolutions with a faster pixel clock.)

Your best option is to use DisplayPort 1.2+ or HDMI 2.0+, whichever your setup supports (if any). Many older video cards simply don't support 4K past 30 Hz.

¹ Source: Personal experience plugging a Dell P2415Q into a GeForce 8600GT.
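The bandwidth reasoning above can be sketched numerically. This is a rough back-of-the-envelope check, not exact timing math: the ~11% blanking overhead (roughly CVT-RB) and the link limits below are approximations, and real modes vary.

```python
def pixel_clock_mhz(h_active, v_active, refresh_hz, blanking=1.11):
    """Approximate pixel clock in MHz for a given mode.

    `blanking` models horizontal/vertical blanking overhead on top of
    the active pixels (~11% is an assumption based on reduced-blanking
    timings; exact values depend on the mode).
    """
    return h_active * v_active * refresh_hz * blanking / 1e6

uhd_60 = pixel_clock_mhz(3840, 2160, 60)  # roughly 550 MHz
uhd_30 = pixel_clock_mhz(3840, 2160, 30)  # roughly 275 MHz

# Approximate maximum pixel clocks (MHz) per link type at 24-bit color.
limits = {
    "single-link DVI":           165,
    "dual-link DVI":             330,
    "HDMI 1.4":                  340,
    "HDMI 2.0":                  600,
    "DisplayPort 1.2 (4 lanes)": 720,
}

for link, limit in limits.items():
    ok60 = "OK" if uhd_60 <= limit else "too slow"
    ok30 = "OK" if uhd_30 <= limit else "too slow"
    print(f"{link}: 4K@60 {ok60}, 4K@30 {ok30}")
```

The takeaway matches the answer: 4K@30 squeezes under the HDMI 1.4 / dual-link DVI ceiling, while 4K@60 needs HDMI 2.0 or DisplayPort 1.2, which older cards like the GTX 560 lack.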
