
That question is closely related to the one I'm about to ask, but it's not quite the same, so it doesn't resolve my doubt.

Is there a strict relationship between refresh rate (RR) and frame rate (FR)? I mean, is there any rule of thumb that links those two parameters? Such as "RR should never be greater than FR" (or vice versa), or "it's impossible to notice differences between FR values if they are greater than RR", things like that. I'm having trouble understanding the difference between these two terms.

I'm no expert in the subject so any explanation in simple terms would be appreciated.

1 Answer


The refresh rate is the frequency at which the monitor updates the pixels on its panel.

The frame rate is how many frames per second the GPU renders and sends to the monitor for display.

There isn't a strict relationship between the two beyond their relative timing. If the GPU swaps in a new image while the monitor is partway through updating its pixels, the new frame appears on only part of the screen, causing a temporary 'tear' in the image.
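As a rough illustration of that timing (the numbers and names below are made up for the example, not taken from any real display or API), here's a small Python sketch of a panel being drawn line by line while the GPU swaps frames in the middle of a refresh:

```python
# Hypothetical sketch: why an unsynchronized buffer swap produces a "tear".
# A display scans out line by line over one refresh interval (e.g. 1/60 s at 60 Hz).

REFRESH_HZ = 60
SCANLINES = 1080
SCANOUT_TIME = 1.0 / REFRESH_HZ            # time to draw the whole panel once
TIME_PER_LINE = SCANOUT_TIME / SCANLINES   # time to draw one scanline

def scanout(frame_at):
    """Return which frame each scanline shows, given a function
    frame_at(t) that says which rendered frame is current at time t."""
    return [frame_at(line * TIME_PER_LINE) for line in range(SCANLINES)]

# The GPU swaps to frame 2 halfway through the monitor's refresh...
swap_time = SCANOUT_TIME / 2
unsynced = scanout(lambda t: 1 if t < swap_time else 2)

# ...so the top half of the screen shows frame 1 and the bottom half frame 2.
tear_line = unsynced.index(2)
print(f"tear at scanline {tear_line}: top shows frame 1, bottom shows frame 2")
```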

If the timing is synchronized so that the GPU doesn't swap while the monitor is drawing, the image won't tear. This is known as V-Sync. With V-Sync enabled, the frame rate is locked to the refresh rate.
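Here's an equally rough sketch of what V-Sync-style pacing looks like in a render loop: the loop waits for the next refresh boundary before presenting, so frames are shown at most once per refresh and the effective frame rate is capped at the refresh rate (again, the timings and helper names are invented for the example, not a real swap-chain API):

```python
import time

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ

def render_frame():
    time.sleep(0.005)  # pretend the GPU needs 5 ms to render a frame

start = time.perf_counter()
for frame in range(10):
    render_frame()
    # Wait until the next refresh boundary before "presenting" the frame,
    # mimicking what happens when V-Sync is enabled.
    next_refresh = start + (frame + 1) * REFRESH_INTERVAL
    time.sleep(max(0.0, next_refresh - time.perf_counter()))
    print(f"frame {frame} presented at ~{time.perf_counter() - start:.3f}s")
```

Even though each frame only takes about 5 ms to render here, frames are presented roughly 16.7 ms apart, i.e. at the 60 Hz refresh rate rather than at the rate the GPU could manage.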

  • So the only way not to get tears in the screen is setting RR = FR? I thought that, for a given monitor with a fixed RR, increasing the FPS of a game would improve user experience or, in the worst-case scenario, make no difference, but I never thought it could be detrimental. Could it?
    – Tendero
    Commented May 3, 2017 at 14:51
  • I think the issue is more complicated than that, as many monitors and GPUs support G-Sync, which I believe is dynamic. My answer is fairly simplistic. You might get better, more detailed answers by refining your question with a specific end goal.
    – Stese
    Commented May 3, 2017 at 15:00
