1

This question basically grew out of one I had asked before, but it is a different topic, so I am asking a new question. (In short, I had figured out that most monitor brands recommend a 60Hz refresh-rate setting even for a 75Hz monitor, and I became curious about why.)

I would like to mention two quotes from the accepted answer on that question:

75hz does work, and works well enough, but it can't match the 60hz experience.

The problem is that all modern displays are set to 60hz because the power AC/DC conversion also happens to 60hz or 50hz depending on where you are in the world. Because 60hz with vsync on on a powersystem with 60hz, manufacturers keep saying in their manual that 60hz is the most optimal resolution to use, and it is true.

I recently discussed this with someone, and they argued that since the AC is already converted to DC, the 50Hz or 60Hz frequency of the power system has nothing to do with the refresh rate of the monitor, and that there would be no performance issues with a 75Hz monitor (provided the monitor is not faulty).

So I'm trying to figure out whether I misunderstood these two quotes, or whether the power system's frequency still plays a role in the performance of a monitor's refresh rate.

Is the recommendation of 60Hz as the optimal refresh rate still dependent on the power frequency for modern monitors?

In other words, if you have a 50Hz AC supply and a 50Hz monitor (or 60Hz AC and a 60Hz monitor), would they be more compatible, performance-wise, than a 50Hz or 60Hz AC supply with a 75Hz monitor?

4
  • It's not a matter of performance, but a matter of appearing better. The problem goes away if you match the refresh rate to the AC frequency, or if the refresh rate is at least double it.
    – LPChip
    Commented Nov 1, 2022 at 9:10
  • 2
    With CRTs, the refresh rate was literally the number of times per second the screen would light up. With LCD this is no longer relevant - the backlight has zero flicker [assuming basic competence at manufacture]. Refresh rate has become a buzzword in gaming. It's fairly pointless in anything else & I doubt the accuracy of gamers' perception of a game's responsiveness. As I'm typing here, my display is spinning its wheels at 60Hz, but my fps is idling at between 1 & 2.
    – Tetsujin
    Commented Nov 1, 2022 at 9:15
  • 1
    @LPChip - that hasn't been true since the 90s at least. Many CRT displays would run at non-multiples of the grid frequency. 75 or 85 was popular - you could see 60Hz out of the corner of your eye, so it was slightly disturbing.
    – Tetsujin
    Commented Nov 1, 2022 at 9:17
  • There's a decent 'quick guide' here - pcworld.com/article/547715/… I ran 'big' 21 or 22" dual displays from the mid-90s & only ever dropped to 60Hz when I started to sacrifice refresh rate for higher resolution. I was running mine at nearly 2k by the end & finally switched to a 'true' 2k LCD in about 2012.
    – Tetsujin
    Commented Nov 1, 2022 at 9:25

4 Answers

3

because the power AC/DC conversion also happens to 60hz or 50hz

Not really, though! Instead, it happens at some hundreds of kilohertz. Everything in PC technology uses switched-mode power supplies nowadays, because they are that much more efficient.

In the beginning, several of the common refresh rates or frame rates did indeed come from AC frequency. However, that was a looong time ago.


I never owned a flat panel display that could do 75 Hz. I doubt the manufacturer is recommending any particular refresh rate though.

2
  • 1
    So, in short, a monitor will give the same performance whether the mains supply in my country is 50Hz or 60Hz?
    – Vikas
    Commented Nov 18, 2022 at 16:54
  • 1
    Yes, absolutely.
    – Daniel B
    Commented Nov 18, 2022 at 18:04
1

A CRT monitor must have a precise refresh frequency because, by its nature, it constantly redraws the picture. But ever since VGA, that frequency has been generated by the video card!

I have always lived in a 50Hz part of the world and never encountered 50Hz computer monitors, so I've never seen a PC CRT monitor that actually syncs to the line frequency. Perhaps in the past (EGA and earlier) they worked like that, but monitors have not followed the line frequency for quite a long time; they use the Vsync signal from the VGA interface to refresh instead.

In 1995 I've had a 14" CRT SVGA monitor that was able to sync at 1024x768 37.5Hz interlaced, which is half 75Hz and it was obviously generated by the VGA card clock. Other monitor I had around 2000 was 17", it was able to do 1024x768 at 85Hz (and that was extremely common); also it might do 1152x832 at 75Hz. Nothing like 60Hz, it never had such a recommendation. Since higher refresh rate lowered perceived flicker and eye fatigue, everybody wanted to have as much freq as possible. When LCD was introduced, it did not flicker at all. So was enough 60Hz (for the VGA signal, since there was no physical redraw at the panel) for it to be not eye-heavy. Everybody went 60Hz back, probably because the circuitry is cheaper when the frequency is lower, also the screen resolutions increased and there was always a tradeoff, if you want higher resolution, you can only achieve that at lower frequency.

With modern digital interfaces and the variety of LCD panels, there is no longer a strict requirement for a certain high refresh frequency at all.

Also, if you look at the power path of a modern monitor, there is always a switching power supply first, and the first thing the SMPS does with the input voltage is rectify it and then generate a much higher frequency, in order to use smaller transformers. Then it transforms and rectifies it again, so nothing is left of the input frequency.

So, certainly not: the 60Hz recommendation may be a legacy, but it does not come from any actual dependency on the line frequency.

1
  • "certain big refresh frequency" you mean the frequency of voltage supply at home?
    – Vikas
    Commented Nov 18, 2022 at 16:52
1

Most modern monitors run on DC, at least internally; many CRTs ran at higher refresh rates, and also ran on high-voltage DC internally.

In theory I could run one of my monitors entirely off a suitable purely DC power source with sufficient power output, or off any modern power outlet at any voltage or frequency.

Basically your mains voltage and frequency shouldn't matter at all for a modern monitor.

The 60Hz recommendation probably has a few reasons - 30Hz is, according to many, the 'minimum' for human eyes to perceive motion as smooth, with movies often shot at 24fps. Higher refresh rates at higher resolutions mean a need for more bandwidth and video-processing power. For example, many older UHD outputs would run at 30Hz because of data-transfer requirements rather than anything else. Basically, 60Hz is a minimum that almost anything 'modern' can and should run.
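
As a rough illustration of the bandwidth point, here is a small sketch (my own, not from any monitor manual; the 24 bits per pixel and the ~10 Gbit/s figure for older links are assumptions, and signalling overhead is ignored):

    # Sketch of raw, uncompressed video data rates for UHD at different
    # refresh rates. Figures are approximate and ignore blanking/encoding
    # overhead; the ~10 Gbit/s "older link" budget is an assumption.

    def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
        """Raw pixel data rate in Gbit/s for a given display mode."""
        return width * height * refresh_hz * bits_per_pixel / 1e9

    for hz in (30, 60):
        print(f"3840x2160 @ {hz} Hz: ~{data_rate_gbps(3840, 2160, hz):.1f} Gbit/s")

    # Approximate output:
    # 3840x2160 @ 30 Hz: ~6.0 Gbit/s   (fits within an older ~10 Gbit/s link)
    # 3840x2160 @ 60 Hz: ~11.9 Gbit/s  (needs a newer, faster interface)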

You might be confusing this with the TV standards (PAL at 50Hz and NTSC at 60Hz), but those rates were more a matter of matching older standards and other constraints, and those analog transmission systems are mostly dead.

4
  • 2
    "and I don't think they had anything to do with power frequency either." Initially they had to do: en.wikipedia.org/wiki/Refresh_rate#Televisions
    – PierU
    Commented Nov 1, 2022 at 9:57
  • 1
    I stand corrected, and removed that.
    – Journeyman Geek
    Commented Nov 1, 2022 at 10:08
  • Could you please clarify what you mean by "internally" in first sentence?
    – Vikas
    Commented Nov 18, 2022 at 16:48
  • 1
    Well, the components inside a monitor all run on various DC voltages
    – Journeyman Geek
    Commented Nov 19, 2022 at 14:45
1

Initially, the 50Hz or 60Hz refresh rate was a design constraint for the first TVs:

The development of televisions in the 1930s was determined by a number of technical limitations. The AC power line frequency was used for the vertical refresh rate for two reasons. The first reason was that the television's vacuum tube was susceptible to interference from the unit's power supply, including residual ripple. This could cause drifting horizontal bars (hum bars). Using the same frequency reduced this, and made interference static on the screen and therefore less obtrusive. The second reason was that television studios would use AC lamps, filming at a different frequency would cause strobing.[7][8][9] Thus producers had little choice but to run sets at 60 Hz in America, and 50 Hz in Europe.

https://en.wikipedia.org/wiki/Refresh_rate#Televisions

Progress in electronics removed these constraints. Modern monitors can use any refresh frequency equally well. The higher, the better in terms of motion rendering, as long as it does not exceed the specs of the graphics card and the video bandwidth.

However, when watching a video there is some interaction between the refresh rate and the frame rate of the video: if the refresh rate is not a multiple of the frame rate, this can produce jerky motion (subtle, yet perceptible). For instance, VOD platforms in Europe generally stream at 25fps or 50fps, and I had to change the refresh rate on my PC from 60Hz to 50Hz to get better rendering.
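
To see where that subtle judder comes from, here is a small sketch (my own illustration, not part of the answer) of how many display refreshes each video frame stays on screen for:

    # Sketch: count how many display refreshes each video frame occupies when
    # the refresh rate is / is not an integer multiple of the frame rate.
    # Purely illustrative; real players may also use blending or VRR instead.

    def repeat_pattern(fps, refresh_hz, frames=10):
        """For each of the first `frames` video frames, count its refreshes."""
        counts = []
        for i in range(frames):
            start = round(i * refresh_hz / fps)
            end = round((i + 1) * refresh_hz / fps)
            counts.append(end - start)
        return counts

    print("25 fps on 60 Hz:", repeat_pattern(25, 60))  # uneven mix of 2s and 3s -> judder
    print("25 fps on 50 Hz:", repeat_pattern(25, 50))  # steady 2s -> smooth motion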

2
  • My TV switches to match source. My computer doesn't, but I don't use it for protracted watching [inc popcorn etc]. ;)
    – Tetsujin
    Commented Nov 1, 2022 at 10:06
  • @Tetsujin actually TVs and monitors do switch. It's just that in the case of a monitor (or even a TV used as a monitor) the source is the OS, not the video played inside the OS. Maybe there exist players that switch the OS setting according to what they are playing (at least when they go to full screen).
    – PierU
    Commented Nov 1, 2022 at 12:02
