This question is basically a follow-up to one I asked before, but since it's a different topic I'm asking it separately. (In short, I had figured out that most monitor brands recommend a 60Hz refresh rate setting even for 75Hz monitors, and I became curious why.)
I would like to quote two passages from the accepted answer to that question:
"75hz does work, and works well enough, but it can't match the 60hz experience."

"The problem is that all modern displays are set to 60hz because the power AC/DC conversion also happens to 60hz or 50hz depending on where you are in the world. Because 60hz with vsync on on a powersystem with 60hz, manufacturers keep saying in their manual that 60hz is the most optimal resolution to use, and it is true."
I recently discussed this with someone, and they argued that since the AC is already converted to DC, the 50Hz or 60Hz mains frequency has nothing to do with the monitor's refresh rate, and that a 75Hz monitor would have no performance issues (assuming the monitor is not faulty).
So I'm trying to figure out whether I misunderstood those two quotes, or whether the mains frequency really does still play a role in how well a monitor performs at a given refresh rate.
Is the recommended "optimal" 60Hz refresh rate still tied to the mains power frequency on modern monitors?
In other words, would a 50Hz AC supply paired with a 50Hz monitor (or 60Hz AC with a 60Hz monitor) be more compatible, performance-wise, than a 50Hz or 60Hz AC supply paired with a 75Hz monitor?