
I was looking at some monitors and I noticed most of them list the refresh rate like this:

Refresh rate: 75Hz (Analog)

A particular monitor from BenQ mentions additional details like:

Maximum Refresh Rate: 75 Hz (Analog), 75 Hz (Digital)
Maximum 3D Refresh Rate: 75 Hz

I could not find any good resource explaining this. So what's the difference between analog and digital refresh rates? Does it have something to do with the display cable you use, like HDMI or DisplayPort?

  • What are the interfaces on the monitor?
    – Journeyman Geek
    Commented Aug 6, 2022 at 5:54
  • @JourneymanGeek the ones that mention only "Analog" have VGA and HDMI ports. This specific BenQ has 3x HDMI ports only, no VGA.
    – Vikas
    Commented Aug 6, 2022 at 6:11
  • It is not an analog/digital refresh rate; it is the refresh rate over the analog/digital connection. That is, it depends on the connection (or in this case doesn't, since both are 75 Hz). Commented Aug 6, 2022 at 7:44

1 Answer


Analogue would refer to the refresh rates available when using an analogue signal such as VGA.

Digital would be the refresh rates when using DVI, HDMI or DisplayPort as these are all digital signals.

They probably have a model of that monitor with VGA inputs but only made one generic datasheet for the entire range.
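For a rough sense of why the same 75 Hz shows up for both kinds of link, you can estimate the pixel clock a video mode needs; either an analog or a digital interface has to carry roughly that clock. The sketch below is a minimal illustration, assuming a ~25% blanking overhead (the exact figure depends on the timing standard, so treat the numbers as ballpark only):

```python
# Rough pixel-clock estimate for a video mode.
# Assumes ~25% blanking overhead on top of the active pixels;
# real CVT/GTF timings vary, so this is only a ballpark figure.

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    active_pixels = width * height
    total_pixels = active_pixels * (1 + blanking_overhead)
    return total_pixels * refresh_hz / 1e6

if __name__ == "__main__":
    for w, h, hz in [(1920, 1080, 60), (1920, 1080, 75)]:
        print(f"{w}x{h}@{hz} Hz needs roughly "
              f"{approx_pixel_clock_mhz(w, h, hz):.0f} MHz")
```

Whichever port you use, the datasheet is simply saying the panel itself tops out at 75 Hz.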

  • So apparently there won't be any performance difference whether it's analog or digital, right? It's just about the interface?
    – Vikas
    Commented Aug 6, 2022 at 7:29
  • Yes. They are saying that all interfaces support 75 Hz. Analogue interfaces such as VGA do look worse than digital ones, though.
    – Mokubai
    Commented Aug 6, 2022 at 7:31
