8
\$\begingroup\$

All around me are a bunch of electronics: digital clocks, my laptop, a refrigerator, a dimmable flashlight, and more. What they all have in common is perceptible flickering of their displays due to PWM, especially when I make quick eye movements (i.e. normal everyday vision).

I've played around with PWM and LEDs before; flickering becomes comfortably imperceptible at around 1000 Hz, which is trivial for a microcontroller to generate -- barely even a consideration.

I realize some devices may be governed by mains frequency, but as far as I know, a lot of my electronics use filtered DC power.

Why can't every LED display be designed so that no flickering occurs?
I can think of a few reasons for our current situation:

  • We have a bunch of lazy engineers
  • Cost reasons -- maybe they're using the absolute cheapest microcontroller available to save a few pennies
  • Efficiency -- I know PWM is more efficient than constant-current drive, and I guess the higher the frequency, the closer it is to constant current (can I assume that?), but I'd be surprised if there were a major difference between 100 Hz and 1000 Hz.
  • I am literally the only person bothered by this.

Thoughts, anyone? I do hope I'm not the only one.

\$\endgroup\$
4
  • \$\begingroup\$ EMI - each edge produces EM noise and the fewer edges per second, the smaller the EMI generated. \$\endgroup\$
    – Andy aka
    Commented Aug 20, 2014 at 9:39
  • 1
    \$\begingroup\$ Sometimes what looks like PWM is phase-angle control (PA) of the mains, or of AC stepped down from the mains supply. This is by its nature 50/60 Hz, depending on location. \$\endgroup\$
    – Spoon
    Commented Aug 20, 2014 at 10:37
  • \$\begingroup\$ Isn't this off-topic because it is asking for opinion? I am sceptical that you are seeing 1000Hz signals, because I believe that is about 10x faster than I had previously believed human eyes are capable of. After all, we are merely electrochemical machines. I believe birds are faster, which makes sense, but AFAIK, they are way short of 1kHz. However, it may be some artefact of two different frequencies 'beating'. Another possibility is it is a symptom of a disease. I'd recommend getting in touch with a local eye specialist, or university medical department. \$\endgroup\$
    – gbulmer
    Commented Aug 20, 2014 at 11:35
  • 1
    \$\begingroup\$ @gbulmer What you're referring to is the flicker fusion threshold - the rate at which distinct images "fuse" and are perceived as smooth motion. Humans' threshold is somewhere around 60 Hz (but peripheral vision can detect flicker above this, which is why magnetic ballasts can cause headaches and eyestrain), whereas birds and dragonflies are known to be at least 100 Hz, perhaps a few hundred Hz. Ref1, Ref2 \$\endgroup\$
    – JYelton
    Commented Aug 20, 2014 at 16:27

4 Answers

5
\$\begingroup\$

It's not really PWM; rather, it's multiplexing of the displays. I won't go over the advantages of multiplexing in detail here, but the advantage is not power efficiency; it's a reduction in the cost and complexity of the drive components. A few cheap parts can drive a 4-digit LED display (32 segments) with only 12 port pins (on a single-sided PCB if necessary).

Most products of this kind will use an 8-bit processor rather than some 32-bit part, and usually at a relatively low clock frequency such as 4 or 8 MHz. They will generally not be equipped with a hardware display controller, so an ISR will do the work. If other things have higher priority, the digit brightness may be visibly modulated due to jitter in the multiplexing -- some level of that would be deemed acceptable even if not entirely imperceptible. The same goes for flicker in the display. Even so, the micro might spend more than 20% of its bandwidth just controlling the display. A faster clock would mean more power consumption in the micro, more EMI, and more cost. For an 8-digit display muxed at 200 Hz, a new digit must be handled every 600 µs or so, +/- 30 µs (that would be a pretty high-quality display for an application without vibration). With a lot of vibration, maybe 5x faster.
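To make that timing concrete, here is a minimal sketch of the kind of refresh ISR involved, assuming a hypothetical 8-bit micro; SEG_PORT and DIGIT_PORT are invented register names, and the real ones depend on the part:

    /* Refresh ISR for an 8-digit multiplexed LED display at 200 Hz.
     * The timer fires every ~625 us (8 digits x 200 Hz = 1600 interrupts/s),
     * lighting one digit at a time.
     */
    #include <stdint.h>

    #define NUM_DIGITS 8

    volatile uint8_t seg_buffer[NUM_DIGITS];  /* segment patterns to show */

    extern volatile uint8_t SEG_PORT;    /* hypothetical segment output port */
    extern volatile uint8_t DIGIT_PORT;  /* hypothetical digit-select port   */

    void display_isr(void)
    {
        static uint8_t digit;

        DIGIT_PORT = 0x00;                    /* blank while switching: avoids ghosting */
        SEG_PORT   = seg_buffer[digit];       /* present the next digit's segments */
        DIGIT_PORT = (uint8_t)(1u << digit);  /* enable only that digit */

        digit = (uint8_t)((digit + 1u) % NUM_DIGITS);
    }

If this ISR gets delayed by higher-priority work, the currently lit digit stays on longer than its neighbours -- exactly the jitter-induced brightness modulation described above.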

Although a designer could propose using, say, a small FPGA to eliminate the timing constraints entirely, and a 6-layer board to deal with the EMI, that would likely be their final act at a consumer-product company. The attitude there is that a 5-cent reduction in unit cost justifies hiring another engineer.

Digital LED mains powered clocks are a special case, and some use a clever biplexing scheme powering the display from an unregulated centre-tapped transformer secondary, so the mux frequency is tied to the mains frequency.

\$\endgroup\$
2
\$\begingroup\$

It's lazy engineers, period. I find it particularly annoying with regard to car taillights at night. When I sweep my eyes across the traffic, the "strobe" effect drives me nuts.

I have designed multiplexed displays, and I can tell you that it doesn't cost any more to multiplex/PWM a display at 1000 Hz than it does at 50/60 Hz.
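For what it's worth, the refresh rate in such a design usually comes down to a single timer constant. A sketch with invented numbers (a 1 MHz timer tick assumed, e.g. an 8 MHz clock with a /8 prescaler) shows that 60 Hz and 1000 Hz differ by one line:

    /* Interrupt rate = refresh rate x number of digits.
     *   60 Hz x 4 digits ->  240 interrupts/s -> reload ~ 4167 ticks
     * 1000 Hz x 4 digits -> 4000 interrupts/s -> reload =  250 ticks
     */
    #define F_TIMER_HZ   1000000UL
    #define NUM_DIGITS   4UL
    #define REFRESH_HZ   1000UL  /* change this constant and nothing else */

    #define TIMER_RELOAD (F_TIMER_HZ / (REFRESH_HZ * NUM_DIGITS))

The silicon is identical either way; all that rises with frequency is the fraction of CPU time spent in the refresh interrupt.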

\$\endgroup\$
2
  • 1
    \$\begingroup\$ My experience designing PWM displays has been that if you're only controlling a few LEDs in a multiplex, you can have a relatively high refresh frequency. But with more LEDs to refresh (given the same microcontroller and clock) things can get noticeably worse. \$\endgroup\$
    – JYelton
    Commented Aug 20, 2014 at 16:35
  • \$\begingroup\$ Totally agree about the strobe effect at night, it's immensely distracting for at least some people with high sensitivity to flicker, myself included. \$\endgroup\$
    – pcdev
    Commented Apr 11, 2018 at 23:15
0
\$\begingroup\$

At a guess (as I've never designed a PWM'd display):

  • Cost: faster processors/ICs cost more
  • Cost: higher-frequency circuits (boards, components) cost more
  • Cost: eliminating radio interference costs money; the higher the frequency, the more likely the circuit is to act like a transmitter and have unwanted side effects
  • Cost: developing high-frequency circuits that behave properly and don't generate RFI takes more effort
  • Cost: very few people notice or care that your alarm clock display flickers less, and they certainly won't pay more for that alarm clock.

Also cost. ;)

\$\endgroup\$
1
  • \$\begingroup\$ Also, on/off switching is not exact in timing. With longer pulses, the absolute error remains the same, so the relative error, which translates to a difference in perceived brightness, is reduced. \$\endgroup\$ Commented Aug 20, 2014 at 11:08
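To put rough numbers on that comment: if the switching edges are off by a fixed \$\Delta t\$, the relative error in average current for an on-time \$t_{on}\$ is about

\$\$ \frac{\Delta I_{avg}}{I_{avg}} \approx \frac{\Delta t}{t_{on}} = \frac{\Delta t}{D\,T} \$\$

so at a given duty cycle \$D\$, a longer period \$T\$ (i.e. a lower PWM frequency) shrinks the resulting brightness error.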
0
\$\begingroup\$
  1. We have a bunch of lazy microcontrollers :)
  2. Cost (everything John U said is true)
  3. I haven't found any visible difference between 100 Hz and higher frequencies (except when stroboscopic effects are involved: vibration, rotation, etc.)
  4. In many cases an embedded application uses a repetitive timer at 50, 64, 100, or 1000 Hz or so as the main time base for controlling time intervals. Often the display refresh is bound to this same timer (see the sketch after this list).
  5. Sometimes the device is clocked by a 32768 Hz crystal oscillator (and that is enough).
  6. Switching at audio frequencies can cause an undesired continuous tone from speakers or from electronic parts such as inductors.
  7. Sometimes LEDs are driven by switching voltage converters, and brightness is controlled by turning the converter on and off. Since startup takes some time, it cannot be switched too often.
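Point 4 deserves a sketch: when the display shares the system tick, its refresh rate is whatever falls out of that timer rather than a number chosen for flicker. A hypothetical example (the handler and helper names are invented):

    /* Hypothetical 1 kHz system tick doubling as the display time base. */
    #include <stdint.h>

    #define TICK_HZ      1000u
    #define REFRESH_DIV  4u   /* service one digit every 4th tick */

    volatile uint32_t ms_ticks;

    extern void display_refresh_next_digit(void);  /* invented helper */

    void systick_handler(void)
    {
        ms_ticks++;  /* general-purpose millisecond counter */

        /* 1000 Hz tick / 4 = 250 digit updates/s; with 4 digits that is
         * a 62.5 Hz full-display refresh -- squarely in the visible range. */
        if ((ms_ticks % REFRESH_DIV) == 0u)
            display_refresh_next_digit();
    }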

There's a great difference between driving an LED with a variable continuous current and driving it with PWM. Changing the instantaneous current changes not only the brightness but potentially also the emission spectrum (especially with white LEDs) and the efficiency. Controlling brightness by PWM duty cycle, by contrast, is strictly linear.
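As a worked version of that last claim, assuming an ideal switch: under PWM the LED always runs at the same peak current \$I_{pk}\$, so the average is

\$\$ I_{avg} = D \, I_{pk}, \qquad 0 \le D \le 1, \$\$

exactly linear in the duty cycle \$D\$, with the operating point (and hence the spectrum) never moving. With analog dimming, light output versus current is only approximately linear and typically needs calibration.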

\$\endgroup\$
