64
$\begingroup$

Most modern-day Cathode Ray Tube (CRT) televisions manufactured after the 1960s (after the introduction of the NTSC and PAL standards) supported circuit-based decoding of color signals. It is well known that the new color standards were created to permit the new TV sets to be backwards compatible with the black and white broadcasts of the day (among numerous other legacy features they scrupulously preserved). The new color standards added the color information on a subcarrier at a higher frequency, transmitted during the same line period as the luminance. The color decoder is synchronized by a short reference burst of this subcarrier, known as the colorburst, transmitted just after the beginning of each horizontal line.

It would seem that when you feed noise into a television, the TV should produce not only black and white noise but also color noise, since there would be apparent color information on each horizontal line of every frame. But this is not the case: color TVs still show black and white noise!

Why is this the case?


Here is an example signal of a single horizontal scan.

An image to illustrate where the chrominance is stored and how it alters the color (chroma) of the black and white (luma) picture.

And here is the resulting picture if all horizontal scans are the same (you get bars!).

The resulting picture.
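
For concreteness, here is a minimal numeric sketch of how such a composite line is built up. The sample rate, timings, and signal levels below are illustrative assumptions rather than exact NTSC/PAL values.

```python
# Illustrative composite scan line: sync pulse, colorburst, then active
# picture with chroma riding on the luma. Timings/levels are assumptions.
import numpy as np

FS = 13.5e6              # sample rate in Hz (assumed)
F_SC = 4.43361875e6      # PAL color subcarrier frequency
N = int(64e-6 * FS)      # samples in one 64 us line
t = np.arange(N) / FS

line = np.full(N, 0.3)                    # flat mid-gray luma level
line[:int(4.7e-6 * FS)] = 0.0             # horizontal sync pulse

# Colorburst: ~10 cycles of bare subcarrier on the back porch, used by the
# receiver as a phase/frequency reference (it carries no picture color)
b0, b1 = int(5.6e-6 * FS), int(7.9e-6 * FS)
line[b0:b1] += 0.15 * np.sin(2 * np.pi * F_SC * t[b0:b1])

# Active picture: color is a modulated subcarrier added on top of luma;
# its phase encodes hue and its amplitude encodes saturation
a0 = int(10.5e-6 * FS)
line[a0:] += 0.2 * np.sin(2 * np.pi * F_SC * t[a0:] + np.pi / 3)
```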

$\endgroup$
7
  • 3
    $\begingroup$ First things first: the colorburst does not contain the color signal; it is used to keep the chrominance subcarrier synchronized $\endgroup$
    – Ben
    Commented Jan 15, 2018 at 2:45
  • 10
    $\begingroup$ Only guessing: but is it possible that since there is no colorburst, the chrominance PLL does not lock and does not attempt to extract the color signal, hence there is only a noisy luminance signal... $\endgroup$
    – Ben
    Commented Jan 15, 2018 at 2:49
  • 6
    $\begingroup$ Older televisions without an effective "colour killer" circuit would indeed display multi-colour static. $\endgroup$
    – psmears
    Commented Jan 15, 2018 at 13:30
  • 4
    $\begingroup$ We had a TV that produced color noise. $\endgroup$
    – Joshua
    Commented Jan 16, 2018 at 16:37
  • 4
    $\begingroup$ I can confirm that an unlocked subcarrier PLL killed all the chroma decoding on all sets I worked on. As well as providing a lock source for the chroma subcarrier regeneration, the burst was also used to estimate an average amplitude for the chroma sidebands and, in PAL systems, to identify the lines on which the R-Y signal was inverted. $\endgroup$ Commented Jan 17, 2018 at 0:19

4 Answers

68
$\begingroup$

The color burst is also an indicator that there is a color signal.

This is for compatibility with black and white signals: no color burst means a B&W signal, so only the luminance signal is decoded (no chroma).

No signal, no color burst, so the decoder falls back to B&W mode.

The same idea applies to FM stereo/mono: if no 19 kHz pilot subcarrier is present, the FM demodulator falls back to mono.
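
A rough sketch of that gating logic in Python (the burst window, sample rate, and detection threshold are assumptions for illustration, not values from any real chroma decoder):

```python
import numpy as np

FS = 13.5e6              # sample rate (assumed)
F_SC = 4.43361875e6      # PAL subcarrier frequency

def burst_amplitude(line):
    """Estimate subcarrier amplitude in the (assumed) back-porch window."""
    b0, b1 = int(5.6e-6 * FS), int(7.9e-6 * FS)
    t = np.arange(b0, b1) / FS
    w = line[b0:b1]
    # Quadrature correlation picks out the component at F_SC
    i = np.dot(w, np.cos(2 * np.pi * F_SC * t))
    q = np.dot(w, np.sin(2 * np.pi * F_SC * t))
    return 2 * np.hypot(i, q) / len(w)

def decode_mode(line, threshold=0.05):
    # No detectable burst (noise, or a B&W broadcast): kill the chroma
    # path and decode luminance only, as described above.
    return "color" if burst_amplitude(line) > threshold else "B&W"
```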

$\endgroup$
27
$\begingroup$

In the absence of a valid color burst signal, the "color killer" circuit disables the color difference signals; otherwise you would indeed see colored noise. This is mainly intended for displaying weak signals in B/W without the colored noise.

One step further is to mute the entire signal, substitute stable sync signals, and display a blue or black field with a nice "no signal" message.

Noisy sync signals can damage the line transistor through breakdown due to excessive voltage. TV engineers learn to hate noise. I once designed digital algorithms for cleaning up a line sync signal before it reaches the transistor.
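
The details of those algorithms aren't given here, but one common digital approach is a "flywheel": a free-running line counter that only re-locks on detected sync edges arriving near where it expects them. A hedged sketch of the idea (line period and acceptance window are assumed values, not the actual design):

```python
LINE_PERIOD = 864   # samples per 64 us line at 13.5 MHz (assumed)
WINDOW = 20         # accept edges within +/- this many samples

def clean_sync(raw_edges, n_lines):
    """Return n_lines cleaned sync times from noisy detected edges."""
    cleaned = []
    expected = raw_edges[0]
    for _ in range(n_lines):
        # Re-lock on a nearby detected edge, otherwise coast on the
        # prediction so noise spikes never reach the line transistor
        near = [e for e in raw_edges if abs(e - expected) <= WINDOW]
        expected = near[0] if near else expected
        cleaned.append(expected)
        expected += LINE_PERIOD
    return cleaned
```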

$\endgroup$
4
$\begingroup$

In PAL, the colour information (chrominance or chroma) is modulated onto the black and white (luminance or luma) baseband signal. The chroma is at a ~4.4 MHz offset from DC and is about 1.3 MHz wide.

Assuming that your noise is centered around DC, if it is less than ~3.5 MHz wide then it won't appear in the chroma spectrum and will only be in the luma. Hence you see the noise as appearing in black and white.
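
This is easy to check numerically. A sketch with illustrative values (the low-pass filtering is done crudely in the frequency domain, purely for demonstration):

```python
import numpy as np

fs = 27e6                                   # sample rate (assumed)
noise = np.random.default_rng(0).standard_normal(1 << 16)

# Crude low-pass to ~3.5 MHz by zeroing FFT bins (illustration only)
spec = np.fft.rfft(noise)
freqs = np.fft.rfftfreq(noise.size, 1 / fs)
spec[freqs > 3.5e6] = 0
band_limited = np.fft.irfft(spec, noise.size)

# Compare energy in the chroma band (~4.43 MHz +/- 0.65 MHz)
chroma = (freqs > 3.78e6) & (freqs < 5.08e6)
for name, x in [("wideband", noise), ("band-limited", band_limited)]:
    energy = np.sum(np.abs(np.fft.rfft(x))[chroma] ** 2)
    print(f"{name}: chroma-band energy = {energy:.3g}")
# The band-limited noise has essentially zero energy where a decoder
# would look for chroma, so it can only show up as luma.
```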

If you had noise that extended into the chroma spectrum then you would see colour noise as well.

It gets more interesting because the luma and chroma spectra overlap. Very sharp details in the luma appear as high frequencies that fall within the chroma spectrum, so they show up as colour noise in the picture. The luma signal is therefore often filtered to remove these high-frequency components so that they do not appear as noise on the chroma when you decode. The same happens in reverse, with chroma appearing as part of the luma signal when decoded. Look up "cross-luminance" and "cross-chrominance" if you are curious and fancy a maths challenge...
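
One classic separation trick is a comb filter built around a line delay. In its simplest NTSC-style form (where the subcarrier phase inverts between successive lines; PAL needs a longer delay), a sketch looks like this:

```python
import numpy as np

def comb_separate(prev_line, cur_line):
    """Split two adjacent composite lines into luma and modulated chroma.

    Assumes NTSC-style line-to-line subcarrier phase inversion: where the
    picture content of the two lines matches, the sum cancels the chroma
    and the difference cancels the luma.
    """
    luma = (prev_line + cur_line) / 2
    chroma = (cur_line - prev_line) / 2
    return luma, chroma
```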

This is all rapidly becoming less of an issue, however, as the older analogue transmission standards are being discontinued in the move to all digital transmission. The same issues do not exist in MPEG and similar standards (although they bring in all new and exciting issues of their own).

$\endgroup$
1
  • 1
    $\begingroup$ 'The chroma is at ~4.4 MHz': 4.43361875 MHz +/- 0.75 Hz. $\endgroup$ Commented Jan 17, 2018 at 0:24
2
$\begingroup$

To add to the existing answers, PAL corrects for color errors by inverting the phase of the V (R-Y) part of the color signal on alternate lines; averaging a line with its neighbour in the decoder cancels out phase errors and should also reduce the color component of random noise. While this effectively halves the vertical color resolution, PAL has more lines to start with than NTSC, and the eye doesn't have that high a color resolution either, so it's not as noticeable. Because of this I don't know if PAL even has a "color killer" circuit. However, on cheap PAL implementations that don't follow the standard/license this may be different. Source.
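
A small numeric illustration of that cancellation, treating the chroma on one line as a complex number U + jV (all values below are arbitrary examples, not values from any standard): a phase error rotates both lines, but after the decoder un-flips the V-switched line and averages the pair, the hue is preserved and only the saturation drops slightly.

```python
import cmath

u, v = 0.3, 0.2            # transmitted color-difference values
err = cmath.pi / 12        # 15 degrees of differential phase error

line_a = complex(u, +v) * cmath.exp(1j * err)   # normal line
line_b = complex(u, -v) * cmath.exp(1j * err)   # V-switched line

# Decoder conjugates (un-switches) line_b, then averages the pair:
recovered = (line_a + line_b.conjugate()) / 2
print(recovered)           # == (u + vj) * cos(err): correct hue,
                           # slightly reduced saturation
```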

$\endgroup$
