17

Early PCs generated an RF signal, and later composite video or S-video, so that a TV set could be used as a monitor. Why didn't the color TVs of those days expose an analog RGB interface for direct connection from a VCR, PC, or any other local device? There must be some stage in the TV where the chroma/luminance signal has already been decoded into an RGB signal and can be hacked into without costing too much.

A not-so-retro modding example:

Link:

18
  • 36
    For the same reason there were no widespread facilities for recharging electric cars in the 1960s :-)
    – dave
    Commented Apr 2, 2021 at 10:51
  • 30
    I think it’s even worse than that. It would have never occurred to anybody in the ‘60s that someone would need to plug anything into a TV set other than the aerial. I’m going to guess that if you posed, as a speculative exercise, the question “what would you need to change, and why, on a TV set for future uses other than the aerial”, the most likely answer would have been “direct control of the beam to convert it into a vector display, so you could run a personal RADAR and keep a lookout for ICBMs”. Commented Apr 2, 2021 at 11:25
  • 5
    @Thorbjørn Ravn Andersen: Computers didn't really become common in homes until later than that, maybe mid-90s, by which time computer displays (even old CRT ones) had much better image quality than TVs (which were limited by broadcast signal quality). Not all that familiar with VCRs, but I expect they were designed to output TV signals. So the answer is that by the time there was a reason for TV RGB input, no one wanted it.
    – jamesqf
    Commented Apr 2, 2021 at 16:38
  • 8
    @jamesqf Here in Scandinavia C64’s were very, very common in the mid-eighties. Commented Apr 2, 2021 at 17:49
  • 5
    @another-dave Well, there were, though they were limited to electric milk floats; in the 1960s there would certainly have been at least one or two in every town, if not more, given the number of independent milk delivery companies. There was one down the road from my parents' house, for example.
    – Dai
    Commented Apr 3, 2021 at 0:57

5 Answers

65

When colour television broadcasts began (1960s, in the UK; perhaps a little earlier in North America?) there weren't any local devices that customers might want to use. Broadcast TV was the only source of images that any home user could imagine.

Adding extra circuitry to handle separated R, G, B and sync inputs (with appropriate protections against overload etc.) wouldn't be straightforward, and certainly not cheap when receivers were generally constructed from discrete components (including thermionic valves). I'm guessing you've never disassembled a first-generation colour TV receiver?

As unit price was an important competitive element, no manufacturer would waste resources providing a feature that no customer wanted.

RGB SCART and the like were developed only when devices existed (already using the available inputs) and there was some demand to create a higher-quality picture, avoiding the modulation process. And by that time, the target displays were built using transistors, moving to greater use of ICs than discrete components.

11
  • 5
    I read years ago of a low-end consumer electronics company where (supposedly) the owner would look at a prototype new TV put together by the engineering department and say "what's this? do we really need it?" and start pulling parts out - as long as the TV still generated a reasonable (not perfect) image, the parts stayed out - every component removed = (lower price + higher market share) or higher profits. Commented Apr 2, 2021 at 14:59
  • 6
    @manassehkatz-Moving2Codidact "Madman Muntz" was the character in question; he made the first sub-$100 black and white TV set.
    – Dan Mills
    Commented Apr 2, 2021 at 16:29
  • 13
    Many older sets used a hot-chassis design. This made it necessary for them to use an RF transformer on the antenna input, but let them eliminate at least one more-expensive power transformer. Adding a headphone jack to a hot-chassis design required an audio output transformer, but that was often needed in any case. Adding a composite or RGB input would have been dangerous unless the set used a floating ground, which would have required an extra power transformer.
    – supercat
    Commented Apr 2, 2021 at 17:29
  • 2
    @Dai: Television sets needed to be constructed so that any parts that were connected to mains or high voltages could not be touched during anything resembling normal operation. On many sets, the power cord was molded into the back of the case, which would then plug into the chassis, such that unless one had a "cheater cord" one couldn't connect mains power to the unit with the back removed. If RF inputs and headphone outputs were isolated with both a capacitor and transformer, no mains voltages would be exposed outside the unit. A difficulty with providing composite or RGB video input...
    – supercat
    Commented Apr 3, 2021 at 16:57
  • 3
    @Dai: I suspect the cheapest way to add an RGB or composite video input that was isolated from a TV set's chassis would probably be to modulate the signal onto a higher frequency carrier, pass that through a transformer, and then demodulate the result. Since television sets already have demodulation circuitry built into them [see where I'm going with this...]?
    – supercat
    Commented Apr 3, 2021 at 17:05
23

Many TV designs up into the 1970s were so-called live chassis designs, which used one leg of the mains input as the reference ground. This saved materials and weight: given that some early color TVs drew 200+ watts continuously, a mains transformer would have been rather bulky and heavy, and SMPS technology was not really mature for consumer devices at that time. Some sets used a small transformer to supply some low-voltage circuitry while running other parts of the unit straight off the mains - but the common ground, even of the transformer-supplied parts, was still directly connected to the mains.

An RGB input is a DC coupled input, unlike an RF input.

Most home electrical systems do not use polarized plugs, or the correct polarization of plugs and sockets cannot be relied upon sufficiently to use it as a safety feature.

A DC coupled input needs a DC coupled ground - which, in a live chassis design, has a 50% chance (with an unpolarized plug) of being at 120V/240V mains live potential....

Thus, a live chassis design CANNOT have any DC coupled inputs or outputs to random external devices*, unless complex isolation circuitry (which is not trivial for a wideband and DC coupled signal like RGB video) is used.

(There is hearsay that a significant amount of people got injured attempting to retrofit audio outputs, RGB or composite inputs etc. to live chassis TVs back in the day.)

*Actually, there were some live chassis RADIOS too, which sometimes had special connectors for turntables or microphones that were in themselves completely insulated.... This kind of design would be considered insane today.


And there is yet another reason. Some color TV designs did not, anywhere in the circuitry, decode the received signal into RGB at all, instead taking advantage of the multiple control inputs on a CRT (e.g. cathode inputs for low-bandwidth chroma difference signals, grids for high-bandwidth luminance) so that the "final" picture was only composed within the CRT itself.
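
Out of interest, the arithmetic such a design relies on is simple enough to sketch. The following is a toy illustration only (not any particular chassis; the function name is mine, and the weights are the classic broadcast Y = 0.30R + 0.59G + 0.11B):

```python
# Illustrative sketch of the arithmetic behind color-difference drive: the set
# only demodulates R-Y and B-Y, derives G-Y from them, and the CRT combines
# luminance and color-difference drive so each gun ends up controlled by the
# original color component.

def crt_matrix(y, r_minus_y, b_minus_y):
    """Reconstruct R, G, B the way a color-difference-driven CRT does."""
    # G-Y follows from the other two, since 0.30(R-Y) + 0.59(G-Y) + 0.11(B-Y) = 0.
    g_minus_y = -(0.30 * r_minus_y + 0.11 * b_minus_y) / 0.59

    # Each gun is effectively driven by Y + (color - Y) = color.
    return y + r_minus_y, y + g_minus_y, y + b_minus_y

# Saturated red (R=1, G=0, B=0): Y = 0.30, R-Y = 0.70, B-Y = -0.30
print(crt_matrix(0.30, 0.70, -0.30))   # ~(1.0, 0.0, 0.0)
```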


Addendum: One might wonder how so much electronics could be powered "straight off the mains". Keep in mind that TVs until the late 70s often used at least some vacuum tube circuitry - and vacuum tube circuitry works great off a +150...250 VDC bus, which you can relatively easily create from the mains input without a transformer (rectified 120 V or 230 V RMS mains gives roughly 170 V or 325 V peak DC, respectively). And even in a fully semiconductor-based design, much of the power-hungry circuitry drives the CRT and the EHT inverter (usually combined with the horizontal output stage into one circuit) - this is typically not low-voltage circuitry either, which is the reason vacuum tube circuits were used for it for a long time: high-voltage transistors or thyristors were LESS economical to use then. Fascinatingly, there were production color TV designs that used ICs and vacuum tubes together in one chassis.

7
  • Could one have designed a set to drive the cathodes from phase-shifted chroma signals, and simply feed the received video to the grid, without bothering to demodulate chroma anywhere except in the tube? Picture quality would probably not be very good, but the amount of circuitry for that set could probably be slashed to half what would be required to properly demodulate a color signal into Y, U, and V components.
    – supercat
    Commented Apr 5, 2021 at 16:47
  • "so called live chassis designs" - any good descriptions of this? I have heard of this but still don't totally understand it. Commented Apr 6, 2021 at 14:36
  • @supercat I think that is more or less what some designs did :) Commented Apr 7, 2021 at 3:45
  • @MauryMarkowitz In non-technical terms: The whole damn thing, everything in it, is directly connected to wall power. Touch anything metallic that is part of the circuit - including a connector - and get bit. Commented Apr 7, 2021 at 3:47
  • 1
    @supercat for audio, yes. An isolation amplifier for RGB video is in a very different league. One way to do it would be to use a carrier wave ... oh wait, that is what using an RF modulator essentially does :) Commented Apr 7, 2021 at 18:43
23

Early colour TVs predated VCRs and home computers by many years. Even if it did not cost much, adding an RGB input would still be a cost for something that no one would use. However, it would have been more complex and expensive than you might expect today.

6
  • 7
    Toby's answer covers this and more.
    – badjohn
    Commented Apr 2, 2021 at 12:24
  • 3
    More isn't always better. Commented Apr 2, 2021 at 18:43
  • 1
    Adding video baseband inputs to a hot-chassis television set would have been expensive, requiring either a more expensive floating-chassis design or else modulating the signal to allow it to be fed through a narrow-band transformer. The cost savings for hot-chassis designs have gone down with time, making them rare nowadays outside small electronic devices like those found in kitchen appliances, wireless remote light switches, etc. but in the 1970s they would have been quite significant.
    – supercat
    Commented Apr 3, 2021 at 20:40
  • @supercat Indeed. I should have added "even if it did not cost much". I'll edit it later when I have a more capable device.
    – badjohn
    Commented Apr 3, 2021 at 21:12
  • @badjohn While Toby Speight's answer does more expansively cover the same issue, it was posted after this one. It's normally not considered negative to this answer to have one which is posted later which states basically the same thing. In the generic case, it's possible that a subsequent answer could cover similar points and in the process show that the earlier answer completely missed something (which could be a negative). That is, however, not the case here, IMO.
    – Makyen
    Commented Apr 4, 2021 at 22:14
11

TV manufacturers didn't have a single, obvious RGB connection standard to implement. Physically, there were SCART (with competing European and Japanese pinouts), RCA, DE-9, and various manufacturer-specific DIN plugs to choose from.

Then you have the various electrical signals to send over them such as RGBS, RGsB, RGBHV, YPrPb, digital RGBI, etc.

And VCRs didn't even need RGB because Y/C was good enough (in fact, I think Y/C is good enough for early computers, too), and if you were recording off the air, the quality wasn't that great to begin with. (Prerecorded videocassettes didn't become affordable until later.)

While this was all being sorted out, device manufacturers typically provided an RF modulator that could work with any TV through the antenna connector every television was already equipped with (for example, 300-ohm twin-lead screw terminals or a coax connector of some sort). So as far as the TV manufacturers were concerned, their job was already done. Providing more connectors for external devices quickly hits diminishing returns.
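
As a rough illustration of what that device-side RF modulator did, here is a toy numerical sketch. The specific numbers are assumptions for illustration (NTSC channel 3's 61.25 MHz vision carrier, simple negative amplitude modulation); real modulators also did vestigial-sideband filtering and added an FM sound subcarrier, which this ignores:

```python
import numpy as np

# Toy sketch of an RF modulator: shift baseband video up to a TV channel's
# vision carrier so the set's ordinary tuner/demodulator can receive it.

SAMPLE_RATE = 500e6          # samples per second (well above the carrier)
CARRIER_HZ  = 61.25e6        # NTSC channel 3 vision carrier (illustrative)

def rf_modulate(video, depth=0.875):
    """Amplitude-modulate a 0..1 baseband video signal onto the carrier."""
    t = np.arange(len(video)) / SAMPLE_RATE
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
    # Negative modulation, as used for analog TV: larger video values reduce
    # the carrier amplitude (sync tips at maximum, peak white near minimum).
    envelope = 1.0 - depth * video
    return envelope * carrier

# One scanline's worth of a dummy ramp signal, just to exercise the function.
line = np.linspace(0.0, 1.0, int(SAMPLE_RATE * 52e-6))   # ~52 us active line
rf = rf_modulate(line)
```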

17
  • Screw connectors? I only remember push connectors. I recently disconnected my mother's TV, the aerial was just a push connector.
    – badjohn
    Commented Apr 2, 2021 at 21:36
  • @badjohn Push connectors are cheaper, but prone to unintended disconnection or shifting. Real installers use connectors with screw-on collars :)
    – Armand
    Commented Apr 2, 2021 at 21:46
  • Indeed, screw connectors are better but I've not seen them on old analogue TVs
    – badjohn
    Commented Apr 2, 2021 at 21:50
  • 1
    @badjohn Here in North America, we started on the 300 ohm twin-lead stuff and migrated incrementally to 75 ohm coax with F connectors. If my parents had bought a TV from the "black plastic housing" era or newer, it'd have only had an F connector. We still use F connectors for TV cable here and I remember reading somewhere that, at the frequencies we use, F connectors offer superior noise rejection to those PAL-region ones you're familiar with... judging by the Wikipedia F connector page, it may be the screw-on vs. push-on thing.
    – ssokolow
    Commented Apr 3, 2021 at 12:08
  • 1
    @snips-n-snails As far back as I can remember, so middle 1960s, TV aerials have used the same type of connectors. The quality has varied from ones that fall apart if you breath on them or which won't make a decent connection to pretty solid ones with good connections. However, the format has remained the same. I could take an early 1960s TV and connect it to a recently installed aerial feed (I wouldn't get a picture since there are no analogue signals, especially not in the VHF band). Conversely, I could connect a modern TV to an aerial installed in the 60s and I might get a picture.
    – badjohn
    Commented Apr 3, 2021 at 16:51
6

The question mentions “early PCs” that generated a TV-compatible RF signal and “the color TVs of those days”. This would be a period spanning from mid-to-late 1970s to mid-1980s. The computer systems in question would be microcomputers targeted at the home market.

By the late 1970s, new TV sets were already transistor- and IC-based designs, instead of tube-based designs with a hot chassis. Baseband CVBS and audio inputs (bypassing the RF tuner) had started appearing on some select models. These were originally meant for connecting a home VCR — a novel thing which was just beginning to be affordable and commonplace. But by the early 1980s, the early home computers, early home video cameras, and early video game consoles were also starting to utilize and drive up the demand for baseband inputs.

In the European market, the go-to baseband AV connector was, at first, often some variant of the round, multi-pin DIN connector, such as the one on this 1978 Grundig Super Color 8642. But this was a relatively short period. Due to an alleged French attempt at protectionism, European TV sets soon started standardizing on the larger, rectangular, multi-pin SCART connector, invented by the French.

Since the SCART connector specified RGB inputs in addition to CVBS (with an overlay capability, no less!), by the mid-to-late 1980s many European TV sets sporting a SCART connector effectively doubled as 15 kHz RGB monitors.
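
That overlay capability comes from SCART's fast-blanking line (pin 16): the external device can switch the set between its own decoded picture and the RGB inputs on a per-pixel basis, which is how an external box can superimpose its own RGB graphics on the picture. A toy sketch of the idea, with plain arrays standing in for the two video sources and a hypothetical helper function of my own:

```python
# Per-pixel switching between the set's own picture and the RGB input,
# controlled by the SCART fast-blanking signal (pin 16).

def scart_overlay(tv_picture, rgb_input, fast_blanking):
    """Wherever fast blanking is asserted, show the RGB input instead."""
    return [
        [rgb if blank else tv for tv, rgb, blank in zip(tv_row, rgb_row, blank_row)]
        for tv_row, rgb_row, blank_row in zip(tv_picture, rgb_input, fast_blanking)
    ]

# 1x4 "frame": overlay white pixels onto a dark broadcast picture.
broadcast = [[(10, 10, 10)] * 4]
overlay   = [[(255, 255, 255)] * 4]
blanking  = [[0, 1, 1, 0]]          # asserted only where the overlay has content
print(scart_overlay(broadcast, overlay, blanking))
```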

There were exceptions, of course. The cheaper, portable TVs still often only connected the CVBS and audio pins to their SCART connectors, leaving the RGB pins inoperative. And even many larger TVs — often equipped with multiple SCART inputs — commonly only had the RGB capability implemented on their primary SCART connector.

In the 1980s, there were also a lot of older, vacuum-tube-based sets from the previous decades still in use with no connectors for external devices, except for the RF signal input.

For such reasons, every manufacturer aiming to reach the homes and interface with the installed base had to provide an RF modulator, at least as the lowest common denominator option, and design their system around TV-compatible 15kHz timings.

RGB-capable SCART connectors also found their way onto actual (15 kHz, “CGA-level”) computer monitors. Popular European examples of such monitors include the Philips CM8833, the Commodore 1081, and the Commodore 1084, all of which could be used both as an RGB computer display and as a dedicated (baseband) video monitor for purposes such as video editing or monitoring a CCTV system. (Pro video people would use yet higher-quality video monitors with more broadcast-oriented features, such as Sonys or Ikegamis, but these entry-level monitors were good enough for security and prosumer/videographer purposes.)

One of the things that might have contributed to making RGB inputs a “natural thing” in Europe was the popularity of the Teletext system. By the end of the 1980s, a Teletext decoder (which includes a built-in RGB character generator that can sync to an external video source and superimpose the generated text/graphics on the live video) had become a standard feature on European sets. Supporting such a chip in the design is only a small step away from providing external RGB inputs. Then again, American TV sets had built-in closed-caption decoders which employed similar CG technology — and around this time, TVs also started getting crude on-screen menus which (I believe) often initially used the Teletext or CC CG chip for their video output.

Be that as it may, due to the NIH syndrome and other market-related factors (SCART RGB was basically forced on the European manufacturers by the French, but North America had no similar regulation or market pressure), American SD/CRT TV sets never got RGB inputs as a standard feature.

However, even non-European manufacturers were finally forced to add something functionally equivalent to RGB on their TVs, in the form of component (Y′Pb′Pr) inputs. This was due to the introduction of the DVD standard. DVD players required a better signal type than composite video (CVBS) or S-video (Y/C) to make the improvement in image quality they provided discernible.

European DVD players, of course, did not use component (Y′Pb′Pr) signals but had a SCART RGB connector on the back, for the best compatibility with the European TV sets. (Or rather, manufacturers usually supported both RGB and Y′Pb′Pr signals through the same pins so they could just ship the same PCB and case with a different back panel to different markets, and you could choose the output mode in the configuration menu.) Similarly, European game consoles (the fourth and fifth generation) often came with an RGB SCART cable, or had one available as an option in Europe whereas the American versions would have offered a component video cable in its place.

In conclusion, TV manufacturers added RGB signal inputs (or Y′Pb′Pr signal inputs, which are just another way of dividing the signal into three components and have comparable quality) when market demand or local regulations so required — not any sooner, and not any later. Europeans got a head start due to the French making it a legal requirement (which was a good thing from the perspective of a home computer hobbyist), but free-market-driven development in other parts of the world only saw RGB-level signal quality become a necessity on a domestic TV after the introduction of the DVD standard.
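
The “another way of dividing the signal into three components” point can be made concrete: Y′Pb′Pr is just an invertible linear recombination of the same R′G′B′ values, so a set can recover RGB from it exactly. A small sketch using the standard-definition (BT.601) coefficients; the helper functions are mine, for illustration only:

```python
# Component (Y'Pb'Pr) and RGB carry the same information: they are related by
# an invertible linear transform (BT.601 coefficients for SD video).

def rgb_to_ypbpr(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    pb = (b - y) / 1.772          # = 0.5 * (B' - Y') / (1 - 0.114)
    pr = (r - y) / 1.402          # = 0.5 * (R' - Y') / (1 - 0.299)
    return y, pb, pr

def ypbpr_to_rgb(y, pb, pr):
    r = y + 1.402 * pr
    b = y + 1.772 * pb
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

# A round trip on an arbitrary color comes back unchanged (up to rounding).
print(ypbpr_to_rgb(*rgb_to_ypbpr(0.25, 0.50, 0.75)))
```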

2
  • SCART was introduced in France in 1980, not mid to late 80s. All TV sets sold in France were required to feature a SCART connector. SCART enabled the introduction of Pay-TV with the creation of Canal+ in 1984, which required an external decoder to watch its programs (including p0rn). Commented Apr 6, 2021 at 9:08
  • 1
    By “mid-to-late 1980s” and “European TV sets” I referred to new TV sets produced for the (Western bloc) European local markets in general, not only France. While the manufacturers eventually went “the French way”, SCART was never a legal requirement in other European countries.
    – Jukka Aho
    Commented Apr 11, 2021 at 12:17
