
The IBM 5153, the color monitor accompanying the original IBM PC, was designed to work with the CGA graphics card, which has a vertical resolution of 200 scan lines. So far, unsurprising and straightforward.

However, as I understand it, the horizontal and vertical frequencies conformed to NTSC (presumably in part to tap existing economies of scale on the relevant electronic components, and in part to support CGA's ability to output to a TV set, which looked necessary at the time, though as far as I know nobody ever ended up using it).

NTSC actually provides for a vertical resolution of up to 240 scan lines (non-interlaced); it's only thought of as 200 because that's the title-safe area, since many old TV sets obscured much of the rest of the picture. But presumably, if you were designing a monitor for the purpose, you could provide 240 visible scan lines. And as I understand it, this would not make the engineering any more difficult, because the scan lines are output at the same frequency either way; you're just using more of each frame.

If the above is correct and it would really have cost nothing extra to provide 240 scan lines, why did the 5153 only provide 200?

Was it because CGA was designed to stay within the title-safe area of a TV set?

Was it because it would have bumped the video RAM over 16K, thereby requiring the use of more RAM chips and making the card more expensive?
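(For concreteness, a rough back-of-the-envelope check of that last possibility, assuming the 4-color, 320-pixel-wide graphics mode at 2 bits per pixel:)

    #include <stdio.h>

    /* CGA's 320x200 4-color mode packs 2 bits per pixel. */
    int main(void) {
        long bytes_200 = 320L * 200 * 2 / 8;   /* 16,000 bytes - just fits in 16K (16,384) */
        long bytes_240 = 320L * 240 * 2 / 8;   /* 19,200 bytes - over 16K, so more RAM chips */
        printf("200 lines: %ld bytes\n", bytes_200);
        printf("240 lines: %ld bytes\n", bytes_240);
        return 0;
    }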

Was it just because people were used to 200, and nobody saw a need for more?

Or was there some other reason?

  • The ability to display on a TV wasn't the only reason that CGA included a composite output: you could also use it with standard composite monitors, video recorders, and video projection systems. While composite output might have ended up being used less than IBM imagined, it was still used quite a bit, and in any event, using 240 scan lines wouldn't have been seen as a viable option until years later, when it became evident (in retrospect) that the composite connector was little used. Basically, 240 scan lines wasn't an option because they couldn't see the future.
    – Ken Gober
    Commented Jun 25, 2018 at 14:29

2 Answers


TL;DR: It's a sweet spot of optimization around the ability to display 25 lines of text.

(Why this is important was discussed some time ago in an answer to your question about why 80x25 became the standard.)


Preface: As usual with such decisions, many factors were involved - and most of them are not hard constraints but variable within a certain frame and in relation to the others. So there may be more details than are covered here.


For CGA, one goal was the ability to display 25 lines of 40/80 characters. With 200 scan lines, this allows 8 scan lines per character cell - something that fits nicely with binary counters addressing character ROMs, with RAM and ROM sizes, and with programs calculating screen addresses.
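(To illustrate the addressing point - a minimal sketch, with hypothetical variable names, of the shift-only arithmetic that an 8-line cell allows:)

    #include <stdio.h>

    /* With 8 scan lines per character cell, addressing stays pure
       shifts and adds - no multiplier needed anywhere. */
    int main(void) {
        unsigned ch = 'A', scanline = 3;            /* glyph, and one of its 8 rows */
        unsigned rom_addr = (ch << 3) | scanline;   /* ch * 8 + scanline, as a shift */

        unsigned row = 10, col = 5;                     /* text cell in 80-column mode */
        unsigned cell = (row << 6) + (row << 4) + col;  /* row * 80 = row*64 + row*16 */

        printf("font ROM address: %u, screen cell offset: %u\n", rom_addr, cell);
        return 0;
    }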

It's the same reason the Apple II got 192 lines: Woz was targeting a 24-line text display. IBM, in contrast, needed to go for 25 lines to make it possible to display their mainframe screens - the 3270 offered 24 lines of user display plus one line of status information (*1).

Offering more than 25 lines of text (like 30 with 240 scan lines) would have made programs converted from common 24/25-line displays problematic, as it's not always easy to support the additional screen real estate. Where a text processor might benefit, a 3270 session would look odd - at the least.

While adding more text lines is problematic from a compatibility standpoint, adding more scan lines per text line would have been doable - say, 9 for 225 lines or 10 for 250 (*2). While allowing greater detail on screen, it would have required more complex hardware (*3), increasing the cost. In addition, it would have broken the nice 8-by-8 character cell - something that again simplifies hardware and programming.

While extending a scan line to 720 pixels might have been possible as well (as with more lines), it would again have required more complex hardware - not least even bigger character ROMs - much like with the lines.

Another way could have been inserting a blank scan line after every 8 lines by default. While this might have improved text display (a bit), it would have made continuous graphics impossible.

With different goals (no graphics, only text), IBM's MDA did use 720 by 350 with a 9-by-14-pixel character cell on quite similar hardware. Here, though, the focus was on a clear text-only display.

So, bottom line: 640/320 by 200 is a sweet spot for optimizing a display around text and graphics within the capabilities of displays based on TV components.


*1 - This 24-line user area of the 3270 is also the reason why Woz (and many others) went for a 24-line text display.

*2 - 250 is still well within the ability of standard components. After all, CGA was meant to be displayed on a specially made screen - just one built from standard (aka cheap) components.

*3 - Counting to 9 or 10 instead of 8 requires more complex counters and comparators, unlike the easy binary power of two (2^3), where just a counter is needed. Similarly, in software it would require a multiplication, while the value 8 just needs a shift operation. While the 6845 used here can handle this, the ROMs need to be larger - depending on the hardware design, maybe even double the size just to accommodate a single additional scan line per character. Similarly for an additional horizontal pixel per character, eventually resulting in quadrupled ROM size. And in full graphics mode, addressing gets really weird.
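(A minimal sketch of the footnote's cost argument - variable names are hypothetical, and the doubling assumes the per-glyph stride is rounded up to the next power of two to keep decoding simple:)

    #include <stdio.h>

    /* Cell heights that are powers of two keep addressing to shifts;
       9 or 10 scan lines per cell force a real multiply. */
    int main(void) {
        unsigned row = 17;
        unsigned addr8 = row << 3;   /* 8 lines/cell: a single shift   */
        unsigned addr9 = row * 9;    /* 9 lines/cell: needs a multiply */
        printf("offsets: %u vs %u\n", addr8, addr9);

        /* 256 glyphs x 8 bytes = 2K ROM; at 9 rows, rounding the
           per-glyph stride up to 16 for simple decoding gives 4K. */
        printf("font ROM: %u vs %u bytes\n", 256u * 8, 256u * 16);
        return 0;
    }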

  • Good point, and I suppose you're right - though oddly, if you use the plain text-mode video card, the character matrix is apparently 9 by 14. But that card and CGA were probably designed by different teams with different priorities.
    – rwallace
    Commented Jun 23, 2018 at 9:55
  • @rwallace They are quite similar in using a 6845 at the core, so I wouldn't be surprised if it was the same team, or two working closely together - but yes, for sure with different development goals (within the same framework).
    – Raffzahn
    Commented Jun 23, 2018 at 9:58
  • Using a nine-row font would require a larger font ROM chip than an 8-row font would, but it would not have affected the manufacturing cost of the CGA, since it already used a ROM chip containing not just an 8-row font but also a second, unused 8-row font and the 14-row font used in the MDA (manufacturing one kind of 8Kx8 ROM chip and using it in both CGA and MDA was likely cheaper than manufacturing two different kinds of ROM, even though the latter ROMs would only need to be 2K or 4K each). From a clocking standpoint, the 6845 chip could handle any font dimension up to 16.
    – supercat
    Commented Jun 23, 2018 at 20:17
  • It's also worth noting that the Tandy 1000 configures the display controller to show nine scan lines per row, shrinking the top and bottom overscan areas to compensate; Tandy's monitors have the vertical size adjusted down to ensure everything fits on screen. Replacing the 8Kx8 ROM with one holding different content and moving a couple of address connections would make the CGA configurable for 8- or 9-row text with no extra components.
    – supercat
    Commented Jun 23, 2018 at 20:23

One thing I have seen is that with the reprogrammability of the 6845 CRTC on the CGA - and more so its workalike on the EGA - you can indeed put more than 200 lines on the 5153 if you really want to. To make it work on CGA you have to either reduce the horizontal resolution or use 40-column text mode, because of memory constraints, but it's doable. However, it's not really worth much: you can fit maybe an extra couple of text lines, like one above and one below, for 216 scan lines, before starting to hit the bezel. Similarly, you can push the horizontal resolution somewhat, with the same caveats, though there's a little more headroom to be had that way (maybe in the same proportion?) than vertically. And of course, with the later adapter you can do both.

It's such a meaningless change, though, and potentially incompatible, that hardly anyone ever bothered using that extra space in anger, other than maybe the makers of certain upgrade graphics cards - and then only really in the service of emulating higher-resolution text or graphics standards (MDA, EGA-hi, Hercules...) with less compression/line-dropping/no interlace, or offering mildly expanded abilities, mostly in pure text mode (e.g. the much-coveted 132-column mode, which I doubt was very readable on the heinously coarse-pitched 5153, and maybe a 27- or 28-line mode, or both). And then only when it wasn't about as cheap - and better value for money - to just get a better class of adapter and matching monitor (or multisync) straight off. What would you prefer: 700-ish by 220-ish on a CGA monitor, or 640x350 with a wider colour selection on an EGA? (Or the odd, but at one time common, 752x410 on a multisync, probably using the same card that offered both of the other two modes.)
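(To illustrate the sort of reprogramming meant above - a hedged sketch only, assuming a Borland-style DOS C compiler. The 0x3D4/0x3D5 ports are the CGA's 6845 index/data pair; the register values are the stock BIOS 80x25 text-mode table with Vertical Displayed bumped from 25 to 27 rows (216 scan lines), and real hardware may also want the sync position nudged to center the taller image:)

    #include <dos.h>   /* outportb() - Borland-style DOS compiler assumed */

    #define CRTC_INDEX 0x3D4   /* 6845 index port on CGA */
    #define CRTC_DATA  0x3D5   /* 6845 data port */

    static void crtc_write(unsigned char reg, unsigned char val) {
        outportb(CRTC_INDEX, reg);
        outportb(CRTC_DATA, val);
    }

    int main(void) {
        crtc_write(6, 27);   /* R6, Vertical Displayed: 27 character rows */
        crtc_write(7, 28);   /* R7, Vertical Sync Position: stock value;
                                lowering it re-centers the taller image  */
        return 0;
    }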

At the end of the day, it was a device built to do a fairly specific set of jobs as well as it could deliver them, no more, no less, and it did them pretty well: to display the terminal-like 40/80 x 25 text modes of the CGA, and the matching graphics modes, which happened to be the exact same pixel size due to a happy accident of what would fit into the largest amount of memory they could afford to cram onto the card, both financially and physically. (There is barely a square millimetre of bare PCB on the original CGA; it is rammed full of chips.) The card is pretty rudimentary outside of its overpopulation of low-density DRAMs - the CGA feature set is often decried as crap, but it's about the most sophisticated thing they could build into the available space, given the lack of resources to make a proper ASIC and the tight deadlines. The monitor just needs to show 200 scan lines, each 44.7 microseconds long (which is 320 pixels at double the colourburst rate, or 640 pixels at twice that), making them about as large as is reasonably possible, with a small strip of coloured border all round... and that's it.
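(Checking that timing claim - both pixel rates come out to the same active-line length:)

    #include <stdio.h>

    /* Active line: 320 px at 2x the NTSC colorburst rate, or 640 px
       at 4x that rate - either way, the same ~44.7 microseconds. */
    int main(void) {
        double burst = 3.579545e6;                       /* NTSC colorburst, Hz */
        printf("%.1f us\n", 320 / (2 * burst) * 1e6);    /* ~44.7 */
        printf("%.1f us\n", 640 / (4 * burst) * 1e6);    /* ~44.7 */
        return 0;
    }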

Wondering why it can't do more is a bit like wondering why a 1960s TV can't show 1080p hi-def or tune into more than a dozen or so channels. It's simply not the need it was built to address, and the hardware and signals to drive it to any greater capability didn't even exist yet, so it would have been fruitless. Outside of specialized graphics workstations, nobody else was making anything of particularly higher resolution or with more video memory at the time. The 5151 (the MDA's monochrome monitor) was the real oddity, with its non-broadcast-TV-compatible, more high-end-terminal-like 350-line, 50 Hz refresh and minimal-blanking scan setup.

