35

Before the age of LCDs, PC displays almost always targeted 4:3 CRT monitors. And indeed, VGA and most super-VGA modes had a 4:3 aspect ratio, resulting in square pixels - except the odd 5:4 1280×1024 mode. This mode even caused 17″/19″ LCD monitors to actually be 5:4, before widescreen 16:9 took over.

How did this odd resolution become sort-of standard? Edit: Why didn't they go with a corresponding 4:3 resolution instead, like 1280×960 or 1360×1020 (to fit in 4MB for 24-bit)?

To clarify, here are the resolutions as I remember them:

┌───────────────────┬──────────────────────┬──────────────┐
│                   │ Resolution           │ Aspect ratio │
├───────────────────┼──────────────────────┼──────────────┤
│ (pre-VGA madness) │ various              │ various      │
│ VGA               │ 640×480              │ 4:3          │
│ Super VGA         │ 800×600              │ 4:3          │
│ Super VGA         │ 1024×768             │ 4:3          │
│ Super VGA         │ 1152×864             │ 4:3          │
│ Super VGA         │ 1280×1024 ◀◀◀◀◀      │ 5:4 ◀◀◀◀◀    │
│ Super VGA         │ 1600×1200            │ 4:3          │
│ Widescreen        │ 1920×1080 and others │ 16:9         │
└───────────────────┴──────────────────────┴──────────────┘
4
  • 2
    For reference regarding SuperVGA modes, see Ralf Brown’s table of resolutions supported by INT 0x10 on various cards, and the lists of modes supported in VBE. Commented Jul 3, 2019 at 10:11
  • 2
    Actually, the weirdest thing is that we perceive 5:4 as weird :-D Why were they all sticking to 4:3 when there are so many other options? (Unlike TV, where all screens have to be 4:3 if the source material is 4:3.)
    – szulat
    Commented Jul 4, 2019 at 12:42
  • 1
    Super-VGA is 800x600. 1024x768 is XGA. 1280x1024 is actually Super-XGA.
    – mirabilos
    Commented Aug 21, 2019 at 18:43
  • @mirabilos - most hardware marketed under the name "SVGA" was capable of at least 1024x768 and usually higher resolutions. The programming interface for all of these was identical. Terms like "XGA" may have been used in some circles, but in my experience SVGA is what most people called the hardware used to provide these resolutions
    – occipita
    Commented Sep 12, 2020 at 22:13

5 Answers

6

The concept of 5:4 aspect ratio, and 1280x1024 graphics coordinates, is actually much older than you think.

It dates back at least to the BBC Micro, introduced at the end of 1981; it had graphics modes designed around the PAL TVs used in the UK, with 160x256, 320x256, and 640x256 modes, all sharing a standard 1280x1024 coordinate system for easy graphics programming. At an 8x8 character size, this allowed an 80x32 text display on affordable home hardware, better than many of the cheaper dumb terminals. The Acorn Archimedes, which succeeded the BBC Micro in the late 1980s, extended this capability to 640x512 with PAL TV timings, as well as supporting VGA/SVGA resolutions when connected to a PC-type monitor.
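
To illustrate the idea of that fixed logical coordinate space, here is a minimal sketch; the scale factors are simply derived from the resolutions quoted above, not taken from Acorn documentation, so treat it as a hypothetical illustration rather than how the OS actually did it:

    # Map a fixed 1280x1024 logical coordinate space onto the physical pixel
    # grids of the modes listed above. Scale factors are derived from the
    # resolutions quoted here, not from Acorn documentation.
    LOGICAL_W, LOGICAL_H = 1280, 1024

    PHYSICAL_MODES = {        # physical width x height in pixels
        "160x256": (160, 256),
        "320x256": (320, 256),
        "640x256": (640, 256),
    }

    def logical_to_physical(x, y, mode):
        """Scale a logical point down to the physical grid of a given mode."""
        pw, ph = PHYSICAL_MODES[mode]
        return (x * pw // LOGICAL_W, y * ph // LOGICAL_H)

    # The same logical point lands in the same place on screen in every mode:
    for mode in PHYSICAL_MODES:
        print(mode, logical_to_physical(640, 512, mode))
    # 160x256 (80, 128)   320x256 (160, 128)   640x256 (320, 128)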

These resolutions were very easy to implement on PAL, using a 16MHz master dot clock, since the time between horizontal sync pulses is exactly 64µs, and there are slightly more than 512 lines (divided between the two interlaced fields) in the display-safe area. This relatively high level of capability was used by the BBC to generate broadcast TV graphics during the early to mid 1980s.
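
A quick back-of-the-envelope check of that timing claim (a sketch only; the 640-pixel active width comes from the mode list above):

    # 16 MHz master dot clock against the fixed 64 microsecond PAL line period.
    dot_clock_hz = 16_000_000   # 16 MHz master dot clock
    line_period_s = 64e-6       # time between PAL horizontal sync pulses

    dots_per_line = dot_clock_hz * line_period_s
    print(dots_per_line)        # 1024.0 dot periods per scanline

    # 640 of those periods carry visible pixels; the rest cover horizontal
    # blanking and sync. With ~512 usable lines split across the two
    # interlaced fields, that gives the 640x512 figure mentioned above.
    print(f"{640 / dots_per_line:.1%} of each line is active picture")  # 62.5%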

By 1984, early SGI IRIS workstations supported high-resolution graphics, with - in particular - 1024 rows of pixels:

The IRIS 1400 comes standard with 1.5 MB of CPU memory, 8 bit-planes of 1024x1024 image memory, and…

It's not immediately clear whether all of these could be displayed simultaneously in practice at the time. More likely, a section of the framebuffer could be selected for display output, the size of that section depending on the output device's capabilities.

Apple introduced what was then considered a very high-resolution monochrome display in 1989, supporting a 1152x870 resolution (in a 4:3 aspect ratio), a size most likely chosen so a frame would just fit in a megabit of RAM. A special modification to the Acorn Archimedes series allowed it to support 1152x896 (close to a 5:4 aspect ratio) on a particular monitor, probably very similar to the one made by Apple; the Archimedes allocated display memory in system DRAM, so it didn't have a hard megabit limit as a Mac's graphics card did.

As the availability of fast and affordable memory became less of a restriction on graphics capabilities in the 1990s, it is notable that 1280x1024 with a 5:4 aspect ratio was specifically catered for by high-end monitor vendors. If three bytes per pixel are used to support 24-bit truecolour, moreover, this is a resolution that fits comfortably in a comparatively affordable 4MB of VRAM. CRTs could easily be built this way: the natural shape of a glass tube is circular, so the squarer the aspect ratio, the easier the CRT was to make. Nor did this stop the display from handling 4:3 resolutions cleanly; it just left a slightly different pattern of blank borders at the edges. Once 1280x1024 was established, LCD monitors for computer use (in contrast to those for televisions) were made to support it, and these are still available today.
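
The memory arithmetic behind both of those fits is easy to verify (a rough sketch, using binary megabytes):

    # 1152x870 monochrome against one megabit, and 1280x1024 truecolour
    # against 4 MB of VRAM.
    def frame_bits(width, height, bits_per_pixel):
        return width * height * bits_per_pixel

    mono_bits = frame_bits(1152, 870, 1)
    print(mono_bits, "bits, vs", 2**20, "bits in a megabit")   # 1002240 < 1048576

    truecolour_bytes = frame_bits(1280, 1024, 24) // 8
    print(truecolour_bytes / 2**20, "MiB, vs 4 MiB of VRAM")   # 3.75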

The slightly taller aspect ratio is useful in text modes, where programmers appreciate having more lines of code on screen more than they do having more columns, and also in desktop environments where menus and toolbars have a habit of consuming vertical space more often than horizontal. The present trend towards wider aspect ratios, by contrast, is driven by the movie and gaming industries which want to cater for human peripheral vision, and thus immersion in the scene, rather than for display of information.

6
  • Very interesting! However, were there really 5:4 CRTs (from "high-end monitor vendors")? I remember 1280×1024 used on "regular" 4:3 CRTs, with the pixels slightly squished so they'd fit.
    – Jonathan
    Commented Sep 7, 2020 at 14:34
  • And why didn't those vendors support higher-res 4:3 resolutions that'd fit in 4MB 24-bit, like 1280×960 or 1360×1020?
    – Jonathan
    Commented Sep 7, 2020 at 14:35
  • @Jonathan There was at least a 17" Sony model that was designed to accommodate a 5:4 aspect ratio. As I noted in my answer, it's not terribly difficult to make them that way, and you can actually use 4:3 CRTs by just leaving a bit of space on the sides (as the BBC Micro did on TVs). And indeed, CRT monitors can be used with any resolution that fits within their refresh and signal bandwidth specs, but 1280x1024 was chosen as the standard - probably because it's a nice round pair of numbers in binary (10100000000x10000000000). Computers like binary numbers.
    – Chromatix
    Commented Sep 7, 2020 at 14:49
  • @Jonathan: I think the 1280 was chosen because it is conveniently equal to 16 times 80, meaning that one can show eighty 16-dot wide characters per line. The 1024 was chosen because it would allow a program to use a 1024x1024 area for plotting (in cases where power of two sizes are convenient) and have a 256-pixel-wide status area to the side.
    – supercat
    Commented Sep 7, 2020 at 18:44
  • I just measured the visible area on my Trinitron 300sf CRT, which can do 1280x1024 @ 75Hz (that's 80 kHz horizontal!). It's 38.5cm x 29.0cm, which works out to an aspect ratio of 1.3275 -- almost exactly 4:3. So on this display, 1280x1024 would have very slightly non-square pixels.
    – kiwidrew
    Commented Sep 12, 2020 at 2:30
29

VGA's 640x480 mode was the first to offer square pixels, and it was the exception among the VGA modes available (320x200, 640x200, 640x350, and 720x400 for text). Square pixels weren't the standard back then.

Adding video modes in later (Super) VGA was kind of a marketing game to offer higher numbers to outpace the competition. First it was colour, like offering 640x480 in 256 colours; later it became as much about higher resolutions. At the time, there were many other resolutions seen as 'standard' beyond 800x600, like 1024x600, 1152x768 or 1280x800 - the latter was quite prominent, as it fits (almost) exactly into 1 MiB of RAM in 8-bit mode.

1280x1024 is again based on nice power-of-two values (like 1024x768), thus easy to handle and maximizing the use of RAM, as it exactly filled 1.25 MiB in 8-bit mode and 2.5 MiB in 16-bit mode (*1). Both sizes could be provided nicely as a series of five RAM chips on the card. In addition, it's worth considering that later single-chip VGA designs were usually fully programmable, thus able to offer next to any resolution (within their pixel clock limits, that is).
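
For reference, the sizes quoted here are easy to recompute (a quick sketch; MiB means binary megabytes):

    # Framebuffer sizes for some of the modes mentioned, at 8 and 16 bpp.
    MIB = 2**20

    for w, h in [(1024, 768), (1280, 800), (1280, 1024)]:
        for bpp in (8, 16):
            size = w * h * bpp // 8
            print(f"{w}x{h} @ {bpp} bpp: {size / MIB:.4g} MiB")

    # 1280x1024 comes out at exactly 1.25 MiB (8 bpp) and 2.5 MiB (16 bpp),
    # i.e. five 256 KiB or five 512 KiB parts; 1280x800 at 8 bpp sits just
    # under 1 MiB.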

Some years later, 1280x1024 got a revival when upcoming, (relatively) low-cost LCD manufacturing processes moved past the 1024x768 capability of 15" panels. After all, such LCDs don't react as flexibly as CRTs to varying pixels-per-line rates.


*1 - Mostly forgotten today, but 16-bit colour was a huge thing - for a few years :)

13
  • 2
    Do you know why 1280×1024 was preferred to 1280×960 (apart from occupying more of the available memory), or why the “standard” 1152×864 workstation resolution never took off on PCs? I’m also curious about 1280×800 — I’m not aware of a SuperVGA board which supports that (before they became fully programmable); where was it prominent? Commented Jul 3, 2019 at 9:46
  • 2
    No hard facts; I would go with the usual 'more is better' approach. Especially as a light distortion of ~6% horizontally is next to invisible to most users. 1152x864 is a great reminder. Not sure either. 1280x800: back in the mid 90s it could be ordered with stock SIEMENS clones (PCD-3*) for example.
    – Raffzahn
    Commented Jul 3, 2019 at 9:56
  • 4
    I remember 1280x800 as being prominent in laptops and possibly LCD monitors from the mid to late 2000s. Commented Jul 3, 2019 at 17:10
  • 1
    "Both sizes could be well added as a series of 5 RAM chips to the card." Did such video cards actually exist? I only remember video cards with even multiples of 1 MiB of memory. Commented Jul 3, 2019 at 18:55
  • 1
    @StephenKitt why? Because it provides more (almost a vertical inch at 72 dpi) "screen" space.
    – RonJohn
    Commented Jul 4, 2019 at 0:35
13

1280x1024@24-bit fits in 4 MiB. Why wouldn't you take the extra screen space?

Keep in mind that games didn't usually run in 1280x1024 at the time. Back before LCDs became the dominant screen technology, you didn't care about the "native resolution" of the display - you didn't get the ugly "one pixel is stretched over two physical pixels, while its neighbour only gets one". Even modern LCDs tend to look awful at their non-native resolutions (not helped by the use of sub-pixel rendering of fonts etc.), but the same wasn't true of CRTs.

So you had your workstation running 1280x1024, getting the most out of your 4 MiB graphics card. And when you wanted to play a game, it switched the resolution to something like 800x600 or 1024x768. The image was just as clear in both cases; you didn't get any weird aliasing or scaling artifacts.

This changed when LCDs came around; native resolution is a big deal on LCDs. Running an 800x600 game on a 1280x1024 LCD (of the time, though many quality issues remain to this day) will result in many artifacts. But here's the thing - LCDs weren't for games. No gamer would ever voluntarily use an LCD - they were designed for office use (and of course, portable computers - but that's a whole other can of worms). They had poor colours, poor response times, bad aliasing issues and couldn't adapt to other resolutions well. It made perfect sense to give the most screen space possible, which with the (then rather standard) 4 MiB graphics cards was 1280x1024@24-bit (while also giving a nice fallback to 16-bit and 8-bit for smaller VRAM).
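
To make the "nice fallback" point concrete, here is a hypothetical little helper - purely illustrative, not how any real driver worked - that picks the deepest colour mode whose 1280x1024 framebuffer still fits in a given amount of VRAM:

    # Pick the largest bits-per-pixel whose 1280x1024 framebuffer fits in VRAM.
    def deepest_fit(width, height, vram_bytes, depths=(24, 16, 8)):
        for bpp in depths:
            if width * height * bpp // 8 <= vram_bytes:
                return bpp
        return None   # not even 8 bpp fits

    MIB = 2**20
    for vram_mib in (1, 2, 4):
        print(vram_mib, "MiB card ->", deepest_fit(1280, 1024, vram_mib * MIB), "bpp")
    # 1 MiB -> None (1280x1024x8 already needs 1.25 MiB)
    # 2 MiB -> 8 bpp, 4 MiB -> 24 bpp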

It took many years for LCDs to become mainstream even among gamers (mostly because of their convenience - size, weight, cost etc.). By then, 1280x1024 was already the standard, and most importantly, games became largely aspect ratio agnostic anyway. The next big jump had to wait for people watching movies on their computers, which helped the move to 16:9 and 16:10 (for that handy extra bit of vertical space).

Of course, there are other possible options that have pretty much the same benefits (or take other trade-offs). In the end, it's just that one of those dominated the others. Following the leader is often a good idea, since it makes it easier to reach a bigger portion of the market.

8

Quoting Wikipedia, it seems there were at least two factors:

The availability of inexpensive LCD monitors has made the 5:4 aspect ratio resolution of 1280 × 1024 more popular for desktop usage during the first decade of the 21st century.

(from here)

The 1280 × 1024 resolution became popular because at 24 bit/px color depth it fit well into 4 megabytes of video RAM. At the time, memory was extremely expensive. Using 1280 × 1024 at 24-bit color depth allowed using 3.75 MB of video RAM, fitting nicely with VRAM chip sizes which were available at the time (4 MB): (1280 × 1024) px × 24 bit/px ÷ 8 bit/byte ÷ 2^20 byte/MB = 3.75 MB

(from here)

7
  • 6
    I believe 1280*1024 predated commonplace LCDs - my 1994 Cirrus Logic GD5428 supported it (ref)
    – Jonathan
    Commented Jul 3, 2019 at 8:55
  • 2
    Also, they could have done some 4:3 resolution to fit 4MB (about 1365*1024, obviously rounded to multiples of 8), like they did 1152*864 for almost exactly 1 megapixel.
    – Jonathan
    Commented Jul 3, 2019 at 8:59
    Well, I'm just trying to answer the question by providing some arguments that seem reasonable. It's true that 1280x1024 was around long ago, but it's also true that cheap monitors that supported it were common in the late 90s and early 2000s. Bought some of them myself. OTOH, the OP asks "why the resolution was sort of standard", not "why didn't they keep making standards with 4:3 aspect ratio".
    – DroidW
    Commented Jul 3, 2019 at 9:05
  • Well the OP (=me) did mean "why not 4:3" :) I edited the question to clarify.
    – Jonathan
    Commented Jul 3, 2019 at 10:31
  • 3
    @Jonathan The 1280x1024 5:4 aspect ratio display, with square pixels, only became a thing with 1280x1024 5:4 aspect ratio LCD panels. On 4:3 aspect ratio CRTs, 1280x1024 has non-square pixels. This was actually a problem for games, since 1280x1024 could be displayed on either a 5:4 or 4:3 display and there was no way to infer which was the case. They either had to assume an aspect ratio (and so have a distorted display on the other), have two 1280x1024 choices, or allow selecting the aspect ratio separately.
    – user722
    Commented Jul 3, 2019 at 16:47
2

As others have said, with limited video memory, that extra video mode gives you flexibility to choose 1280×1024@16bpp over 1600×1200@8bpp depending on your needs.

But it also allows you to choose an optimal refresh rate. If screen flicker bothers you and your monitor or video card is limited to a 72 kHz horizontal scan rate, you might choose a 1280×1024@70Hz vertical refresh over 1600×1200@60Hz. Trading pixels for a higher refresh rate can be just as good a compromise as trading pixels for more colors.
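
As a rough sketch of that trade-off: the required horizontal scan rate is roughly the vertical refresh times the total line count. The ~4% vertical blanking overhead used below is an assumed, illustrative figure; real timings vary per mode and monitor, and that is exactly what decides whether a given mode squeezes under a particular horizontal limit.

    # Approximate horizontal scan rate a mode demands: vertical refresh times
    # total line count (visible lines plus an assumed ~4% vertical blanking).
    def h_scan_khz(visible_lines, refresh_hz, blanking_overhead=0.04):
        total_lines = round(visible_lines * (1 + blanking_overhead))
        return total_lines * refresh_hz / 1000

    print(h_scan_khz(1024, 70))   # ~74.6 kHz for 1280x1024 at 70 Hz
    print(h_scan_khz(1200, 60))   # ~74.9 kHz for 1600x1200 at 60 Hz
    print(h_scan_khz(1024, 60))   # ~63.9 kHz for 1280x1024 at 60 Hz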
