
Before the age of LCDs, PC graphics almost always targeted 4:3 CRT monitors. And indeed, VGA and most super-VGA modes had a 4:3 aspect ratio, resulting in square pixels - except for the odd 5:4 1280×1024 mode. This mode even caused 17″/19″ LCD monitors to actually be 5:4, before widescreen 16:9 took over.

How did this odd resolution become sort-of standard? Edit: Why didn't they go with a corresponding 4:3 resolution instead, like 1280×960 or 1360×1020 (to fit in 4MB for 24-bit)?

To clarify, here are the resolutions as I remember them:

┌───────────────────┬──────────────────────┬──────────────┐
│                   │ Resolution           │ Aspect ratio │
├───────────────────┼──────────────────────┼──────────────┤
│ (pre-VGA madness) │ various              │ various      │
│ VGA               │ 640×480              │ 4:3          │
│ Super VGA         │ 800×600              │ 4:3          │
│ Super VGA         │ 1024×768             │ 4:3          │
│ Super VGA         │ 1152×864             │ 4:3          │
│ Super VGA         │ 1280×1024 ◀◀◀◀◀      │ 5:4 ◀◀◀◀◀    │
│ Super VGA         │ 1600×1200            │ 4:3          │
│ Widescreen        │ 1920×1080 and others │ 16:9         │
└───────────────────┴──────────────────────┴──────────────┘
user3840170
Jonathan

5 Answers


VGA's 640x480 mode was the first to offer square pixels, and it was the exception among the VGA modes available (320x200, 640x200, 640x350, and 720x400 for text). Square pixels weren't the standard back then.

Adding video modes in later (Super) VGA was something of a marketing game, offering higher numbers to outpace the competition. First it was colour, like offering 640x480 in 256 colours; later it became just as much about higher resolutions. At the time, there were many other resolutions seen as 'standard' beyond 800x600, like 1024x600, 1152x768 or 1280x800 - the latter was quite prominent, as it fits (almost) exactly into 1 MiB of RAM in 8-bit mode.

1280x1024 is again based on nice power-of-two values (like 1024x768), thus easy to handle and maximizing the use of RAM, as it exactly fills 1.25 MiB in 8-bit mode and 2.5 MiB in 16-bit mode (*1). Both sizes could be well added as a series of 5 RAM chips to the card. In addition, it's worth considering that later single-chip VGA designs were usually fully programmable, thus able to offer next to any resolution (within their pixel clock, that is).
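
To make the numbers concrete, here is a minimal sketch of the framebuffer arithmetic from the last two paragraphs. The 256 KiB and 512 KiB chip sizes are assumptions used to illustrate the "series of 5 RAM chips" remark; the answer doesn't name specific parts.

```python
# Framebuffer sizes for the modes discussed above; chip sizes are assumed.
MIB = 1024 * 1024

def framebuffer_bytes(width, height, bpp):
    """Bytes needed for a packed framebuffer."""
    return width * height * bpp // 8

# (width, height, bits per pixel, assumed RAM chip size in KiB)
modes = [
    (1280,  800,  8, 256),   # the "(almost) exactly 1 MiB" SVGA mode
    (1280, 1024,  8, 256),   # 1.25 MiB
    (1280, 1024, 16, 512),   # 2.5 MiB
]
for w, h, bpp, chip_kib in modes:
    size = framebuffer_bytes(w, h, bpp)
    chips = size / (chip_kib * 1024)
    print(f"{w}x{h} @ {bpp} bpp: {size / MIB:.2f} MiB "
          f"= {chips:.2f} x {chip_kib} KiB chips")
```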

Some years later, 1280x1024 got a revival when the upcoming (relatively) low-cost LCD manufacturing processes passed the 1024x768 capability of 15″ panels. After all, LCDs don't react as flexibly as CRTs to varying pixels-per-line rates.


*1 - Mostly forgotten today, but 16 bit colour was a huge thing - for a few years :)

Raffzahn
  • Do you know why 1280×1024 was preferred to 1280×960 (apart from occupying more of the available memory), or why the “standard” 1152×864 workstation resolution never took off on PCs? I’m also curious about 1280×800 — I’m not aware of a SuperVGA board which supports that (before they became fully programmable); where was it prominent? – Stephen Kitt Jul 03 '19 at 09:46
  • No hard facts, I would go with the usual 'more is better' approach. Especially as a light distortion of ~6% horizontal is next to invisible to most users. 1152x864 is a great reminder. Not sure either. 1280x800: Back in the mid 90s it could be ordered with stock SIEMENS clones (PCD-3*) for example. – Raffzahn Jul 03 '19 at 09:56
  • I remember 1280x800 as being prominent in laptops and possibly LCD monitors from the mid to late 2000s. – Peter Green Jul 03 '19 at 17:10
  • "Both sizes could be well added as a series of 5 RAM chips to the card." Did such video cards actually exist? I only remember video cards with even multiples of 1 MiB of memory. – snips-n-snails Jul 03 '19 at 18:55
  • @StephenKitt why? Because it provides more (almost a vertical inch at 72 dpi) "screen" space. – RonJohn Jul 04 '19 at 00:35
  • I had used 1280×1024@24bpp (= 3,932,160 bytes) for many years on a video card with a 4 MiB frame buffer – Nayuki Jul 04 '19 at 03:19
  • @RonJohn not on CRTs; graphics cards started supporting 1280×1024, and it was “canonised” in the VBE, long before 5:4 LCD screens were available. It only provides more pixels in the same area on CRTs, and results in oval “circles” etc. – Stephen Kitt Jul 04 '19 at 05:29
  • It was too long ago, so I don't remember exactly, but what about 320x240 ? Didn't that come before 640x480 ? At least on the Amiga 500 that was the standard resolution and I think it was there before 640x480 (but I might remember incorrectly) – ChatterOne Jul 04 '19 at 09:42
  • @ChatterOne Well, the Sun-1 had 640x480x8 already in 1982 ... but this isn't about resolutions of various other computers but the PC's (Super)VGA. The PC started out with 160/320/640x200 for CGA, followed by EGA's 640x350, until VGA added a 640x480 (by 4!) mode – Raffzahn Jul 04 '19 at 10:19
  • @ChatterOne VGA can be programmed to output 320×240 (and many other resolutions) but it’s not one of the VGA BIOS-provided resolutions. – Stephen Kitt Jul 04 '19 at 10:42
  • @StephenKitt that's why I put screen in quotes. After all, 1024 is greater than 960. – RonJohn Jul 04 '19 at 14:51
  • @RonJohn OK, I thought the reference to dpi meant you were talking about physical screen size. More pixels are useful indeed. What I’m curious about here is the context in which the resolution was introduced — it seems to me that the main markets for high-res in the 80s and early 90s would have cared quite a bit about any distortion (and it might have been dealt with in software, I don’t know). – Stephen Kitt Jul 04 '19 at 18:00
  • @StephenKitt Also don't discount marketing. I had a professor teaching graphics who used to complain about the University not having proper 'megapixel' displays. Having over 1000x1000 would have been important – GrantB Sep 05 '20 at 21:55

1280x1024@24-bit fits in 4 MiB. Why wouldn't you take extra screen space?

Keep in mind that games didn't usually run in 1280x1024 at the time. Back before LCDs became the dominant screen technology, you didn't care about the "native resolution" of the display - you didn't get the ugly effect where one logical pixel is stretched over two physical pixels while its neighbour gets only one. Even modern LCDs tend to look awful at their non-native resolutions (not helped by the use of sub-pixel rendering of fonts etc.), but the same wasn't true of CRTs.
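
To illustrate the stretching problem, here is a small sketch (the source resolutions are just examples) of the non-integer scale factors an LCD has to apply when fed a lower resolution:

```python
# Scale factors when an LCD stretches a source resolution onto its native
# 1280x1024 grid. Non-integer factors mean some logical pixels cover two
# physical pixels while their neighbours cover only one.
def scale_factors(src, native=(1280, 1024)):
    (sw, sh), (nw, nh) = src, native
    return nw / sw, nh / sh

for src in [(640, 480), (800, 600), (1024, 768)]:
    fx, fy = scale_factors(src)
    print(f"{src[0]}x{src[1]} -> 1280x1024: "
          f"x{fx:.3f} horizontal, x{fy:.3f} vertical")

# 640x480  -> x2.000 horizontal, x2.133 vertical
# 800x600  -> x1.600 horizontal, x1.707 vertical
# 1024x768 -> x1.250 horizontal, x1.333 vertical
```

Note that the horizontal and vertical factors also differ, so a 4:3 image is distorted as well unless the panel adds borders.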

So you had your workstation running 1280x1024, getting the most out of your 4 MiB graphics card. And when you wanted to play a game, it switched the resolution to something like 800x600 or 1024x768. The image was just as sharp in both cases; you didn't get any weird aliasing or scaling artifacts.

This changed when LCDs came around; native resolution is a big deal on LCDs. Running an 800x600 game on a 1280x1024 LCD (of the time, though many quality issues remain to this day) will result in many artifacts. But here's the thing - LCDs weren't for games. No gamer would voluntarily use an LCD - they were designed for office use (and of course portable computers, but that's a whole other can of worms). They had poor colours, poor response times, bad aliasing issues and couldn't adapt to other resolutions well. It made perfect sense to give the most screen space possible, which with the (then rather standard) 4 MiB graphics cards was 1280x1024@24-bit (while also giving a nice fallback to 16-bit and 8-bit for cards with less VRAM).

It took many years for LCDs to become mainstream even among gamers (mostly because of their convenience - size, weight, cost etc.). By then, 1280x1024 was already the standard, and most importantly, games became largely aspect ratio agnostic anyway. The next big jump had to wait for people watching movies on their computers, which helped the move to 16:9 and 16:10 (for that handy extra bit of vertical space).

Of course, there are other possible options that have pretty much the same benefits (or take other trade-offs). In the end, it's just that one of those dominated the others. Following the leader is often a good idea, since it makes it easier to reach a bigger portion of the market.

Luaan

Quoting Wikipedia, there seem to have been at least two factors:

The availability of inexpensive LCD monitors has made the 5:4 aspect ratio resolution of 1280 × 1024 more popular for desktop usage during the first decade of the 21st century.

(from here)

The 1280 × 1024 resolution became popular because at 24 bit/px color depth it fit well into 4 megabytes of video RAM. At the time, memory was extremely expensive. Using 1280 × 1024 at 24-bit color depth allowed using 3.75 MB of video RAM, fitting nicely with VRAM chip sizes which were available at the time (4 MB): (1280 × 1024) px × 24 bit/px ÷ 8 bit/byte ÷ 2^20 byte/MB = 3.75 MB

(from here)
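
As a quick cross-check against the 4:3 alternatives mentioned in the question, here is a small sketch (the resolution list is purely illustrative) of what fits into 4 MiB at 24 bit/px:

```python
# Which resolutions fit a 4 MiB frame buffer at 24 bit/px (3 bytes/pixel)?
FOUR_MIB = 4 * 1024 * 1024

for width, height in [(1280, 960), (1280, 1024), (1360, 1020), (1600, 1200)]:
    size = width * height * 3
    verdict = "fits" if size <= FOUR_MIB else "does not fit"
    print(f"{width}x{height} @ 24bpp: {size:,} bytes - {verdict} in 4 MiB")

# 1280x960   3,686,400 bytes - fits
# 1280x1024  3,932,160 bytes - fits
# 1360x1020  4,161,600 bytes - fits, with about 32 KiB to spare
# 1600x1200  5,760,000 bytes - does not fit
```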

DroidW
  • I believe 1280*1024 predated commonplace LCDs - my 1994 Cirrus Logic GD5428 supported it (ref) – Jonathan Jul 03 '19 at 08:55
  • Also, they could do some 4:3 resolution to fit 4MB (about 1365×1024, obviously rounded to multiples of 8), like they did 1152×864 for almost exactly 1 megapixel. – Jonathan Jul 03 '19 at 08:59
  • Well, I'm just trying to answer the question by providing some arguments that seem reasonable. It's true that the 1280x1024 was around long ago, but it's also true that cheap monitors that supported that were common in late 90's and early 2000's. Bought some of them myself. OTOH, the OP asks "why the resolution was sort of standard" not "why didn't they keep making standards with 4:3 aspect ratio". – DroidW Jul 03 '19 at 09:05
  • Well the OP (=me) did mean "why not 4:3" :) I edited the question to clarify. – Jonathan Jul 03 '19 at 10:31
  • So sorry for being unable to properly decode the intended meaning of the question. – DroidW Jul 03 '19 at 10:48
  • @Jonathan The 1280x1024 5:4 aspect ratio display, with square pixels, only became a thing with 1280x1024 5:4 aspect ratio LCD panels. On 4:3 aspect ratio CRTs, 1280x1024 has non-square pixels. This was actually a problem for games, since 1280x1024 could be displayed on either a 5:4 or 4:3 display and there was no way to infer which was the case. They either had to assume an aspect ratio (and so have a distorted display on the other), have two 1280x1024 choices, or allow selecting the aspect ratio separately. –  Jul 03 '19 at 16:47
  • fitting in memory is important. That's why 1360x768 exists since at 8bpp it fits into 1MB of memory whereas 1366x768 requires 1024.5KB – phuclv Jan 29 '20 at 16:29

The concept of 5:4 aspect ratio, and 1280x1024 graphics coordinates, is actually much older than you think.

It dates back at least to the BBC Micro introduced at the end of 1981; it had graphics modes designed around the PAL TVs used in the UK, with 160x256, 320x256, and 640x256 modes, all with a standard coordinate system of 1280x1024 for easy graphics programming. At 8x8 character size, this allowed an 80x32 text display on affordable hardware at home, better than many of the cheaper dumb terminals. The Acorn Archimedes, which succeeded the BBC Micro in the late 1980s, extended this capability to 640x512 with PAL TV timings, as well as supporting VGA/SVGA resolutions when connected to a PC-type monitor.

These resolutions were very easy to implement on PAL, using a 16MHz master dot clock, since the time between horizontal sync pulses is exactly 64µs, and there are slightly more than 512 lines (divided between the two interlaced fields) in the display-safe area. This relatively high level of capability was used by the BBC to generate broadcast TV graphics during the early to mid 1980s.
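
Here is a minimal sketch of the arithmetic behind those figures; it is derived purely from the numbers quoted above, not from any datasheet.

```python
# PAL line timing with the BBC Micro's 16 MHz master dot clock, and how
# the shared 1280x1024 logical coordinate space maps onto the physical modes.
LINE_PERIOD_US = 64    # time between PAL horizontal sync pulses
DOT_CLOCK_MHZ = 16     # master dot clock

dots_per_line = LINE_PERIOD_US * DOT_CLOCK_MHZ
print(f"{dots_per_line} dot clocks per 64 µs scan line")  # 1024; 640 of them
                                                          # are visible in the
                                                          # 640x256 mode

for phys_w, phys_h in [(160, 256), (320, 256), (640, 256)]:
    unit_w, unit_h = 1280 // phys_w, 1024 // phys_h
    print(f"{phys_w}x{phys_h}: each physical pixel spans "
          f"{unit_w}x{unit_h} logical units")
```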

By 1984, early SGI IRIS workstations supported high-resolution graphics, with - in particular - 1024 rows of pixels:

The IRIS 1400 comes standard with 1.5 MB of CPU memory, 8 bit-planes of 1024x1024 image memory, and…

It's not immediately clear whether all of these could be displayed simultaneously in practice at the time. More likely, a section of the framebuffer could be selected for display output, the size of that section depending on the output device's capabilities.

Apple introduced what was then considered a very high-resolution monochrome display in 1989, supporting just 1152x870 resolution (in a 4:3 aspect ratio), a size most likely designed to just fit in a megabit of RAM. A special modification to the Acorn Archimedes series allowed it to support 1152x896 (close to 5:4 aspect ratio) on a particular monitor, probably very similar to the one made by Apple; the Archimedes allocated display memory in system DRAM, so it didn't have a hard megabit limit as a Mac's graphics card did.
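
A quick check of the "just fits in a megabit" observation, treating both displays as 1 bit per pixel monochrome:

```python
# Monochrome framebuffer sizes versus one megabit of display RAM.
MEGABIT = 1024 * 1024   # bits

for width, height in [(1152, 870), (1152, 896)]:
    bits = width * height            # 1 bit per pixel
    print(f"{width}x{height}: {bits:,} bits "
          f"({bits / MEGABIT:.1%} of a megabit)")

# 1152x870: 1,002,240 bits (95.6% of a megabit)
# 1152x896: 1,032,192 bits (98.4% of a megabit)
```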

As the availability of fast and affordable memory became less of a restriction on graphics capabilities in the 1990s, it is notable that 1280x1024 with a 5:4 aspect ratio was specifically catered for by high-end monitor vendors. If three bytes per pixel are used to support 24-bit truecolour, moreover, this is a resolution that fits comfortably in a comparatively affordable 4MB of VRAM. CRTs could easily be built this way, as the natural shape of a glass tube is circular, thus the squarer the aspect ratio the easier the CRT was to make. This also did not restrict the display from handling 4:3 aspect ratios cleanly, just leaving a slightly different pattern of blank borders at the edges. Once 1280x1024 was established, LCD monitors for computer use were made to support it (in contrast to those for televisions) and these are still available today.

The slightly taller aspect ratio is useful in text modes, where programmers appreciate having more lines of code on screen more than they do having more columns, and also in desktop environments where menus and toolbars have a habit of consuming vertical space more often than horizontal. The present trend towards wider aspect ratios, by contrast, is driven by the movie and gaming industries which want to cater for human peripheral vision, and thus immersion in the scene, rather than for display of information.

Chromatix
  • Very interesting! However, were there really 5:4 CRTs (from "high-end monitor vendors")? I remember 1280×1024 used on "regular" 4:3 CRTs, with the pixels slightly squished so they'd fit. – Jonathan Sep 07 '20 at 14:34
  • And why didn't those vendors support higher-res 4:3 resolutions that'd fit in 4MB at 24-bit, like 1280×960 or 1360×1020? – Jonathan Sep 07 '20 at 14:35
  • @Jonathan There was at least a 17" Sony model that was designed to accommodate a 5:4 aspect ratio. As I noted in my answer, it's not terribly difficult to make them that way, and you can actually use 4:3 CRTs by just leaving a bit of space on the sides (as the BBC Micro did on TVs). And indeed, CRT monitors can be used with any resolution that fits within their refresh and signal bandwidth specs, but 1280x1024 was chosen as the standard - probably because it's a nice round pair of numbers in binary (10100000000x10000000000). Computers like binary numbers. – Chromatix Sep 07 '20 at 14:49
  • @Jonathan: I think the 1280 was chosen because it is conveniently equal to 16 times 80, meaning that one can show eighty 16-dot wide characters per line. The 1024 was chosen because it would allow a program to use a 1024x1024 area for plotting (in cases where power of two sizes are convenient) and have a 256-pixel-wide status area to the side. – supercat Sep 07 '20 at 18:44
  • I just measured the visible area on my Trinitron 300sf CRT, which can do 1280x1024 @ 75Hz (that's 80khz horizontal!). It's 38.5cm x 29.0cm which works out to an aspect ratio of 1.3275 -- almost exactly a 4:3 aspect. So on this display, 1280x1024 would have very slightly non-square pixels. – kiwidrew Sep 12 '20 at 02:30
  • Another odd resolution is 1152x900, as used in most Sun workstations of the late 80s/early 90s vintage (eg the SPARCstation 2s of my university all used it, presumably because the university was too cheap to shell out for more than 1MB of RAM for the video cards, as AIUI the hardware installed could handle higher resolutions if adequate memory was installed). – occipita Sep 12 '20 at 22:27

As others have said, with limited video memory, that extra video mode gives you flexibility to choose 1280×1024@16bpp over 1600×1200@8bpp depending on your needs.

But it also allows you to choose an optimal refresh rate. If screen flicker bothers you and your monitor or video card is limited to a 72 kHz horizontal scan rate, you might choose 1280×1024@70Hz vertical refresh over 1600×1200@60Hz. Trading pixels for higher refresh rates can be just as good a compromise as trading pixels for more colors.
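
A rough sketch of that trade-off: both modes demand roughly the same horizontal scan rate, so the lower resolution buys you the faster vertical refresh. (Blanking overhead is ignored here, so real modelines run somewhat higher than these lower bounds.)

```python
# Lower bound on the horizontal scan rate needed for a given number of
# active lines and vertical refresh; vertical blanking, not counted here,
# adds a few percent on top.
def min_h_scan_khz(active_lines, refresh_hz):
    return active_lines * refresh_hz / 1000.0

for lines, hz in [(1024, 70), (1200, 60)]:
    print(f"{lines} active lines @ {hz} Hz vertical "
          f"-> at least {min_h_scan_khz(lines, hz):.1f} kHz horizontal")

# 1024 lines @ 70 Hz -> at least 71.7 kHz
# 1200 lines @ 60 Hz -> at least 72.0 kHz
```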

snips-n-snails