
When I was a kid, the "real" computers in movies looked so cool with that 80-column monochrome green text.

My first computer was a VIC-20 and it always felt very "toy-like" to me because of its 22 columns of fat text. Later my C64 was a better 40 columns, but still not the cool 80-column text I'd see on "real" computers in movies and TV.

I recently started playing with the VICE Commodore emulator and I discovered that the Commodore PET had a cool 80-column screen! I was very surprised to see that.

Why did the VIC-20 and C64 only have 22 & 40 columns when the earlier PET had 80 columns?

user3840170
AvaTaylor
  • The price of memory? – dave Mar 30 '21 at 00:37
  • Just a comment, as I don't have time to properly research a full answer: the VIC-20 was "very low end" in certain respects, and one way to seriously cut costs was for both the RAM and the circuitry not to support 40 (never mind 80) columns; it also provided a differentiation from the C64. As for the C64's 40 columns, that was actually common on a lot of low-end computers designed for display on ordinary TVs via RF modulators – 80 columns generally required real "computer monitors". – manassehkatz-Moving 2 Codidact Mar 30 '21 at 00:45
  • VIC-20 has 22 columns. – Tim Locke Mar 30 '21 at 01:00
  • There is frustratingly little information online about the 80-column PETs, which were an improvement launched in 1980 (and, pedantically, therefore at the same time as the VIC-20, not before). But, at a guess: because they don't need to fetch pixels from RAM, being pure text-mode machines, they have a lot more bandwidth for fetching character codes, and possibly even carried over the earlier PETs' limitation of requiring the programmer to manually avoid intersecting with video fetch. The VIC-20 and C64 are self-regulating, at the cost of halving theoretical video bandwidth. – Tommy Mar 30 '21 at 02:41
  • Does anyone have information about the price of an 80-column PET? I suspect it was higher than the introduction price of the VIC-20. – UncleBod Mar 30 '21 at 06:41
  • Short answer: TV. – Patrick Schlüter Mar 30 '21 at 11:28
  • If you had the 8K expansion module for the VIC-20, there was a program you could load to convert the screen to 40 characters. – Bjorn Mar 30 '21 at 16:00
  • It is only the 8032s that had 80 columns. The original PETs only had 40 columns. When they were launched, 8032s cost about £1000 in the UK. The 4032s cost less, but I suspect this was just a marketing ploy. – cup Mar 31 '21 at 07:51
  • @Bjorn: If you wanted a cassette-based e-book reader, even an unexpanded VIC-20 would have just enough RAM to show a 40x32 display. – supercat Apr 01 '21 at 07:45
  • @supercat I was in college at the time, using the VIC-20 as a terminal to the mainframe over a 110 baud modem. Programming COBOL on a 22-character-wide screen was a challenge, to say the least. – Bjorn Apr 01 '21 at 12:23
  • @Bjorn: I've used the VIC-20 as a terminal also, though it was for VAX BASIC. Was the 110 baud a limit of the host to which you were connecting? I would think 300 would have been common by the time the VIC-20 was available. – supercat Apr 01 '21 at 14:10
  • @supercat The Commodore modem for the VIC-20 was max 110 baud. I was connecting to a DECsystem-10. Its modem bank was capable of 300 baud at the time (early 80's). – Bjorn Apr 01 '21 at 14:16
  • @Bjorn: I had one back in the day (and probably still have it), and routinely used it at 300 baud. According to Wiki, the 300-baud Bell 103 modem was released in 1962. Given that even 110-baud teletypes like the ASR-33 would have no trouble communicating with each other over a Bell 103 modem, I can't see any reason for using anything slower than a Bell 103 modem in a product released almost 20 years later. – supercat Apr 01 '21 at 14:43

8 Answers

62

One reason was likely that the VIC-20 and C64 did not have their own displays but were designed to be connected to a television set. The interface between the computer and the television could not carry 80-column text legibly (it would have been almost unreadable). The PET, however, had its own integrated display, so it did not share this limitation, and Commodore was able to offer an 80-column option.
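The bandwidth argument can be sanity-checked with rough, assumed NTSC figures (typical textbook values, not taken from this answer):

```python
# Rough sanity check with assumed, typical NTSC figures: how many pixels
# can a composite/RF television signal actually resolve per scan line?

ACTIVE_LINE_US = 52.6  # visible portion of an NTSC scan line, in microseconds
LUMA_BW_MHZ = 4.2      # nominal NTSC luminance bandwidth (broadcast channel limit)

# Nyquist: one cycle of bandwidth can represent two alternating pixels.
resolvable = 2 * LUMA_BW_MHZ * ACTIVE_LINE_US

for cols in (40, 80):
    needed = cols * 8  # 8-pixel-wide character cells
    verdict = "fits" if needed <= resolvable else "exceeds the signal"
    print(f"{cols} columns need {needed} px; TV resolves ~{resolvable:.0f} px -> {verdict}")
```

With these figures, 40 columns (320 pixels) fit comfortably while 80 columns (640 pixels) exceed what the channel can carry, which matches the "almost unreadable" description above.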

Greg Hewgill
  • BBC-B MODE 0 and MODE 3 were 80 characters on a TV. (It was a bit hard to read, but not really "unreadable".) – James K Mar 30 '21 at 09:43
  • I think this answer needs more detail: first, TVs had poor resolution – see @Patrick Schlüter's answer – and at this time home computers were niche and VCRs barely existed, so domestic TVs wouldn't have any 'Video IN' connection; the display signal would also have to go through an RF modulator and the TV's tuner, the latter not being optimised for text. – Lou Knee Mar 30 '21 at 14:51
  • @LouKnee RF modulators do not add much distortion (as could be seen by watching TV :). The main problem is that the composite video signal (where colour and brightness are on the same wire, whether RF-modulated or not) places a hard limit on the horizontal resolution. The PET display was likely a green screen (i.e. monochrome), with no need to conform to the broadcast TV standards such as PAL or NTSC that the VIC-20 and C64 did. – Reversed Engineer Mar 30 '21 at 15:36
  • @ReversedEngineer: RF modulators can be designed so as to generate clean broadcast-quality video signals, but RF modulators can be made much more cheaply if dirty signals are acceptable. – supercat Mar 30 '21 at 17:14
  • @supercat Few real television receivers could resolve the full quality of a clean RF signal. If you bypassed all the RF stuff, you could adapt a monochrome set to display 80-character lines. Been there, done that, and it was OK, but interlace made it a bit of a dizzy experience, and the ringing of the notch filter intended to reject the NTSC color subcarrier was noticeable. – John Doty Mar 30 '21 at 18:30
  • @JohnDoty: Many television receivers could resolve the difference between e.g. taking the video output from a modded Atari 2600 and passing it through a good RF modulator, versus putting it through the minimalist RF modulator built into the machine. – supercat Mar 30 '21 at 20:23
  • Yes, as long as it's a monochrome set, or you can remove the colour subcarrier before RF-modulating it. Then the limits in horizontal resolution caused by the colour encoding and decoding do not apply. I also did that for my ZX80 and ZX81 (although resolution was so low it wasn't a problem :) – Reversed Engineer Mar 31 '21 at 10:47
  • You might note that US NTSC TVs were lower resolution (525 lines) than European PAL TVs (625 lines). – mikado Mar 31 '21 at 13:36
  • @JamesK "It was a bit hard to read" – for large values of 'a bit' :-) It probably depended on your TV, of course, but I think most Beebs were connected to relatively small screens, where MODE 0/3 could be deciphered but IIRC wasn't really suitable for sustained use. – gidds Apr 01 '21 at 00:30
  • The goal was to produce adequately readable text on every computer sold. With control over the video circuitry and CRT one can use standard TV technology to display 80 columns fine. Most later quality TVs would too. But no guarantee. So it must support the lowest common denominator. A similar thing applies to the number of lines: many early consoles and home computers have about 200 visible scanlines on NTSC. Many quality TVs will display the full 240 just fine, but you never knew if someone wanted to use a badly adjusted 1973 portable. Related: https://en.wikipedia.org/wiki/Title_safe – RETRAC Apr 02 '21 at 04:46
  • @RETRAC Exactly... the VIC-20 et al. were mass-market products meant to be plug-and-play by beginners, often the kids in the family. This was 1980: a "badly adjusted 1973 portable" TV is a common display option, and a clapped-out 1960s hand-me-down is not unlikely... – Lou Knee Apr 09 '21 at 13:54
  • @ReversedEngineer: "RF modulators do not add much distortion (as could be seen by watching TV :)" – I don't recall any broadcast TV programmes featuring 80-column text! The football results were what, 32 cols maybe? – Lou Knee Apr 09 '21 at 14:08
  • @LouKnee Even bypassing the broadcast TV transmitter (or the RF modulator doing the same thing here) and the RF tuner in your TV, it would be very difficult to get 80-column text on a TV because of the limitations of the low-frequency composite video signal (CVBS) format. If you plugged a local computer straight into a colour monitor using the (these days) yellow "video in" socket (no transmitter/modulator at all), displaying 80 characters would still be very difficult. What I mean is that the radio-frequency (RF) part of the chain is not the problem here. – Reversed Engineer Apr 11 '21 at 13:57
  • The Beeb was initially primarily a schools machine and was often connected via RGB to a good monitor which had no problem with 80 cols. – Alan B Jan 20 '22 at 17:18
31

Use of a TV as the monitor is the reason for these low resolutions.

The issue is that the color resolution of a TV is very low. While a B&W TV could resolve pixels small enough for roughly 400 to 600 per line, color resolution was much, much lower – barely around 200 pixels. Some computers exploited this, such as the Apple II and CGA over composite. It also shows the limits of that approach: the Apple II could only display 140 pixel columns in 6 colours (16 colours in lo-res and double hi-res), and composite CGA showed 160 pixels in 16 colours. So, if a computer wanted to display text without colour interference, it could not afford more than that number of pixels per line.
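The luma/chroma gap described above can be illustrated with assumed, typical NTSC bandwidth figures (the exact values vary slightly by source):

```python
# Illustration with assumed NTSC figures: brightness (luma) and colour
# (chroma) detail are carried in very different bandwidths on composite video.

ACTIVE_LINE_US = 52.6   # visible NTSC line time, microseconds
LUMA_BW_MHZ = 4.2       # luminance bandwidth
CHROMA_BW_MHZ = 1.3     # I-channel chroma bandwidth (Q is narrower still, ~0.4 MHz)

# Nyquist: one cycle of bandwidth can represent two alternating pixels.
luma_px = 2 * LUMA_BW_MHZ * ACTIVE_LINE_US     # distinct light/dark transitions
chroma_px = 2 * CHROMA_BW_MHZ * ACTIVE_LINE_US # distinct colour transitions

print(f"luma:   ~{luma_px:.0f} px/line")    # lands in the ~400-600 range quoted above
print(f"chroma: ~{chroma_px:.0f} px/line")  # close to the Apple II's 140 colour columns
```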

Further reading on color artifacts: Wikipedia article, and The 8bit Guy's video on CGA on composite output.

Patrick Schlüter
17

Space on a chip die was very limited - you couldn't just put an unlimited amount of functionality into one chip.

If you have a closer look at video chips used in 8 bit computers, they either feature an 80 column mode (PET, Amstrad CPC, later Apple IIs, C128 VDC) or a lot of game related functionality like sprites, hardware scrolling, raster interrupts etc. (C64, Atari 8 bit). There's not a single one I know of that does both.

Additionally, Commodore's 8-bit computers run their CPU and the video chip at about 1 MHz. That is not fast enough to fetch 80 characters per line, so to get an 80-column display on the PET, they had to use tricks (see here for more details) – there was probably no space left on the chip to support that.
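The "1 MHz is not fast enough" point is easy to see with a rough cycle budget (assumed round numbers, not exact chip timings):

```python
# Rough cycle budget (assumed round numbers): a ~1 MHz shared bus gives the
# video chip about one memory access per microsecond, and the visible part
# of a scan line lasts roughly 40 microseconds.

VISIBLE_LINE_US = 40   # approximate visible portion of a scan line
BUS_MHZ = 1            # CPU and video chip share a ~1 MHz bus

fetches = VISIBLE_LINE_US * BUS_MHZ   # character fetches per visible line
print(f"~{fetches} fetches per line -> about 40 columns without tricks")

# 80 columns need two fetches per microsecond: either a 2 MHz bus, or
# latching two bytes per access, which is the PET trick mentioned above.
```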

And the C64 is probably the only Commodore computer where somebody at Commodore did actual market research and put down a list of targets they wanted to reach with their machine. Since the machine was targeting the consumer market, not the business market, and since it was supposed to be a more powerful follow-up to the VIC-20 (which had only 22 columns), I don't think anybody ever really worried about 80 columns.

Venner
  • How much die space do you think it would really take to double the horizontal resolution? The VIC-II has the two bytes already there (one byte for an attribute, one byte for 8 pixels). They could have maybe clocked the shift register at twice the speed, and then half way through the charcell, loaded the shift register with the other byte. A bit weird from the programmer's point of view, but it would effectively give an 80-column mode. – Omar and Lorraine Mar 30 '21 at 06:29
  • A note: the Amstrad CPC supported both 80 columns and hardware scrolling. I don't know if it utilized two chips for this, like the C128 did. – Krackout Mar 30 '21 at 07:31
  • The Amstrad CPC combines an off-the-shelf chip (Motorola 6845, IIRC) for text display and a custom gate array that handles pixel graphics. It does have some sort of hardware scrolling, but that is so limited that most games implemented scrolling in software. The CPC does not have hardware sprites. – Venner Mar 30 '21 at 12:37
  • @OmarL: Increasing the horizontal resolution of the VIC-20 to 40 columns while keeping reprogrammable character shapes and without effectively halving the CPU speed would have required at minimum adding 40x12 bits of storage within the VIC chip to hold character and attribute data (as was done in the VIC-II). That's a fair bit of die space. It might have been possible to move a couple more of the VIC-20's SRAM chips onto their own data bus (as was done with the color memory) and arrange to have data from them get fed to the VIC-II chip, but that would mean adding a minimum of two more chips. – supercat Mar 30 '21 at 15:11
  • @OmarL: Adding an 80-column mode on the VIC-II while keeping the existing memory-interface architecture would have required doubling the text-line buffer from 480 bits to 960, and even with the extra buffering, such a mode would only be usable if there were a blank scan line for every text row, and would gobble enough memory bandwidth to cut CPU performance in half. Memory speeds might have allowed the VIC-II to perform double fetches from consecutive bytes in each of its time slots, but even if the VIC-II took over every available cycle, leaving nothing for the CPU... – supercat Mar 30 '21 at 15:36
  • ...that would only get up to 177 per scan line. Practical 80-column systems need to either use a pair of memory banks, fetch character-shape data from somewhere other than the CPU bus, or put display memory on its own bus separate from the CPU. None of those choices would have been particularly cheap in a design where the same chip generates video addresses and handles video signal generation. If the roles of address generation and video output had been split, interposing a character ROM between them would have been much more practical. – supercat Mar 30 '21 at 15:40
  • @OmarL: For every scan line of every character position, the VIC-II needs 20 bits of data, twelve of which will be the same for the eight scan lines in each row. Fetching 80 character bytes would be possible, but the VIC-II would also need 80 shape bytes to go with them. – supercat Mar 30 '21 at 16:20
  • If it wasn't for the cost, they could have used a 16-bit bus for the video chip. There's one system that did something in that way: the DAI had 32 K of the upper memory accessible to the CPU in 8 bits, but the video chip accessed it 16 bits at a time. This allowed quite impressive graphic capabilities while still having shared memory access for the CPU. – Patrick Schlüter Mar 30 '21 at 16:21
  • @OmarL: The C64 uses the main system memory bus to load shape data as well as character data. On one scan line out of every eight, the VIC-II chip uses 40 cycles to load character and attribute data (stealing 40 cycles from the CPU). On the other seven, it just fetches shape data. If the VIC-II chip needs to spend 80 cycles per line fetching shape data, when is it going to fetch character data? – supercat Mar 30 '21 at 16:45
  • @OmarL: So you're suggesting adding 1000 bytes of on-die RAM? In the die shot at http://visual6502.org/images/8565/Commodore_8565_die_shot_20x_1a_8500w.jpg the rectangle near the lower-left that's about as tall as 5 I/O pins and as wide as 1.5 is the 40-byte row buffer. A thousand-byte display RAM would be about 25 times as big as that part of the chip. Commodore could probably have designed a chip with that much RAM, but it would have added considerably to the price. And unless one wants blank scan lines, one would need 2000 bytes, which would be twice as big. – supercat Mar 30 '21 at 17:21
  • @OmarL: You said "Fetch 40 chars from the internal memory". What internal memory, and where would its contents come from? – supercat Mar 30 '21 at 20:21
12

The monitor is physically attached - part of the unit

[photo: a 40-column Commodore PET with its built-in monitor]

There's no video port. You have to use the built-in monitor.

As such, Commodore simply chose a monitor fit for 80 columns. That wasn't hard, since NTSC monochrome composite video has unlimited horizontal resolution. And 7 vertical scan lines per character form (+1 scan line for spacing) is enough to be readable and allow 24-25 lines of text.

RF-modulated NTSC has its resolution limited by RF bandwidth. Color NTSC has its horizontal resolution limited by the color clock frequency.

Note that the above is a 40-column PET, but it shows how the monitor is permanently attached.
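The 24-25-row figure above follows from simple arithmetic, assuming roughly 240 usable scan lines in a non-interlaced NTSC field:

```python
# Checking the answer's row arithmetic, assuming ~240 usable scan lines
# per non-interlaced NTSC field.

VISIBLE_LINES = 240      # roughly drawable scan lines per field
LINES_PER_ROW = 7 + 1    # 7-line character form + 1 scan line of spacing

max_rows = VISIBLE_LINES // LINES_PER_ROW
print(f"up to {max_rows} rows of text; using 24-25 leaves a generous margin")
```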

4

Color monitors and televisions have a limited horizontal resolution. Monochrome monitors are much sharper and basically have unlimited horizontal resolution. The 80-column PET uses a chip designed to drive a computer monitor rather than a television – essentially the same chip (the Motorola 6845) that IBM later used in the IBM PC for the Monochrome Display Adapter and the Color Graphics Adapter. It outputs digital signals for video and for horizontal and vertical sync.

The VIC-20 and C64 use chips that are designed to output a PAL or NTSC television signal. Televisions overscan the video: the TV doesn't display the whole video image, but lets it run off the edges of the screen. The amount can vary between televisions, and that is why there are large borders around the C64 and VIC-20 screens.

Martin Maly
hal9000w
  • The 6845 doesn't generate video, but merely supplies addresses and an intra-row scan-line count. It would be perfectly suitable for driving a composite monitor (and indeed the CGA was intended to be usable with composite monitors, though IBM somewhat botched the colorburst generation). While it generates digital signals for sync, so too does the discrete logic circuitry used in the Apple II. – supercat Jan 20 '22 at 05:43
3

Somewhat of a side note: the Atari 8-bit series (400, 800, 65XE, 130XE – 6502-based, running at about 1.79 MHz) normally displayed 40 characters per line on a color TV. There are some terminal-type programs that horizontally scroll the 40-column display to get a virtual 80-character-per-line terminal, or use a 3x5 font with 1 pixel between characters (supplied by such programs; it wasn't native to the Atari) to get 80 characters per line, but it is difficult to read.

The terminal programs were used for some modem-based bulletin board systems, and also for the somewhat rare ATR8000, which had a Z80 for CP/M and an optional add-on board with an 8088 and 512 KB of RAM that could run MS-DOS 2.11. The ATR8000 could also work with a standard ASCII terminal.

I'm wondering if the C64 could have used a 3x5 or similar font to get 80 characters per line.
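As a quick feasibility check on that closing question (assuming the C64's standard 320x200 high-res bitmap), the pixel budget does work out:

```python
# Feasibility check, assuming the C64's standard 320x200 high-res bitmap:
# is there room for 80 columns of a narrow, software-rendered font?

HIRES_WIDTH = 320   # C64 high-resolution bitmap width in pixels
COLS = 80

cell_width = HIRES_WIDTH // COLS   # pixels available per character cell
print(f"{cell_width} px per column: a 3-px-wide glyph plus a 1-px gap fits")
```

Software "soft-80" displays for the C64 did exist and worked roughly this way, rendering a narrow font into the bitmap at some cost in speed and legibility.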

rcgldr
  • Your comment unearthed a long-forgotten memory of such a terminal for the C64. Maybe http://mikenaberezny.com/hardware/projects/c64-soft80/ was it? – Chris K Apr 01 '21 at 19:38
3

Cost. Recall that Jack Tramiel's mantra was "computers for the masses, not the classes". Going from 40-column to 80-column output requires faster memory (or tricks to work around slow memory), raising costs.

It appears that 80 column mode requires either a 2 MHz bus, which is what the C128 used, or a 1 MHz bus with the ability to latch and access two memory chips during one cycle, which is what later models of the PET did.

On a side note, the VIC-20 system could actually handle 40-column output. The MOS 6560/6561 "VIC" was originally designed for low-cost game and industrial systems, which might explain its unusual 22-column output. An improved 40-column version, the MOS 6562/6563 "VIC-1.5" was designed, but never released.

toejam
1

Many answers here refer to the limitations of domestic TV sets (some of which have to do with the CRT phosphors and shadow-mask dot pitch of a particular make and model) and to the luma/chroma bandwidth limitations of the TV-standards-compatible composite color video signal, RF-modulated or not. And for a good reason — the “home computers” of the era were designed to be compatible with the domestic TV set so that purchasing a separate, expensive computer video monitor would not be necessary.

“Business” computers would come with a special computer monitor, instead — often monochrome only. Or if in color, employing straightforward analog RGB or digital RGBI signaling from the video chip to the monitor to drive the CRT guns, instead of the bandwidth-limited trickery that is the composite color video signal.

It should be mentioned, though, that the Commodore 64 supports not only a composite signal (both RF-modulated and baseband versions) but also, through its A/V port, separated luma and chroma — in practical terms, an S-Video signal, even though that standard had not been established yet when the C64 hit the market. Commodore and some video monitor manufacturers called it an “LCA” signal (luma, chroma, audio) and provided inputs for it. This is markedly better in quality than the composite color video signal.

Still, the limited number of text columns / “wide” pixel clock on the home-oriented computers is primarily because the designers were catering for the lowest common denominator — users who would only have a TV with the typical austere set of input signal choices (usually RF only) of the late 1970s / early 1980s.

Also, when generating a TV-compatible video signal, you need to adhere to the same “safe area” principles as the TV broadcasters — that is, restrict any important parts of the picture — such as text — within the imaginary “safe area” borders to compensate for the overscanning domestic TVs. The TV-connectable home computers did this by adding large borders around the picture, but those technically waste active picture area/time/resolution in the signal that could be used for content. You generally don’t need to design your signal/screen content with the “safe area” in mind when generating signal timings for a pure computer monitor as those are usually not set to overscan.
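The cost of those safe-area borders can be quantified with rough, assumed figures (the C64's 200 active scan lines against roughly 240 drawable NTSC lines):

```python
# Rough quantification (assumed figures): how much of the vertical picture
# a border-drawing home computer gives up to stay inside any TV's overscan.

FIELD_LINES = 240    # roughly drawable scan lines of a non-interlaced NTSC field
ACTIVE_LINES = 200   # scan lines the C64 actually uses for pixel content

border = FIELD_LINES - ACTIVE_LINES
share = ACTIVE_LINES / FIELD_LINES
print(f"{border} scan lines spent on top/bottom border")
print(f"only {share:.0%} of the drawable height carries content")
```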

Jukka Aho
  • S-video could usefully show 80 columns in color if the foreground and background of each character were different shades of the same hue, but showing 80 columns would require about twice as much memory bandwidth as showing 40, thus necessitating a second memory bank for half the columns, considerably increasing system cost. – supercat Jan 20 '22 at 05:50