28

I've just finished reading Charles Petzold's book, Code: The Hidden Language of Computer Hardware and Software. In it, Petzold explains how relays are built into gates, gates into logic components, and logic components into computing machines.

He talks about the TTL Data Book for Design Engineers, which first came out in 1973. (I'm assuming people had lists of TTL gate chips prior to this.) That book covers the Texas Instruments 7400 series of logic gate chips at length.

Then he talks about the Intel 4004 in 1971, and the 8008 in 1972, the 8080 in 1974 and the Motorola 6800 in 1974.

Could there have been a homebrew computing club in 1971? Could people have wired together logic gates to build a CPU prior to the 8008 coming out?

(I'm assuming this is people outside the Intel design office who are prototyping the next CPU).

My question is: Were people building CPUs out of TTL logic prior to the 4004, 8080 and the 6800?

Edit: Assumption: By "people" I mean any hobbyist or engineer who is not explicitly prototyping the next CPU to be manufactured by Intel or Motorola.

scruss
hawkeye
    Side note: if you're a developer and you've never read the book mentioned, you owe it to yourself to do so! – Matt Lacey Jan 04 '17 at 04:25
    I wish I could find this great photo that was an article header in Byte ... I think ... it showed a TTL version of a 6809 (possibly a 6801 or even a 6502) that was produced by the engineers designing the chip for testing the thing ... an incredible number of small breadboards (with a few TTL chips each) wired/held together with patch cables ... and the whole thing (which would have been about 10 feet square) just spilling off the lab table ... – davidbak Jan 04 '17 at 06:15
    I worked at Honeywell Marine Systems Division, West Covina, CA, part time while in school, 1976-1978 - they were building a sonar for the Navy - can't remember the model number - but it contained a pretty capable minicomputer built entirely out of dozens of 3"x4" modules of TTL logic on a relatively large backplane. They didn't call it a computer though - they called it a controller. Reason was that the Navy permitted only certain milspec computers to be used ... for whatever reason Honeywell didn't want to use one of those ... so they did a homebrew ... but couldn't call it "computer" ... – davidbak Jan 04 '17 at 06:20
    This may be a bit OT but several people TODAY are building CPUs out of TTL logic. I even have a design for one though I've never found the time to actually do it (instead I built them virtually in a simulator: yes, one gate at a time - thank god for copy-paste). Google for the "homebrew CPU webring". Why build them instead of buying CPUs? Some of us want to design instruction sets instead of computers. – slebetman Jan 04 '17 at 08:33
  • What year would that Byte magazine have been @davidbak ? – hawkeye Jan 04 '17 at 12:49
    People were building computers out of TTL logic prior to the microcomputer, if you include people who worked for universities, IBM, etc. By "people" do you mean "hobbyists?" – Wayne Conrad Jan 05 '17 at 05:27
  • Yes - any hobbyist or engineer who is not explicitly prototyping the next CPU to be manufactured by Intel or Motorola. – hawkeye Jan 05 '17 at 10:33
    Not an answer, but related: A 6502 built from discrete transistors: http://monster6502.com – tofro Jan 05 '17 at 15:15
  • It fits "engineer", but I think it exceeds TTL logic - the Apollo space capsule guidance computers (AGCs) from the early 1960s were made with NOR gates; it sounds like the gates were technically integrated chips with a few transistors each. – nsandersen Jan 06 '17 at 17:48
  • @nsandersen Those were RTL (resistor-transistor logic) gates, which is even more basic than TTL. https://en.wikipedia.org/wiki/Apollo_Guidance_Computer#Design – JAB Mar 06 '17 at 21:19
    Have a look at https://eater.net/8bit/ for a modern take on the subject – cup Feb 17 '18 at 10:45
  • @davidbak Probably not what you're thinking of, but 8 Bit Breadboard CPU is a modern-day attempt at something similar. – TripeHound Aug 05 '19 at 08:24
  • I remember reading about people buying old pinball machines etc. to get the electromechanical relays to make "thinking machines". Think more along the lines of Nim or a half adder, not a "computer" or "cpu". – Arthur Kalliokoski Jan 26 '21 at 03:08
    “People” are still doing this. :-) The gigatron is a current day kit computer with a TTL processor. – WimC Jan 27 '21 at 13:03
  • @davidbak - As recent as 1995 Intel used to build CPU simulators using ordinary logic to prove their designs. Because it would run at a fraction of the final device clock speed ISTR that it would take many days to boot Windows for example. – Kevin White Jan 28 '21 at 19:00
  • https://github.com/GilDev/Kenbak-1-Replica. Someone on github made a kenbak 1 emulator:) – 6502Assembly4NESgames Feb 04 '21 at 21:04

14 Answers

44

It was very common to build CPUs out of TTL logic prior to the 4004, 8080 and the 6800. This was the standard way to build later minicomputers. Examples are the Data General NOVA, Xerox Alto and TI-990. Also, if a company needed a processor for, say, a CNC machine or a video game (Vectorbeam), it wasn't unusual for them to build a unique processor from TTL.

One interesting case is the Datapoint 2200, a desktop computer from 1970. It was built from TTL parts, originally using a serial processor and shift-register memory. To improve performance, Datapoint asked Intel if they could build a single-chip version of the Datapoint's processor. Intel agreed, eventually building the 8008 processor, which duplicated the Datapoint 2200's architecture and instruction set. (Among other things, this is why x86 is little-endian.) Texas Instruments also built a microprocessor with the same instruction set for Datapoint (the TMX 1795), beating Intel, but ended up abandoning the microprocessor. Datapoint decided to use a faster TTL processor based on the 74181 instead of using the 8008, so Intel marketed the 8008 as a product, essentially creating the microprocessor industry.

To answer your other question about why homebrew computers weren't a thing in 1971, it was basically a matter of cost and complexity. Building a computer from TTL was much too expensive for the typical hobbyist, and much more difficult than building a computer from a microprocessor chip. As someone else mentioned, memory was also a huge problem. Magnetic core memory was the most cost-effective approach, but it was still expensive and required complex interfacing. Semiconductor memory was insanely expensive at the time: Intel's first RAM chip came out in 1969 and provided 64 bits of storage for $99.50.
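To put that $99.50 figure in perspective, here is a rough back-of-the-envelope calculation. The only inputs taken from above are the per-chip price and capacity; the target memory sizes are just illustrative:

# Rough cost estimate for semiconductor RAM at 1969 prices, using the
# quoted figure of $99.50 for a 64-bit chip. Simple arithmetic only;
# this is not a claim about actual system prices of the era.
chip_bits = 64
chip_price = 99.50

for target_bytes in (256, 1024, 4096):
    chips = (target_bytes * 8) // chip_bits
    print(f"{target_bytes:5d} bytes -> {chips:4d} chips, about ${chips * chip_price:,.0f}")
#   256 bytes ->   32 chips, about $3,184
#  1024 bytes ->  128 chips, about $12,736
#  4096 bytes ->  512 chips, about $50,944

At those prices, even a few hundred bytes of semiconductor RAM cost more than a complete hobby computer would a few years later.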

To summarize, CPUs were commonly built out of TTL logic, but these were minicomputers, not hobbyist computers. It wasn't until cheap memory and microprocessor chips were available in the mid 1970s that homebrew computers took off.

Ken Shirriff
    This is a very informative answer. Thanks for sharing! – wizzwizz4 Jan 04 '19 at 22:09
    Great answer! One question "Intel agreed, eventually building the 8008" - how does the 4004 relate to this part of the story? Isn't the 8008 just the follow on from the 4004? – hawkeye Jan 05 '19 at 00:46
    While the 8008 is often described as a follow on from the 4004, they are in fact totally different processors. Many of the same people worked on the 8008 after designing the 4004, but otherwise there is no connection between them. – Ken Shirriff Jan 05 '19 at 00:57
    Include the VAX 11/780 in the list. – Stefan Skoglund Feb 27 '20 at 12:40
26

Here is a homebrew / educational computer made of LSI / MSI chips:

http://www.kenbak-1.net/index.htm

Designed in 1971

256 bytes of memory made of MOS shift registers.

Grabul
    Welcome to Retrocomputing! This is a borderline link-only answer. It would help future visitors if you edited your post to elaborate on the KENBAK-1 Computer. – JAL Jan 04 '17 at 21:42
    That's awesome! Can you add an explanation of what it is capable of? – hawkeye Jan 05 '17 at 00:46
    That's absolutely brilliant! I didn't realize so large delay line shift registers were available so early. That's a real computer, made from TTL chips. The complete schematics is in the Theory of Operation document from your link. kenbak-1.net/index_files/Theory%20of%20K-1.pdf . What a treat. Must investigate further. +1 from me. Cheers! – PkP Jan 05 '17 at 07:01
  • Any idea what other sizes of delay shift register were available, either static or dynamic? I would think it would be possible to double the speed of the machine if the program counter were stored in a 128-bit shift register, and bytes were stored in interleaved fashion. If the longer shifter held address 000[octal], 010, 020, to 170, then 001, 011, 021, up to 171, etc. then instead of each instruction fetch requiring a full lap through the long shift register, eight fetches could be handled on each lap. – supercat Jan 05 '17 at 22:09
  • I agree with @JAL. I'd be happy to upvote your answer if you provided more detail. – CJ Dennis Apr 27 '18 at 01:57
  • Details are available on the original creator's own website. It's fair to provide a link for those really interested, instead of copying content here. – Grabul Mar 02 '20 at 22:30
22

Hmm, an interesting question to be sure. It certainly would have been possible to make something like a 4004 style microprocessor from TTL chips. In fact, when Intel made their microprocessor, the first in the world, they chose not to pursue a patent for it, because they felt that there was no invention there; it was obvious for someone to go and combine the workings of a processor from several ICs into a single IC.

That said, it would have been rather pointless for the home enthusiast to make a processor, because before the microprocessor there were also little or no semiconductor memories that he could program with the code the processor runs. It's a rather unfulfilling exercise to make a stored-program computer when you don't have a memory to store the program in. [Edit]: semiconductor memories were becoming available just prior to the microprocessor, and it seems there was a period when this was actually happening; please see the answer regarding the KENBAK computer!

Instead, people were using TTL chips much like programmers are using programming languages today. If you're making a controller for your model railroad, you'd pick a latch here and a flip-flop there. You'd pick up a monostable multivibrator instead of calling Delay(). The hardware design approach worked very well and people were blissfully unaware of the problems and complexities that would later come with software projects.
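To make that contrast concrete, here is a minimal sketch of the two ways of getting a fixed delay of about 100 ms. The 0.7 * R * C pulse width is the usual first-order approximation for a 74121-style TTL monostable (exact constants vary by part and datasheet), and the component values below are simply chosen to hit the target:

import time

# Hardware approach: a one-shot (monostable multivibrator) produces a
# pulse whose length is set by an external resistor and capacitor.
# t_w ~ 0.7 * R * C is a first-order approximation for a 74121-style
# part; real datasheets give slightly different constants.
def monostable_pulse_width(r_ohms, c_farads):
    return 0.7 * r_ohms * c_farads

print(monostable_pulse_width(14_300, 10e-6))   # ~0.1 s from 14.3 kOhm and 10 uF

# Software approach: the Delay() the paragraph alludes to.
def delay_ms(ms):
    time.sleep(ms / 1000)

delay_ms(100)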

Building computers was a large industry in the 1960s. But the processor was not the main design problem in those computers; it was the interface to memory. It's rather more difficult to use something like a drum memory, even if it couples to a standard 1/2-inch-shaft, 6000 RPM electric motor, than, say, an EPROM.

Along with the 4004 microprocessor, Intel also made the companion chips 4001, 4002 and 4003: a 2 Kbit ROM, a 320-bit RAM, and a 10-bit shift register that serves as an interface to the outside world. Without the 4001, 4002 and 4003, the 4004 would have been as useless as the home TTL processor. More info: http://www.intel.com/Assets/PDF/Manual/msc4.pdf

Interestingly, the 4001, 4002, 4003 and 4004 were all the brainchildren of a single man, a young design engineer at Intel, Federico Faggin. He was the lead engineer on all four and, even with one failed production run, he managed to make all four chips in just 9 months. So I think those chips should be called his brain quadruplets!

Faggin is an interesting man, to be sure. Even earlier, before coming to Intel, he invented the self-aligned silicon gate: the single most important IC manufacturing technology after photolithography. He's still going strong, and his interview from 2014 is one of the more interesting things to watch on YouTube: https://www.youtube.com/watch?v=hugZii_eX30

PkP
    Welcome to Retrocomputing. This is a good answer; thanks for not just explaining whether it was done but why, then how the 4004 was created. – wizzwizz4 Jan 04 '17 at 08:06
    Could you explain why memory is a problem? Can't you just build a large array of TTL chips into a makeshift addressable RAM? – hawkeye Jan 04 '17 at 12:47
    @hawkeye, Hi! well, yes and no. My data book library is in storage, but I'll make a note to dig up a data book from 1970 or 1971 to check what was available. You could perhaps get a register of 8 bits in TTL (74S373) in 1970. But the fan-out of those TTL chips was not too great, meaning that you couldn't connect too many of them together. Perhaps 10 or so. So you could build a RAM of 8 or 16 bytes this way, and people did this, but anything larger would cause design issues of their own: extra amplifiers, darlington drivers, stuff like that to be able to drive a lot of ICs acting in parallel. – PkP Jan 04 '17 at 20:03
    This is a great explanation. Could you add a note to your answer about amplifiers and Darlington drivers and other things that impact the scalability of simple TTL solutions to the home brew CPU approach? I'd like to read up on them. – hawkeye Jan 05 '17 at 00:48
    @PkP: Did fan-out limitations apply to three-state or open-collector outputs which were not used as inputs? I would have expected that while inputs would source current by design, the outputs of a disabled 74373 should behave as an open circuit. Also, I'm curious if you know to what extent any notable designs made use of what I call "resistor priority logic"--have an input connected directly to a 3-state output and also connected to some other output via resistor, so that the 3-state output will control the state of the line when enabled, but the resistor will control the state otherwise. – supercat Jan 06 '17 at 23:13
    To tie two of the threads together: Faggin subsequently worked on the 8080, then left Intel to found Zilog, whose main product was an improved 8080, the Z80. Intel's lawyers actively looked into it but felt that the only actionable intellectual property incursion was assembly mnemonics, being a form of creative writing. That's how strongly they believed the no-invention-here conclusion on the technical side of things. – Tommy Feb 15 '18 at 14:47
    @PkP - not a 74-series, but Intel's 3101 was a 64-bit TTL SRAM (16x4 organisation), available in 1969, as was the 1101 which was 256-bit, but PMOS rather than TTL (I don't have a source for its organisation, but it's in a DIP16 package, so allowing for VDD and GND pins plus CS and WE there are only 12 pins to play with, so 32x8 is out of the question, therefore I'd guess it was a 64x4 chip). Along with an 8-way multiplexer chip to run the chip select lines, it should therefore have been reasonably simple to go as far as 128 words. – Jules Feb 15 '18 at 17:27
  • @hawkeye: There were no technical reasons stopping one from building a large RAM out of TTL. (Fan-out is not a problem since you can insert buffers: each output can drive e.g. 10 buffers, which you can repeat as needed.) The big problem was that TTL RAM was way too expensive for hobbyists. The Intel 3101 RAM chip cost $99.50 in 1969 ($700 in current dollars) for 64 bits of storage. Exponentially falling RAM prices made the Altair's 4 KB memory card affordable ($264) to the hobbyist in 1975. – Ken Shirriff Jan 08 '19 at 17:17
  • The TI SN7489 64-bit memory was available by 1973 (that datasheet was published Dec. 1972), though I don't know how much it cost. – cjs Sep 18 '19 at 10:02
17

The Xerox Alto

A 1972 machine, officially introduced in March 1973. It had a multi-chip CPU built around the 74181 IC.

The 74181 represents an evolutionary step between the CPUs of the 1960s, which were constructed using discrete logic gates, and today's single-chip CPUs or microprocessors. - Wikipedia

The Xerox Alto had a mouse-driven GUI (yep, a real bitmap graphics display), Ethernet (yep, this was the computer Ethernet was invented for, so the machines could easily be put onto a LAN), WYSIWYG editing (they specifically invented the laser printer for that feature) and an object-oriented operating system/development environment. Not too shabby for a personal computer (the first Personal Computer, actually) with a CPU built from discrete ICs.

Note that the Xerox Alto was not the first with a CPU designed around the 74181, it was just the most influential.

Before the 74181 was available, most CPUs would span several PCBs. With the 74181, a CPU could fit onto a single (large) PCB.
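To give a feel for what "built around the 74181" means: the 74181 is a 4-bit ALU slice, and a wider datapath is made by chaining slices through their carry signals (with a 74182 look-ahead carry chip if ripple carry is too slow). The sketch below models only the add behaviour of such a chain; it illustrates bit-slicing, not the 74181's full table of 16 logic and 16 arithmetic functions:

# Illustrative model of a 16-bit add built from 4-bit ALU slices, the
# way a 74181-based CPU chains slices through their carry pins. Only
# addition is modelled here; the real chip does much more.

def slice_add(a4, b4, carry_in):
    # one 4-bit slice: returns (4-bit sum, carry out)
    total = (a4 & 0xF) + (b4 & 0xF) + carry_in
    return total & 0xF, total >> 4

def add16(a, b):
    # chain four slices, least-significant nibble first
    result, carry = 0, 0
    for i in range(4):
        nibble, carry = slice_add((a >> 4 * i) & 0xF, (b >> 4 * i) & 0xF, carry)
        result |= nibble << 4 * i
    return result & 0xFFFF

assert add16(0x1234, 0x0FFF) == 0x2233

Four slices plus carry look-ahead is the kind of single-board arrangement the paragraph above describes.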


wizzwizz4
Klaws
    It's also worth noting that while the Alto was actually a relatively slow computer (0.3 MIPS, which is probably roughly comparable to a 1MHz 6502 e.g. Apple II / Commodore PET) this isn't because its CPU was that slow: the CPU was slowed down to 5.8MHz in order to run synchronously with its RAM. Without that limit: a look at the schematic suggests that the limiting critical path would be decode time on its microcode prom (50ns) + about 5 74S series gate delays (typically 3ns each) + either ALU delay (30ns) or register select delay (35ns), so I suspect it could have run at ~10MHz. – Jules Feb 15 '18 at 16:51
    I'm glad to see the Alto mentioned here. I'll just add that the Alto's CPU spanned three circuit boards: an ALU board, a control board, and a control RAM (i.e. microcode) board. – Ken Shirriff Jan 04 '19 at 20:07
15

There was also the EDUC-8 (pronounced "educate"). It was Australia's first hobby computer, published as a series of articles in Electronics Australia from August 1974 to August 1975. The computer uses 100 integrated circuits: 98 are 74-series or 9000-series TTL logic chips, and two are 1Kx1 static RAM chips. The computer runs at 500 kHz, with a 10 kHz instruction rate, due to the bit-serial implementation used to reduce complexity. There is an option of 128 or 256 bytes of RAM. Instructions are manually loaded using front panel switches. There are two serial input ports and two serial output ports at the back of the computer. These can be connected to a keypad, octal display, paper tape loader, paper tape punch, printer, keyboard, music player, teleprinter, magnetic tape recorder and alphanumeric display. The 8-bit instructions are based on a shortened version of the 12-bit PDP-8 minicomputer's instruction set.
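The gap between the 500 kHz clock and the 10 kHz instruction rate follows from the bit-serial design: data moves one bit per clock, so every 8-bit transfer costs 8 clocks and an instruction needs several transfers. A quick consistency check on the published figures (the 50-clock quotient is just arithmetic; the actual microcycle breakdown is not something I have):

# Consistency check on the published EDUC-8 figures. A bit-serial
# machine moves one bit per clock, so an 8-bit word costs 8 clocks per
# transfer and an instruction needs several transfers (fetch, operand,
# ALU pass, write-back).
clock_hz = 500_000
instr_hz = 10_000

clocks_per_instruction = clock_hz // instr_hz           # 50
transfers_per_instruction = clocks_per_instruction / 8  # ~6 eight-bit transfers
print(clocks_per_instruction, transfers_per_instruction)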

The computer uses a custom power supply, supplying 6A at 5V (30W total) to the computer and peripherals. The computer uses 3A (15W) with the peripherals using up to 3A. The main computer is housed in a 103 x 293 x 357 mm case. There are eight single-sided printed circuit boards consisting of the Front Panel Board, Connector Board, Timing Board, Decoder Board, Accumulator Board, Program Counter and Adder Board, Memory Board, and IOT Interface Board.

You can see pictures and read more about it at http://www.sworld.com.au/steven/educ-8/

Alan

Alan Laughton
11

It would have been cost- and space-prohibitive to try to build a microcomputer using TTL logic chips, but minicomputers and mainframes were routinely built from such chips (or related technologies like DTL, ECL, etc.). Processors like the 4004, 8008, 8080 and 6800 were not powerful enough for minicomputer and mainframe workloads, so those machines continued to use simpler (and faster) chips to get the job done.

Eventually minicomputers and mainframes were able to fit the entire CPU onto a single chip, but until then it was common for CPUs to occupy one or several circuit boards (or a cabinet full of circuit boards). Large chips like the AMD 2900-series allowed processors to be reduced from a cabinet down to just a few boards.

Ken Gober
  • Thanks Ken. Can you help me understand why the distinction between microcomputer and minicomputer was important when building a machine without a CPU on a chip? – hawkeye Jan 04 '17 at 12:51
    The distinction is primarily about size and cost. A microcomputer could be expected to use a 4-bit (4004) or 8-bit word size (8008, 8080, 6800), with a fairly small amount of memory (as little as 1KB) and few or no peripherals. A minicomputer would be expected to have between a 9-bit to 18-bit word size, a minimum of 4KB memory (preferably 8KB or more) and at least a few peripherals (teletype, paper tape reader/punch, etc.). Just for the CPU logic by itself, a 16-bit mini would use 4 times as many TTL chips as a 4-bit micro, making it 4 times the size and 4 times the cost. – Ken Gober Jan 04 '17 at 14:14
  • Wow! Minicomputers used a 9 bit word size? What was the non-CPU chip a mini computer used that was easier than combining TTL chips? – hawkeye Jan 05 '17 at 00:51
    I don't know of any that actually had a 9-bit word size, it's just a boundary to separate the larger-word-size minis from the smaller-word-size micros. I know that 12-bit, 16-bit and 18-bit word sizes were common, and there was a 10-bit computer at Lincoln Laboratory. Minicomputers did eventually use TTL chips, but because of the larger word size the number of chips needed was higher, and as a result the size and cost of the computer were higher as well. Before TTL they used discrete transistors and diodes, not because it was easier but because TTL wasn't available in convenient packaging. – Ken Gober Jan 05 '17 at 13:36
  • @KenGober: Were there any things that could reasonably be called "microcomputers" that required more than a half-dozen chips for the CPU, or any "minicomputers" that did the job with less than a dozen? I think that would be the main distinction between microcomputers and minicomputers. – supercat Jan 10 '17 at 02:57
    At that time, you couldn't really make a CPU with fewer than several dozen chips unless you were using MSI chips that implemented larger features like ALUs, register files, ROMs, etc. in single chips. When LSI chips became available, it was possible to build an entire minicomputer CPU in one chip (or a few), and the distinction between minis and micros shifted away from CPU size and towards things like word size, address space, and I/O capability. Modern mainframes are now the size of old minis, and modern minis are now called "servers" and use CPUs nearly identical to the ones in micros. – Ken Gober Jan 10 '17 at 04:41
  • Minicomputer and microcomputer are fundamentally marketing terms, but a standard definition is that a microcomputer is built from a microprocessor, which is a CPU on a chip. The above definition of a minicomputer as > 8 bits doesn't make sense; for example, the SPC-12 minicomputer (1968) was an 8-bit machine while the PACE microprocessor (1974) was 16 bits. – Ken Shirriff Jan 04 '19 at 22:18
  • The definition that was in the books I grew up with was that they were defined by price points for a base system at time of release, with microcomputers costing no more than thousands of dollars, minicomputers costing tens of thousands, mainframes costing hundreds of thousands, and supercomputers costing millions and up. – ssokolow Mar 02 '20 at 04:52
9

Well, there was an obscure computer called the VAX 11/780 built out of TTL.

This was of course not a hobbyist computer, but it doesn't seem to be ruled out by the wording of your question - it was certainly built by 'engineers'.

On the amateur front, here's a 1975 newsletter from the (UK) Amateur Computer Club, containing details of a machine called the Weeny Bitter, constructed with 7400-series TTL. I was vaguely considering building it at the time, but never did.

dave
7

The 74181 ALU was available in 1970, so hobbyists could build something with it -- it was $16.50 in quantities of 100. http://apollo181.wixsite.com/apollo181/about

Here's a 4-bit computer built around the 74181 and TTL logic: http://jaromir.xf.cz/fourbit/fourbit.html

That "thing" doesn't have to be a full CPU, though. For example, Midway's 1975 arcade game Wheels II uses the 74181 as a glorified counter sequenced by a ROM, but it is still not technically a CPU, since it was limited to a few addition operations, had no branch instructions, and was hardwired to the needs of the game.
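That distinction is worth spelling out: a ROM can step a 74181 through a fixed sequence of operations, but without conditional branching the control flow can never depend on the data, which is what separates a hard-wired sequencer from a CPU. A minimal sketch of the idea (the "microprogram" below is invented for illustration and has nothing to do with the actual Wheels II ROM contents):

# A ROM-sequenced datapath versus a CPU: the control ROM below is always
# stepped straight through, so the sequence of ALU operations can never
# depend on the data. The entries are made up for illustration.
control_rom = [
    ("load", 3),     # accumulator <- 3
    ("add",  5),     # accumulator += 5
    ("add",  2),     # accumulator += 2
    ("out",  None),  # latch the accumulator out to the game hardware
]

acc = 0
for op, operand in control_rom:   # no jumps, no conditions: the
    if op == "load":              # "program counter" only ever moves
        acc = operand             # forward one step at a time
    elif op == "add":
        acc = (acc + operand) & 0xFF
    elif op == "out":
        print(acc)                # 10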

8bitworkshop
    Right. The 74181 is a very versatile component. You can build a CPU with it, but if you already have a CPU available you can use it as a maths coprocessor to handle stuff that's too hard for your CPU. If you can push data to it fast enough, you can get up to 50MHz out of it (or 30MHz if you're using many of them along with a 74182 to produce a 16-bit wide ALU), one operation per cycle. It was to 70s CPUs roughly comparable to GPUs nowadays. – Jules Feb 15 '18 at 17:01
  • In the early days there were a lot of computers which were not Turing-complete. But they were still useful for a lot of things. The defining feature of a CPU would be that it was a somehow "centralized" processor (even if it was huge) which could do many sorts of things. – Klaws Feb 22 '18 at 13:42
7

The German hobby-electronics magazine "ELEKTOR" had a project called "Computer 74", named both because of the year of publishing and the TTL logic chip family that made up most of the design. Sorry, I don't remember more about it...

Ralf Kleberhoff
6

You ask about "people" -- I'll answer about myself.

In 1968, I built a "bit slice" ALU using transistors and diodes. I only built two bits of it, but it had an accumulator, two operand latches, and the usual add, subtract, shift, and rotate operations. It took something like 20 transistors, 30 diodes, and 50 resistors. It was more like DTL or RTL than TTL, since there were no active pullups on the outputs. I didn't have tools fast enough to measure the speed.

In 1970, I shopped a proposal for a minicomputer around to several existing minicomputer companies. IBM invited me to visit Yorktown Heights -- not because they wanted the computer, but because they thought it was interesting that a high-school kid was doing this. BTW, at this point I had scored a 32K core memory bank with drivers and sense amps from RCA, so that was my plan for memory.

In 1971, when I arrived at MIT, a company in Cambridge supported my pastime by buying parts. By then, it was an electronic design, and used the 74181/74182 (of course). I wrote some code for it, and found that it was far too slow with the instruction set I had -- for instance, it didn't have a hardware multiply instruction -- so I dove into micro-programmed architectures.

Ultimately, no computer resulted. When summer came, I asked them for financial support, which they seemed surprised about and functionally declined. I stopped working on the computer, and started working at the MIT AI lab to further develop text editors.

By the time I got back into processors, microcontrollers were common and plentiful, but there was still a lot of TTL required to implement the graphics and peripheral device connections. Using a Texas Instruments TMS-9900 CPU, I built a fully asynchronous system with 16K of RAM, 1 MB of magnetic drum storage, an audio cassette interface, a glass TTY, a keyboard, a mechanical printer, a paper tape punch, an optical paper tape reader, and a fully hardware debugging console. My roommate and I wrote an assembler and editor, so we were self-hosted and fully bootstrapped. Those were the days.

cmm
5

And to all the other fine answers above, let me add the RCA COSMAC, later better known as the RCA 1802.

This was a relatively advanced CPU in 8-bit terms, with 16 16-bit registers (!!) but an 8-bit ALU and instruction set. A number of people, notably Tom Pittman (Tiny BASIC) claimed it to be their favorite CPU design and noted that it had very tight code, often 15% shorter than the 8080.

It was originally implemented specifically as a home computer starting in 1970 using TTL chips from RCA's labs, along with switches and lamps from Radio Shack. It filled a box about the size of an Altair and, over time, added TV output and cassette storage.

At the time, RCA was investing heavily in CMOS, which was not a common process at the time. Unfortunately, this was also the period when RCA was being run by Sarnoff's son, who was far more interested in hobnobbing with Hollywood glam stars in his attempt to turn RCA into a media company.

Eventually, they decided to build the CPU in CMOS, originally as the two-chip 1800/1801 in 1974, and then the single-chip 1802 the next year. This was the chip that went into early RCA products like the ELF and Studio II. But by that time the 6502 was crushing it in the market. However, RCA also made a silicon-on-sapphire version, which saw use in military and space roles for years.

Maury Markowitz
4

Memory may not have been quite so much of an issue as people were making it out to be. Keep in mind that the base version of the Altair 8800 (kit price $439) shipped with a "1024 word" (by which they mean byte) memory board populated with only 256 bytes of RAM. If you were willing to work with less, and in particular design your computer to use RAM more efficiently than the 8080 did, you could build a working system that could run basic, small programs. (I wonder how many of the original Altair owners actually toggled in programs much larger than a few dozen bytes, anyway?)

Random Access Memory

One example of such a design is given by Hilary D Jones in "Building A Computer from Scratch" (BYTE magazine Vol 2 No 11, Nov. 1977). It describes a small but complete system built entirely from TTL parts, costing about $65 at the time.

The memory used TI SN7489 64-bit RAM chips (offering 16×4-bit registers); these were available by 1973. The configuration described uses eight of these, giving 64 bytes of memory (the full address range of the computer).

All the rest of the parts appear to be standard older ones (e.g., the SN74181 ALU slice) that were available by 1973 as well, except probably the 74188 PROMs used for the microcontrol instruction store. That could have been worked around by hardwiring the instruction logic instead, which would probably not have been so difficult given that the computer had only four instructions!

And this limited instruction set shows how appropriate design can save on RAM. Unlike the 8080 (and even the 4004) which used multi-byte instructions, all instructions for this CPU are single-byte:

00nnnnnn    WIO N   Wait for input to location N, displaying
                    current contents of N while waiting.
01nnnnnn    ADD N   Add data in location N to accumulator.
10nnnnnn    STN N   Store negative of accumulator in location N.
11nnnnnn    JGE N   Jump to location N if accumulator is greater
                    than or equal to zero.

Writing programs with this instruction set took some cleverness, but it sure is compact!
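To show just how little machinery those four instructions imply, here is a toy interpreter for the encoding above. It is only a sketch: the 8-bit two's-complement arithmetic, the single shared 64-byte program/data memory, the program counter starting at 0, and reading WIO input from the console are my assumptions for illustration, not details taken from the BYTE article.

# Toy interpreter for the four-instruction encoding shown above.
# Assumptions (for illustration only): 8-bit two's-complement words, one
# shared 64-byte memory for program and data, PC starts at 0, and WIO
# reads a number from the console instead of front-panel switches.

def to_signed(b):
    return b - 256 if b >= 128 else b          # 8-bit two's complement

def run(mem, max_steps=100):
    acc, pc = 0, 0
    for _ in range(max_steps):
        op, n = mem[pc] >> 6, mem[pc] & 0x3F
        pc = (pc + 1) & 0x3F
        if op == 0b00:                          # WIO N
            mem[n] = int(input(f"loc {n:02o} holds {mem[n]}, new value: ")) & 0xFF
        elif op == 0b01:                        # ADD N
            acc = (acc + mem[n]) & 0xFF
        elif op == 0b10:                        # STN N: store NEGATIVE of acc
            mem[n] = (-acc) & 0xFF
        elif to_signed(acc) >= 0:               # JGE N
            pc = n

# Three-instruction program: mem[12] = -(mem[10] + mem[11])
mem = [0] * 64
mem[0] = (0b01 << 6) | 10                       # ADD 10
mem[1] = (0b01 << 6) | 11                       # ADD 11
mem[2] = (0b10 << 6) | 12                       # STN 12
mem[10], mem[11] = 5, 7
run(mem, max_steps=3)
print(to_signed(mem[12]))                       # -12

Storing the negative of the accumulator looks odd at first, but it means a later ADD of that location behaves as a subtraction, which is presumably how the designer got away with only four opcodes.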

Shift Registers

Another popular form of memory in the early 1970s was shift registers, which offered more capacity than early random-access memory at a cheaper price per bit. The original Type 1 Datapoint 2200, introduced in 1970, is an example of a computer that used them for main RAM (2 KB base, expandable to 8 KB), along with a CPU made from around 100 TTL components.

An example of homebrew use of shift registers was the Apple 1 video display, which used six 2504 1 kilobit shift registers to store the 6-bit character codes for each location on a 40×24 screen (along with a seventh to store the cursor position). Footnote 1 of that article provides early pricing information: $11.05 (qty. 100) in 1970.

RAM, however, was soon to fall far below shift registers in price. By the end of 1975 you could find both in the back pages of Byte, with a Signetics 2533 1024-bit shift register (second-sourced by AMD) going for $7.95 (qty. 1), as compared to a 2102 1K×1 static RAM going for $2.95 (qty. 1).
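What made shift registers cheap also made them awkward as main memory: a recirculating register presents only one bit position per clock, so a "random" read has to wait for the wanted position to come around again, half a lap on average. The arithmetic below just illustrates that latency for a 1024-bit register; the 1 MHz shift clock is an assumed round number, not the spec of any particular part:

# Latency arithmetic for a recirculating 1024-bit shift register used as
# memory. The 1 MHz shift clock is an assumed round number.
length_bits = 1024
clock_hz = 1_000_000

lap_time = length_bits / clock_hz       # full recirculation: ~1.02 ms
avg_random_wait = lap_time / 2          # average wait for one bit: ~0.51 ms
sequential_rate = clock_hz              # streaming in order: one bit per clock

print(f"worst case {lap_time * 1e3:.2f} ms, average {avg_random_wait * 1e3:.2f} ms")

That is also why something like the Apple 1 display, which scans its characters out in order anyway, was a natural fit for shift registers, and why falling RAM prices killed them off so quickly.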

Difficulty

Even at the time, CPUs seemed to be considered by some to be a bit of a black art. Jones says that it's not really so difficult:

...the design of a computer from the ground plane up is still generally regarded as an art that only the foolhardy would undertake. In reality, though, the job is not nearly as mysterious as it seems. For proof I offer the fact that when I began this project I had no design experience with TTL (or experience with any form of electronics design for that matter).

cjs
  • Any idea what how much shift registers would have cost? I can imagine a practical computer design using an 8-bit shift register as an accumulator, a 64x2 shift register for short-term storage and one or more 1024x1 shift registers for long-term storage, with a function to transfer blocks of 64 bits between short-term and long-term storage. – supercat Jan 26 '21 at 19:10
  • @supercat I've updated the answer with a whole section on shift registers, including pricing. If you can dig up schematics for the original version of the Datapoint 2200 that will probably provide an example of a practical design using shift registers for main memory. – cjs Jan 05 '23 at 00:44
  • Thanks for the info on pricing. I'm surprised that 1Kx1 static RAMs had so quickly become so much cheaper than 1Kx1 shift registers. I'd guess shift registers were easier to produce in simpler processes, but the massive capacitance of all of the parallel MOSFET gates meant that designs based on cloning many copies of a shift register stage became increasingly impractical. – supercat Jan 05 '23 at 03:19
  • I still find it a bit surprising, though, that chips which acted like shift registers didn't remain popular, since they could fit in smaller packages than RAM chips. – supercat Jan 05 '23 at 03:20
  • @supercat Smaller packages, sure, but they were slower for many useful forms of memory access and more complex (vastly more complex as memory increased beyond a kiloword or three). Even for something like a video buffer, which is scanned out sequentially, it's really nice to have random access as well, as you'll see if you try to port a video game to the Apple 1. – cjs Jan 05 '23 at 04:15
  • Agreed, RAM has many advantages, but with regard to the Apple 1, I've been curious what would be possible with a few simple hardware tweaks to let the CPU write data into the shift register at arbitrary times. – supercat Jan 05 '23 at 15:04
  • I would think that if one added a 74LS373 to an Apple I to allow code to read the scan line counter, and bypassed a lot of the circuitry that sits between the processor and the shifters, it would be possible to write a routine that would write all 40 bytes of an arbitrarily-chosen line of text in a single frame. From what I understand, the existing design shifts through all of the display data multiple times per frame, so it would be possible to have a piece of code which waited until a moment four cycles before the line would start to be shifted, then... – supercat Jan 05 '23 at 19:10
  • ...used a "LDA zpbuff/STA SHIFTER/LDA zpbuff+1/STA SHIFTER" sequence to write columns 0, 7, 14, etc. of that line, then waited until a moment three cycles before the line would be shifted, used the above sequence to write columns 1, 8, 15, etc. and then after doing that five more times would have written the entire line. – supercat Jan 05 '23 at 19:12
4

Perq computers, which were built first in Pittsburgh, PA, and then later, somewhere in England, had a CPU that was mostly built from 74LS parts. "Mostly," because it had one LSI chip, an Am2910 microsequencer.

The first Perqs were shipped in 1980--well after the Apple][ and other microprocessor-based computers had hit the market. In fact, the Perq's I/O system used a Z-80 as an auxiliary processor.

Solomon Slow
3

To add to the above, there was a book that I had as a teenager, published in 1966

We built our own computers by A.B.Bolt et al. ISBN 9780521093781.

You can view some of the pages here https://books.google.co.uk/books/about/We_built_our_own_computers.html?id=aQ84AAAAIAAJ

Yes, people did wire up components to make computers prior to 1971. These were more purpose-built machines, like the Bombe. The book shows you how to join up bits of circuitry and work out the logic, more or less what you do in Verilog or VHDL on an FPGA nowadays, except this was with physical wires and it was done by schoolkids as part of the School Maths Project in the UK. There is one machine in the book that plays noughts and crosses.

A.B.Bolt was the supervisor. The book was actually written by sixth-formers.

I didn't build anything from the book as I didn't have access to many of the components. What it did teach me was binary logic and how it was manipulated. For a 12 year old who'd only ever done BODMAS, manipulating binary logic was absolutely fascinating. It was also the first time I'd learnt something completely new from a book.

cup