
Is there a particular reason RS-232 uses 12V signals (well, sort of) rather than some (any?) other value?

It was formally defined in 1960, which places it before the era of any standardized ICs like RTL that would have suggested lower voltages. PMOS did use 12V, but that was a decade later, and I suspect it's more likely PMOS selected 12V because RS-232 existed rather than the other way around.

There may be minor contributing reasons; for instance, lots of 12V power supplies were available because 12V was a common battery voltage, and one might use a car battery as a line conditioner. But that seems unlikely to be decisive.

The fact that later versions sometimes drove the signals at 5V instead of 12V suggests there is a strong reason to match the internal voltages to the interface. So this leads to the possibility that the earlier tube-based computers, or at least the transistorized machines, used 12V internally. But looking over examples of 2nd-generation machines, I don't see an obvious sign of that.

So why 12V and not, say, 5, or 6, or 24?

Maury Markowitz
  • Total wild guess: It came out of the telecom industry, which used dry/wet-cell style battery backup for continuous operation ... 12V was easy to come by? I'll be interested to know the actual reason. – davidbak May 06 '21 at 13:46
  • The 'strong' reason to use 5V is simply to save cost. Why add the need for a 12V PSU when a short distance, like to a printer, works quite fine at 5V as well? – Raffzahn May 06 '21 at 13:51
  • Nitpicking: RS-232 doesn't say ±12V. It in fact says max ±15V, min ±3V. ±12V is just a convenient in-between that can be assumed to be available in a computer environment and also provides ample elbow room for STN (early memories used 12V as well). – tofro May 06 '21 at 13:52
  • If "min ±3 volts" was reliably applicable in practice, no 1980s-and-later small computer maker would have bothered with all the added complexity to provide ±9V and higher levels ... but most of them did :) – rackandboneman May 07 '21 at 00:20
  • 0V and +2V would work in practice; the MC1489 was designed that way (Jim Thompson, the designer, said as much in a Usenet post), but in the presence of noise, having some headroom was a good thing. – Jasen May 07 '21 at 04:29
  • @rackandboneman Only the receiver must interpret down to ±3V properly; the transmitter must still be able to transmit at least ±5V into a receiver load of 3 kΩ. Later equipment did not use separate supplies but a single transceiver chip with a built-in charge pump to generate the necessary ± voltages from a single 5V or 3.3V supply. – Justme May 07 '21 at 07:57
  • @Justme yes, the MAX232 and variants - which look like an amazing non-cost-cutting measure :) – rackandboneman May 07 '21 at 17:04

2 Answers


There are maybe a few points left out here.

  • RS-232 was first recommended in 1960.

That's long before there were low-voltage devices. In fact, it was originally designed to work with electro-mechanical devices like TTYs, not electronic ones.

  • RS-232 uses bipolar (signed) voltages.

So it's not simply 0V and 12V but +12V and -12V, which is quite useful for driving coils.

  • It does not specify ±12V.

RS-232 does not specify ±12V; it defines ±3..15V as the operating range and forbids any voltage magnitudes below 3V or above 25V.

  • It's a voltage based interface, so more is better.

Unlike a current loop, where the burden of handling loss through resistance falls on the sender (within reason): the current supplied in a current-based interface always reaches the receiver at the same level. Voltage supplied in a voltage-based interface gets diminished by distance, simply because voltage drop grows in proportion to line length. The higher the supplied voltage (within reason), the higher the chance that the voltage at the receiver's end is still above the minimum threshold.

  • The receiver end may be a solenoid.

Yes, it needs to be repeated, as it's the reason why voltage drop cannot be ignored. This interface was supposed to work with electro-mechanical devices, ones where coils are used. In an amplifier-based detector circuit, the resistance of the detector can be made rather high, shifting the voltage loss toward the detector. With a coil this is not possible in the same way, so the mentioned line losses are a major concern.
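To make the coil argument concrete, here is a minimal voltage-divider sketch. All numbers in it are illustrative assumptions (a 200 Ω coil, a 3 kΩ amplifier input, 150 Ω of cable), not values from the standard:

```python
# Illustrative only: model the cable as a series resistance and the
# receiver as a resistive load, then see what survives at the far end.
# All component values are assumptions, not numbers from RS-232.

RX_THRESHOLD = 3.0   # volts: minimum level the receiver must accept
R_LINE = 150.0       # ohms: assumed total resistance of a long, thin cable

def received_voltage(v_drive, r_line, r_load):
    """Voltage divider: cable resistance in series with the load."""
    return v_drive * r_load / (r_line + r_load)

for v_drive in (5.0, 12.0):
    for r_load, kind in ((200.0, "solenoid coil"), (3000.0, "amplifier input")):
        v_rx = received_voltage(v_drive, R_LINE, r_load)
        verdict = "OK" if v_rx >= RX_THRESHOLD else "too low"
        print(f"{v_drive:4.1f}V drive, {kind:15}: {v_rx:5.2f}V at receiver ({verdict})")
```

With these made-up numbers, a 5V drive already arrives below the 3V threshold at the coil, while 12V still arrives comfortably above it; the high-impedance amplifier input is fine either way.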

  • 12V is a convenient value within the 3..15V range.

12V goes toward the high side of what's allowed, leaving plenty of room for drop over the line, while at the same time leaving some room for tolerances toward higher voltages. After all, making a good and stable supply in 1960 took much more effort than today.
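To put rough numbers on that margin (my arithmetic, not anything from the standard): with a 12V drive and the 3V receiver threshold, up to 9V may be lost on the way. In the divider sketch above, 12·R/(R + R_line) ≥ 3 allows a line resistance of up to three times the load resistance, while a 5V drive only tolerates R_line ≤ 2R/3.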

Bottom line: It's a nice ballpark value serving multiple requirements.


The 'strong' reason to use 5V in later (and local) setups is simply to save cost. Why add the need for a 12V PSU when a short distance, like to a printer, works quite fine at 5V as well? Even more so as modern circuitry can deliver a good high impedance, so cable losses are no longer an issue.

Raffzahn
  • "defines +/- 3..15V" — "forbids … below 3V and above 25V" — is this a typo, or was the lower range always 3V while the upper range had different criteria for 15V max vs. 25V max? – natevw May 06 '21 at 20:45
  • @natevw No typo. Two different kinds of illegal. The area below 3V (-3..+3V) is simply not a valid signal, so it should not be detected as a signal. The ±25V rule is a safety margin: while standard-conforming devices should not apply more than ±15V, they must be able to cope with up to ±25V without malfunctioning. – Raffzahn May 06 '21 at 22:00
  • "should not be detected as a signal"; rather, need not be reliably detected as a signal. It may be - the standard does not require the receiver to give any guarantee in this case. – poncho May 06 '21 at 22:06
  • @poncho Well, it's should not (not must not). After all, a receiver circuit wobbling in that area between 0 and 1 is a very bad idea. – Raffzahn May 06 '21 at 22:10
  • Inexpensive and robust 12V solenoids were very common in 1960, and the requirement to survive ±25V had to do with back-EMF from the operation of such solenoids. – jwdonahue May 06 '21 at 22:12
  • @jwdonahue That's exactly the point about being based on electro-mechanical receivers. – Raffzahn May 06 '21 at 22:15
  • "A current supplied in a current based interface will always reach the receiver at the same level." Except that you may need to crank up the voltage heavily to reach acceptable current levels? For a voltage-based circuit, once capacitance has been filled, you will reach your input voltage at the far end. At 9600 baud, I would say that's less energy intensive? (Also, I remember Telex lines at 40V) – David Tonhofer May 06 '21 at 23:19
  • @DavidTonhofer Cranking up the voltage is the whole point of a current-based transmission, isn't it? It's all about delivering the current needed to operate a solenoid at the other end. Not to mention that, in the early days, it was easier (or rather, less hard) to guarantee a current than a voltage: current only needs coils, while voltage (usually) needs amplifiers - expensive high tech before the transistor. – Raffzahn May 06 '21 at 23:39
  • @jwdonahue The back-EMF of most solenoid-like devices will far exceed 25 volts, so even back then I would assume a snubber circuit and/or freewheeling diode would be used. – rackandboneman May 07 '21 at 00:18
  • True, but ~90% of the harmful energy occurs well below the peaks. It's all about thermal breakdown of the insulation on the coil wire. High-current solid-state diodes probably weren't very common in 1960. Perhaps the spec was driven by the high effective response times of commodity snubber circuits? – jwdonahue May 07 '21 at 00:46
  • What were the recommended baud rates at that time? I can hardly think of any electromechanical device that reliably handles pulses shorter than, let's say, 10ms, limiting baud rates to at most 110. Are there asynchronous receivers built from relays? – the busybee May 07 '21 at 06:11
  • The RS-232 standard specifically says that the load must not be inductive, so coils should not be directly driven with the interface signals. Besides, rated receiver impedances are from 3k to 7k ohms, so drivers are not expected to encounter loads outside that range. The driver only has to generate the rated range from ±5V to ±15V when the rated load is connected, so the unloaded driver output is allowed to go up to 25V. The receiver side must properly receive the ±3V to ±15V range, which allows for losses in wiring. The receiver must handle ±25V without damage; it does not need to work at those levels. – Justme May 07 '21 at 06:51
  • If the idea was to use some common and useful level, 48V would have made orders of magnitude more sense, especially in a communications setting. – Maury Markowitz May 07 '21 at 16:27
  • I disagree that RS-232 had any direct connection with (electro)mechanical devices. It was specifically for interfacing a DCE (modem) to a DTE (terminal); if a TTY was involved, there would have been an additional interface box, since these were generally current-loop devices. – Mark Morgan Lloyd May 07 '21 at 16:30
  • Re, "RS-232...defines +/- 3..15V": The current standard is RS-232-F. Before that, there was RS-232, RS-232-A, RS-232-B, C, D, and E. I think I read somewhere that the original standard required ±25V signals. Later revisions narrowed the minimum voltage swing down to ±12V, then ±5V, and most recently, ±3V. – Solomon Slow May 08 '21 at 00:29
  • Re, "...burden of handling loss...": But RS-232 never was meant for long-haul communication. I'm not saying it never was used that way, but what it was meant for was interfacing Data Terminal Equipment (i.e., a computer or a computer terminal) to Data Circuit-terminating Equipment (i.e., a modem). The modem and the DTE were supposed to be in the same building, if not in the same room as each other. The long-haul part of the link was supposed to be from one modem to another. – Solomon Slow May 08 '21 at 00:37
  • @SolomonSlow Tell that to all the sites with extensive in-house cabling for terminals and printers - which was quite well part of the spec. RS-232 was intended to replace (next to) all TTY-based connections. – Raffzahn May 08 '21 at 02:13
  • @SolomonSlow Do you have something to back up how those voltage levels differed between different RS-232 standards? From what I have gathered, all versions of the standard from -C to -F have exactly the same voltage levels defined for receivers and transmitters. – Justme May 08 '21 at 10:56
  • @SolomonSlow +/-25V was always the maximum voltage to be expected. – Raffzahn May 08 '21 at 11:17

The question is posed with the assumption that RS-232 was always ±12V.

Unfortunately, the short answer is, it was not always ±12V.

The specification requires that an RS-232 driver output a minimum of ±5V and a maximum of ±15V when connected to the specified load of one RS-232 receiver. The output is allowed to reach up to ±25V when no receiver is connected. The specified receiver load is within the range of 3k to 7k ohms, so very little current is needed. The receiver must operate properly with ±3V signals, to allow for noise margin, voltage drop in wires, and ground voltage difference, and the receiver must not be damaged by ±25V signals being applied.
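As a quick sanity check of those figures (the limits are the ones quoted above; the arithmetic itself is just illustrative):

```python
# Electrical limits as quoted in the text above.
DRIVER_MIN_V, DRIVER_MAX_V = 5.0, 15.0       # loaded driver output magnitude
RX_LOAD_MIN_OHM, RX_LOAD_MAX_OHM = 3e3, 7e3  # receiver input resistance
RX_THRESHOLD_V = 3.0                         # receiver must accept down to this
ABS_MAX_V = 25.0                             # must survive, need not operate

# Worst case: highest legal driver voltage into the lowest legal load.
i_max = DRIVER_MAX_V / RX_LOAD_MIN_OHM
print(f"worst-case driver current: {i_max * 1000:.1f} mA")  # -> 5.0 mA

# Loss budget between the weakest legal driver and the receiver threshold.
print(f"margin for wiring losses and ground shift: "
      f"{DRIVER_MIN_V - RX_THRESHOLD_V:.0f} V")             # -> 2 V
```

Five milliamps into kilohm-range loads is why the text says very little current is needed.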

This allows the RS-232 driver to operate with a wide range of voltages and to be implemented in any way that stays within all the electrical parameters listed in the specification. Back in the 1960s, when the standard was first written, the interface could have been implemented in various different ways using discrete components instead of integrated circuits.

I suspect the misconception that it was only ±12V may come from the fact that many common devices most people are familiar with, like the IBM PC in the '80s and its compatible successors, did have ±12V supply voltages available, which were used directly for the RS-232 transceivers when those required separate supplies.

Many devices simply used whatever supplies they happened to have available; the TI/99 PHP1700 RS-232 Sidecar Interface, for example, used ±8V, so it definitely was not ±12V.

Later on, when technology advanced and laptops and desktop motherboards started to get RS-232 ports integrated onto the motherboard, at some point they started to use charge-pump based RS-232 transceivers, which worked from a single 5V or 3.3V supply because they generated the driver voltages internally. These charge-pump based devices also did not use ±12V, but perhaps something between ±7V and ±9V. The details are in the respective datasheets, such as the MAX202, MAX232, MAX3232, etc., or compatible clones from other manufacturers.
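For intuition, here is a heavily idealized model of what such a charge-pump transceiver does internally: one capacitor stage roughly doubles the supply, a second inverts it, and each stage loses a little voltage to its switches. The 0.7V per-stage drop below is an assumed illustrative figure, not a value from any MAX232-family datasheet:

```python
# Idealized charge pump: a voltage doubler followed by an inverter.
DROP_PER_STAGE = 0.7  # volts lost per pump stage (assumption, not a datasheet value)

def charge_pump_rails(vcc):
    """Approximate (V+, V-) rails generated from a single supply voltage."""
    v_plus = 2 * vcc - DROP_PER_STAGE     # doubler stacks Vcc on top of Vcc
    v_minus = -(v_plus - DROP_PER_STAGE)  # inverter mirrors V+ below ground
    return v_plus, v_minus

for vcc in (5.0, 3.3):
    v_plus, v_minus = charge_pump_rails(vcc)
    print(f"Vcc = {vcc}V  ->  roughly {v_plus:+.1f}V / {v_minus:+.1f}V")
```

With a 5V supply this lands around ±8..9V, consistent with the "±7V to ±9V" ballpark above, and still comfortably inside the ±5..15V driver range.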

The RS-232 standard also specifically mentions that the RS-232 interface inputs must not be inductive, so the interface signals may not be used directly for driving electromechanical coils like relays or solenoids; as per the specification, not much power can be drawn from a driver for driving loads anyway.

Justme