57

Nowadays each graphics card has a driver in the operating system that implements a (typically) standard API such as OpenGL, so programmers use standardized API calls to tell the graphics card how and what to render. (Actually, even that is already fairly low-level; most programmers use a game engine that does this for them.)

In the days of old computers, how was this done? Did every programmer of every game implement all the various APIs that old graphics cards supported? Or did game studios from MS-DOS times have their own "game engines" that provided some abstraction over these graphics cards?

I remember there were many different card vendors, and I remember old games asking me which one I had, so I suppose these games contained code/drivers for all of those cards?

Petr
  • Yep. There was no DirectX or OpenGL, so what we did was implement the algorithms on our own. I still remember doing BSP and z-buffering myself: you go and read the algorithm spec, understand how it works and what it does, and you code it. More hardcore: you don't have a spec and you come up with your own algorithm. And then you create a full map of a frame and write it directly to the card's RAM. – Alma Do Jun 04 '19 at 12:14
  • Wow... this brings back memories. Most graphics cards could fall back to the CGA standard; Hercules, Tandy and the PCjr, and EGA were other common choices. For simple games that didn't have configuration files, the game would ask you for your graphics card (and later your sound card) when you started it. Hardware detection was nonstandard or nonexistent. The VESA standards came along to try to standardize the graphics modes, but there still wasn't much support for APIs in DOS days. Even in the early Windows and OS/2 days, the APIs left a lot to be desired in terms of standardization. – GuitarPicker Jun 04 '19 at 12:58
  • Here's a recent take on this old problem: https://youtu.be/szhv6fwx7GY. It's about the making of "Planet X3", a newly released old-school DOS game. – Brian H Jun 04 '19 at 21:02
  • There is a GDC talk on YouTube about palette cycling, which was a clever and very efficient way of faking animation. Helpfully, someone implemented a web version which demonstrates some of the mind-blowing results. – TemporalWolf Jun 04 '19 at 22:25
  • Bear in mind, you didn't support individual graphics cards so much as you supported a graphics standard -- like CGA, EGA, VGA, Tandy, etc. -- which was often based on a particular standard-setting card, like the IBM Color Graphics Adapter or the chipset on a Tandy 1000. The only applications that cared about anything more than that were ones that targeted certain chipsets, like S3 or Tseng or Mach. But except for the OEMs that made their own cards, even those chips were used in many different graphics cards. Compatibility was a hardware problem back then. – SirNickity Jun 04 '19 at 23:35
  • @TemporalWolf Thank you for sharing that link. Palette cycling is amazing. – IronEagle Jun 07 '19 at 04:14

7 Answers

78

Did every programmer of every game implement all the various APIs that old graphics cards supported?

Yes - but it went even deeper than that. Early graphics cards had virtually no callable code associated with them at all; the concept of "drivers" had not quite become a reality yet. There was the concept of a Video BIOS (extensions to the INT 10h BIOS video services), but it was effectively limited to initialization and switching video modes.

Instead, graphics cards, at least in DOS land, all had memory-mapped display RAM, and extensive documentation was available about exactly how setting various bits in display RAM would affect the pixels that appeared on the screen. There were no drawing APIs to call; if you wanted something to appear on the screen (whether a pixel, a character, a line, a circle, a sprite, etc.), you wrote the code to move the bytes into the right places in display RAM yourself. Entire books were written about how to write efficient code to draw graphics.
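
For illustration, a minimal sketch of that kind of direct access, assuming a Borland-style real-mode DOS compiler (Turbo C) and the VGA 320x200 256-color mode 13h described elsewhere on this page:

#include <dos.h>    /* int86(), MK_FP(), union REGS -- Borland-style real-mode compiler assumed */
#include <conio.h>  /* getch() */

/* In mode 13h the screen is 320x200 with one byte per pixel, mapped at A000:0000. */
static unsigned char far *vram;

static void set_mode(unsigned char mode)
{
    union REGS r;
    r.h.ah = 0x00;                  /* INT 10h function 00h: set video mode */
    r.h.al = mode;
    int86(0x10, &r, &r);
}

static void put_pixel(int x, int y, unsigned char color)
{
    vram[(unsigned int)y * 320 + x] = color;  /* no API call: just store a byte in display RAM */
}

int main(void)
{
    int x;

    vram = (unsigned char far *)MK_FP(0xA000, 0);
    set_mode(0x13);                 /* 320x200, 256 colors */

    for (x = 0; x < 320; x++)       /* a horizontal line, drawn byte by byte */
        put_pixel(x, 100, 15);      /* color 15 = white in the default palette */

    getch();                        /* wait for a key */
    set_mode(0x03);                 /* back to 80x25 text mode */
    return 0;
}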

There were some systems, like the Borland Graphics Interface (BGI), that abstracted graphics drawing primitives into an API with different drivers that one could call to draw things on different graphics cards. However, these were typically too slow for action-type games.

An action game would typically be optimized for a particular graphics display mode on a particular card. For example, a popular display mode was VGA 640x480 with 16 colors. This would be listed in the software requirements, and you needed to have the right hardware to run the game. If you bought a VGA game but only had an EGA card, the game would not work at all. As you said, some games would ask what you had, and you had to know the right answer; otherwise the game would not work.

Greg Hewgill
  • I wrote a few simple action games with BGI in the early 90s. It was OK for single-screen affairs with a few sprites moving around - like Space Invaders or Pac-Man - but was well short of what you'd need for full-screen scrolling or 3D ray-casting games. – Matthew Barber Jun 03 '19 at 23:37
  • Strictly speaking there was an API, BIOS service 10h, but you wouldn't use it for (most) games! But there certainly was callable code associated with graphics cards (or, in the case of MDA and CGA, provided by the system BIOS). – Stephen Kitt Jun 04 '19 at 06:10
  • @MatthewBarber I played with BGI, but simple sprites could even be done in QuickBasic (your example of Space Invaders reminded me, as I wrote a version in QB) – Chris H Jun 04 '19 at 08:57
  • Worth noting that sound cards were in the same situation - games had to directly support AdLib/OPL2 cards, Creative C/MS cards, SoundBlaster, MT-32, etc. Each added card needed custom code written for the specific hardware in question. Printers were like this for a while also - word processors needed to write hardware code to support specific printers directly; there were no drivers or APIs, no abstraction, just bare metal. – J... Jun 04 '19 at 15:24
  • Nothing was wrong with BGI that I know of. I never could blit images fast enough; setpixel() was not the right way, and I never found what the right way was because I didn't know what keywords to search for. – Joshua Jun 04 '19 at 18:27
  • Remember Mode X? 320x200 with a 256-color palette? It could pan and scroll through, IIRC, 4 times the viewport size; many a great game was written and optimized for this. Most commands involved MOV AL,n and MOV AH,m and then INT 10h; those calls were all over the place. – dlatikay Jun 04 '19 at 20:08
  • @J... I was under the impression that you could just dump bare characters into the parallel port and the printer would then dump them onto paper. – John Dvorak Jun 04 '19 at 20:12
  • @JohnDvorak As long as you were satisfied with only printing text, and only in the printer's built-in font... – Jun 04 '19 at 20:33
  • I worked on Tempest-certified PC graphics cards supplied to the US government. (Tempest mitigated intelligence gathering from radiated signals.) My task was to reverse-engineer the graphics subsystems of common apps of the time (Lotus 1-2-3, Framework, Symphony, etc.) and patch them to be compatible with our cards. That was the first software reverse engineering I ever did, though I'd been hacking hardware since I was a child. There was a time I had the memory addressing and register functions of the CGA, MDA, and Hercules cards all memorized. Thank the gods that those days are past :) – Taryn Jun 04 '19 at 20:45
  • To address all the comments about the BIOS video calls, yes, the BIOS did have a video service of sorts. But it was extremely inefficient and was completely unsuitable for almost anything that one would call a "game". Perhaps its most useful service was the initial call to set up the desired graphics mode, and then the game would never call it again. Anybody who did any serious graphics work would interact directly with the display hardware. – Greg Hewgill Jun 04 '19 at 20:46
  • There were various TSR programs that would intercept changes to graphics memory for a particular kind of card and convert it to output for a different kind of card. There was a small performance hit, but simcga and (one other but the name escapes me) were my friends back when the only system I had was built with Hercules monochrome graphics and an amber-only monitor. – Perkins Jun 05 '19 at 17:44
  • @Taryn I wish I could forget a bunch of that stuff and free up my neurons for other things. Do I REALLY need to remember that the 8x8 BIOS font is stored at F000:FA6E or that the hardware VGA palette was set by sending the index to port 3c8 and the 6-bit RGB values to 3c9? – fluffy Jun 06 '19 at 06:43
  • @Greg I don’t think anyone is disputing the core of your answer, or insisting that a typical game involving graphics would rely on INT 10h services (only). I, like many others, upvoted your answer. However, your statement that “Early graphics cards had no callable code associated with them at all” is false. (While I’m at it, you could perhaps mention that some level of register programming was necessary in many cases, not just flinging pixels at a memory-mapped framebuffer or planes.) – Stephen Kitt Jun 06 '19 at 07:13
  • @fluffy LOL, I am lucky(?) that this stuff escapes me all too quickly, to the point that languages/libraries I've used my whole life still require constant reference, e.g. file I/O in stdlib :( – Taryn Jun 06 '19 at 15:28
  • @StephenKitt: All of the code necessary to configure early graphics cards was included within the motherboard BIOS. The EGA was the first card to include code to extend the BIOS -- in significant measure because some of its registers needed to be set in ways the pre-existing motherboard BIOS would know nothing about. – supercat Jun 06 '19 at 18:45
  • @StephenKitt: I guess the question would be what is meant by "associated with". An EGA or VGA card will have code stored on a ROM within the card, which will of course only be accessible on a system where the card is installed. Early display cards had no such thing. The PC BIOS can perform a sequence of register writes which will usefully configure a CGA or MDA card. Some cards may include hardware to translate writes of CGA registers into those required by the card (the AT&T 6300 PC does that, and I think some double-scan display cards do as well), but that doesn't mean... – supercat Jun 06 '19 at 19:13
  • ...the BIOS includes code 'associated with' such cards. – supercat Jun 06 '19 at 19:13
  • @supercat yes, it does all depend on the meaning of “associated with”. My take on it is that the init code in the IBM PC BIOS, and its INT 10h services, only serve a purpose with the corresponding card (or even an HGC), so the code is associated with the cards even though it lives in ROM on the motherboard. – Stephen Kitt Jun 06 '19 at 19:31
  • HGC incidentally being one example without full INT 10h support... – Stephen Kitt Jun 06 '19 at 19:45
  • Another such early graphics interface was GSX, by DRI (the CP/M guys). Drivers were available for Hercules, Tandy, CGA and EGA (and I hacked up a VGA version out of the EGA driver), along with a multitude of dot matrix printers. – Erik Knowles Aug 19 '20 at 16:03
29

Early on, you had to explicitly code your game for each graphics card you wanted to support: Hercules, CGA, Tandy, EGA, VGA. You had to know how to put the card into graphics mode and you had to know the memory layout, palette, and so on. You had to figure out how to avoid flicker and how to prevent tearing. You had to write your own line drawing and fill routines, and if you wanted 3-D, you had to know how to project it onto a 2-D screen, how to remove hidden lines and so on.
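
For a flavour of what "write your own line drawing routines" meant, here is a minimal integer-only Bresenham sketch; put_pixel() is assumed to be whatever routine pokes a byte into the current mode's display RAM (as in the mode 13h sketch earlier on this page):

/* Integer-only Bresenham line drawing -- the sort of primitive you wrote yourself.
   put_pixel() is assumed to be defined elsewhere (e.g. the mode 13h sketch above). */
void put_pixel(int x, int y, unsigned char color);

void draw_line(int x0, int y0, int x1, int y1, unsigned char color)
{
    int dx = (x1 > x0) ? x1 - x0 : x0 - x1;   /* |x1 - x0| */
    int dy = (y1 > y0) ? y1 - y0 : y0 - y1;   /* |y1 - y0| */
    int sx = (x0 < x1) ? 1 : -1;              /* step direction in x */
    int sy = (y0 < y1) ? 1 : -1;              /* step direction in y */
    int err = dx - dy;

    for (;;) {
        int e2;

        put_pixel(x0, y0, color);
        if (x0 == x1 && y0 == y1)
            break;
        e2 = 2 * err;
        if (e2 > -dy) { err -= dy; x0 += sx; }   /* step in x */
        if (e2 <  dx) { err += dx; y0 += sy; }   /* step in y */
    }
}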

Later when graphics cards started gaining accelerated functionality, SGI created the IrisGL API, which later became OpenGL, in order to simplify CAD (computer aided drafting/design) software development for those graphics cards by providing a standard API that video hardware manufacturers could support and developers could design their software against. OpenGL provided access to underlying hardware features, and if a feature did not exist on the hardware, OpenGL provided a software implementation.

The same problem existed in games development. Originally, graphics card manufacturers would work with game studios to create a version (release) of their game that would be accelerated by the specific graphics card. Early 3D games like MechWarrior 2 had different releases for 3dfx Voodoo, S3 Virge, STB Velocity, and so on. It was a bit of a mess. At around the same time, Microsoft created the DirectX library for Windows which was similar to OpenGL. Manufacturers would support OpenGL and/or DirectX, and game studios who were brave enough to abandon DOS for Windows as a platform could then program for one or both libraries instead of creating another release for each individual graphics card they wanted to support. Any relatively minor differences between graphics cards could be handled at runtime in the same release.

snips-n-snails
  • Compatibility with a particular graphics card by third-party vendors was often not a 100% proposition either, particularly if programs decided to go spelunking into a card's I/O registers or use undocumented video modes (see this RC question). Some of my early games refused to work at all in EGA modes on my blazing-fast 12MHz 286 with an ATI EGA card. – ErikF Jun 04 '19 at 04:24
  • Also, before 3D cards there were so-called "Windows accelerator" cards that improved 2D drawing speed, especially once Windows 3.x became mainstream. – mnem Jun 04 '19 at 06:38
  • "SGI created OpenGL to simplify game programming" is somewhat misleading. OpenGL was created as an API for professional 3D graphics, such as CAD. Only later was it adopted for video game development. – IMil Jun 04 '19 at 14:38
  • Also, OpenGL came some years before Direct3D, from an even older predecessor, IRIS GL. – BlackJack Jun 04 '19 at 15:26
  • @ErikF Similarly, I had an authentic CGA card that supported modes that later EGA and VGA cards with supposed backwards compatibility did not support. – snips-n-snails Jun 04 '19 at 18:11
  • @BlackJack It's not that simple - OpenGL was started before Direct3D. Microsoft was one of the companies on the OpenGL board. They eventually decided to do their own because... the board couldn't ever decide anything and wouldn't have anything ready for MS to ship with Windows 95. It's quite likely that if their hands weren't forced by MS jumping ship (and "creating" the first real competition), they'd still be debating OpenGL 1.0 to this day :D By the time OpenGL and Direct3D shipped, there was even more competition from Glide, which OpenGL couldn't really keep up with. – Luaan Sep 05 '21 at 11:26
  • @Luaan, it's a bit more complicated than that... https://www.chrishecker.com/images/3/33/Gdmogl.pdf http://www.chrishecker.com/OpenGL/Press_Release – Arthur Kalliokoski Jan 26 '23 at 15:14
  • @ArthurKalliokoski We're already way out of the scope of "How did MS-DOS games use graphics cards", I don't think it's particularly helpful to extend this even further into the API wars :D And sure, much of this was ultimately built on clueless marketing people who were only interested in selling their product. Pointless politics too - it's not like MS didn't supply their own OpenGL drivers (better than anything else for the PC); the complaints were really less "support OpenGL because it's better" and a lot more "stop developing Direct3D, it offends me" (as much as I like Carmack's work...) – Luaan Jan 27 '23 at 06:26
25

In DOS you had direct access to the hardware; so you grabbed a good source of information about the card you wanted to support, and got down to coding your routines.

A book that was often cited as a good source was "Programmer's Guide to the EGA, VGA, and Super VGA Cards" by Richard F. Ferraro; I never had the luck to own or read it, but it was fondly remembered by those who did.

Another invaluable source of information was Ralf Brown's Interrupt List; you can find an HTML conversion of the list here: http://www.delorie.com/djgpp/doc/rbinter/

The original was just a set of (long) text files, and, if memory serves me correctly, there were some programs to navigate it more easily, at least for the later versions.

Another nice collection of information was the "PC Game Programmer's Encyclopedia", or PC-GPE; an HTML conversion can be found here: http://qzx.com/pc-gpe/

You had at least three different ways to interact with a given piece of hardware: I/O ports, interrupts, and memory-mapped registers. Graphics cards used all three of them.
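
A tiny sketch to make the three mechanisms concrete, assuming a Borland-style DOS compiler; each step touches the VGA through a different path:

#include <dos.h>    /* int86(), MK_FP(), union REGS */
#include <conio.h>  /* outp() -- the location of this header varies between DOS compilers */

void three_ways(void)
{
    union REGS r;
    unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);

    /* 1. Interrupt: ask the video BIOS to switch to mode 13h (INT 10h, AH=00h, AL=13h). */
    r.x.ax = 0x0013;
    int86(0x10, &r, &r);

    /* 2. I/O ports: reprogram the DAC so palette entry 1 becomes bright red
          (write the index to port 3C8h, then three 6-bit R,G,B values to 3C9h). */
    outp(0x3C8, 1);
    outp(0x3C9, 63);
    outp(0x3C9, 0);
    outp(0x3C9, 0);

    /* 3. Memory-mapped display RAM: plot a pixel that uses that palette entry. */
    vram[100 * 320 + 160] = 1;
}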

The situation with audio cards was very similar.

Another thing to consider is that attached to the video card was an analog CRT monitor. The older/cheaper ones were only able to sync to a given set of vertical and horizontal rates, while the newer/better ones were basically able to sync to any signal in a given range. That meant that with the right parameters written to the video card registers, you could create some custom (or weird) resolutions.

Games aimed for broad compatibility, so they rarely used weird ones, while in the demoscene it was quite common (and custom resolutions were the norm in arcade games too.)

But, for example, Mode X was very popular with games!

It was popularized by Michael Abrash in the pages of Dr. Dobb's Journal; you got a 320x240 resolution which, viewed on a 4:3 monitor, meant the pixels were square. So, for example, you could naively draw circles and they would look like circles; in 320x200 they came out stretched, as the pixel aspect ratio was not 1:1, and you had to calculate and compensate for that while drawing.

It was a planar mode, so by setting a register you could decide which planes would receive a write to the memory-mapped area. For example, for a quick fill operation you would enable all planes, and a single byte write would affect four pixels (one per plane). That also helped to address all 256 KB of the VGA memory using only a 64 KB segment.
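
A minimal sketch of that plane-select trick, assuming the card is already in an unchained, Mode X-style mode: the Sequencer's Map Mask register is reached through index port 3C4h and data port 3C5h, and with all four planes enabled a single byte store paints four pixels:

#include <dos.h>    /* MK_FP() */
#include <conio.h>  /* outp() */

#define SEQ_INDEX 0x3C4   /* VGA Sequencer index port */
#define SEQ_DATA  0x3C5   /* VGA Sequencer data port  */
#define MAP_MASK  0x02    /* Sequencer register 2: Map Mask */

/* Enable writes to the planes in 'mask' (bit 0 = plane 0 ... bit 3 = plane 3). */
static void set_write_planes(unsigned char mask)
{
    outp(SEQ_INDEX, MAP_MASK);
    outp(SEQ_DATA, mask);
}

/* Quick clear of an unchained 320x240 screen: 19200 bytes per plane, and with
   all four planes enabled each byte store fills the same offset in all of them. */
void modex_clear(unsigned char colour)
{
    unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
    unsigned int i;

    set_write_planes(0x0F);           /* all four planes at once */
    for (i = 0; i < 19200u; i++)      /* 320 * 240 / 4 bytes per plane */
        vram[i] = colour;             /* one write = four pixels */
}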

I am positive there was a little utility which let you explore the VGA registers, where you could put whatever values you fancied, and, when you applied your settings, you could finally see if your monitor supported the resulting output. But my memory is too weak right now to remember the name or the author of this program.

Another common trick was to change a part of the color palette during the horizontal retrace; done correctly, you could have more than 256 colours on screen. There was not enough time to change the whole palette on each line, so you had to be creative.

(During the vertical retrace, by contrast, there was enough time to change every colour; this was used, for example, for fade-in/fade-out effects.)
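
As an illustration of these retrace-timed palette tricks (a rough sketch, not production code): bit 3 of the VGA input status register at port 3DAh is set while the vertical retrace is in progress, and the DAC is programmed through ports 3C8h/3C9h, so a crude fade-to-black can be written like this:

#include <conio.h>  /* inp(), outp() */

#define INPUT_STATUS 0x3DA   /* bit 3 = vertical retrace in progress */
#define DAC_WRITE    0x3C8   /* DAC write index */
#define DAC_DATA     0x3C9   /* DAC data (6-bit R, G, B; index auto-increments) */

static void wait_vretrace(void)
{
    while (inp(INPUT_STATUS) & 0x08)   ;  /* wait for any current retrace to end */
    while (!(inp(INPUT_STATUS) & 0x08));  /* then wait for the next one to begin */
}

/* Fade the whole 256-colour palette to black over 64 frames.
   'palette' holds 256 RGB triplets with 6-bit components (0..63). */
void fade_out(const unsigned char palette[256][3])
{
    int step, i, c;

    for (step = 63; step >= 0; step--) {
        wait_vretrace();                /* only touch the DAC during the retrace */
        outp(DAC_WRITE, 0);             /* start at colour index 0 */
        for (i = 0; i < 256; i++)
            for (c = 0; c < 3; c++)
                outp(DAC_DATA, (palette[i][c] * step) / 63);
    }
}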

(The most popular palette trick was probably changing the background color during tape loading on 8-bit machines, the C64 for example.)

One thing that is often overlooked is that the VGA card was effectively a small three-channel DAC; creative people found ways to use and abuse that as well.

To a similar effect, Tempest for Eliza used the radio waves emitted by the monitor to transmit a radio signal which could be listened to with a common AM radio.

Whoa! This was a nice trip down memory lane! :)

RenPic
  • I think it's the most complete of the answers as of now. Consider expanding it with "scrolling" (via I/O, not the slow INT 10h), as I think it's as relevant to the OP's question "to tell graphics cards how and what they want to render" as the palettes. And welcome to the site :) – kubanczyk Jun 05 '19 at 09:33
  • Another well-known document compilation is vgadoc, which can be found by searching for vgadoc4b. – ninjalj Jun 08 '19 at 01:22
  • The background trick during tape load was because video memory was used to accelerate the loads :) – Michael Dorgan Aug 20 '20 at 17:39
17

In the DOS world, in the golden age of VGA (the early to mid '90s), by far the easiest and most popular way to do graphics was the famous mode 13h, a 320x200-pixel linear 256-color paletted video mode. The closest you got to a standard video API was BIOS interrupt 10h, giving access to a handful of functions including switching the video mode and configuring the palette. A resolution of 320x200 (instead of 320x240, which has a 4:3 aspect ratio) was very convenient because the required video memory (64000 bytes) fit in a single 64 kB memory segment, making addressing individual pixels very straightforward. The drawback was that pixels were slightly rectangular instead of perfectly square.

The video memory was mapped at segment A000h; in the real-mode environment of DOS you could simply form a pointer to that segment and treat it as a 64000-byte array, each byte corresponding to one pixel. Set a byte to a new value, and a pixel changes color on the screen. Beyond that, implementing any higher-level drawing functionality was up to you or a third-party library.
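
A common pattern built on that flat layout was drawing each frame into an off-screen buffer and copying it to the card in one go; a minimal sketch, assuming a Borland-style compiler and a memory model that allows a 64000-byte far buffer:

#include <dos.h>    /* MK_FP() */

#define SCREEN_BYTES 64000u                 /* 320 * 200, one byte per pixel */

static unsigned char far *screen;           /* the real display RAM at A000:0000 */
static unsigned char far backbuffer[SCREEN_BYTES];  /* draw the frame here first */

void init_video_memory(void)
{
    screen = (unsigned char far *)MK_FP(0xA000, 0);
}

void present_frame(void)
{
    unsigned int i;

    /* Copy the finished frame into display RAM in one pass; real games usually
       replaced this C loop with a hand-tuned REP MOVSW in assembly. */
    for (i = 0; i < SCREEN_BYTES; i++)
        screen[i] = backbuffer[i];
}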

For accessing higher-resolution and/or higher color depth modes afforded by SVGA and later video cards, there was (and still is) VBE, or VESA BIOS Extensions, a standard API exposed by the video card's BIOS. But beyond switching the video mode, setting the palette (in paletted modes), and getting a frame buffer into which to plot pixels, you were still pretty much on your own.
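
A minimal sketch of how thin VBE was: setting an SVGA mode is a single INT 10h call, function 4F02h, after which you are back to plotting pixels yourself (mode 101h, 640x480 in 256 colors, is used here purely as an example):

#include <dos.h>    /* int86(), union REGS */

/* Ask the VESA BIOS Extensions to switch video modes.
   Returns 1 on success (VBE reports AX = 004Fh), 0 otherwise. */
int vbe_set_mode(unsigned int mode)
{
    union REGS r;

    r.x.ax = 0x4F02;      /* VBE function 02h: set video mode  */
    r.x.bx = mode;        /* e.g. 0x101 = 640x480, 256 colors  */
    int86(0x10, &r, &r);

    return r.x.ax == 0x004F;
}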

JohannesD
  • In theory the INT 10h API did include writing pixels. I believe there was even a DOS device driver available which added this API for older graphics cards that didn't support it. It was just about fast enough to animate a few hundred pixels for a graph, but wouldn't have been used by a game. – Neil Jun 05 '19 at 12:56
6

Early PC computing was the age of banging hardware directly. Your (typically assembler) code wrote directly to control registers and video memory.

As far as supporting multiple video cards went: real easy, we didn't. The supported hardware was stated right on the spine of the game box. You needed to have that hardware. This wasn't hard; generally nobody expected games to run on an MDA (Monochrome Display Adapter, no G for graphics since it had none). That left CGA and, later, EGA.


It wasn't that we didn't want APIs. I even tried to write APIs. But this was an age when you were counting processor clock cycles. You had to get it done in the cycles available.

  • You couldn't afford the cycles for the subroutine call and return!
  • Let alone all the parameter passing (moving parameters into the standard locations the subroutine wants).
  • Or the de-referencing needed for every iteration of the operation, since you have to use a memory location specified in a call parameter, instead of being able to hard-code it.

And mind you, those limitations applied to my personal API, optimized for that one game. If a third party wrote one all-singing, all-dancing API intended as the one API for all applications, then much more would have to be abstracted, and the overhead above would be much worse.

That's less of a problem today, because the graphics are so large and complex that API overhead is a smaller fraction of the work. Also, there is CPU to throw at the problem; you can just let the playtesters bump the minimum CPU requirement. We couldn't do that, because most of the market was 4.77 MHz 8088s.

3

Like any other hardware, a video card has addresses in I/O space and in memory space. It is physically connected to the bus (the ISA bus back in the 1980s). When the CPU writes to one of those memory addresses, the video card responds and accepts the data; the same thing happens when the CPU writes to one of its I/O ports.

That means software can access the card as long as it knows the card's memory addresses and I/O addresses.

Accessing hardware mapped to a memory address and an I/O port, in assembly:

MOV [SOME_ADDR], AX   ; write the value of AX to the memory-mapped address SOME_ADDR
MOV DX, SOME_PORT     ; the I/O port number goes in DX
OUT DX, AX            ; write the value of AX to that I/O port

The same in C:

volatile unsigned char far *data = (unsigned char far *)SOME_ADDR;
data[0] = 1;          /* write to the memory-mapped address */
outp(SOME_PORT, 1);   /* write to the I/O port */

IBM PC compatible computers had several types of cards:

  • Monochrome Display Adapter (MDA)
  • Hercules
  • CGA
  • EGA
  • VGA

Each card had its own documented standard. Most of them were backward compatible (e.g. VGA can emulate CGA). VGA was the most complicated standard; there were huge books about it! The standard describes which addresses you should use to access the video card and which data you should write to it to show something on the CRT monitor.

So, first you need to find out which card you have (you can try to read this from the memory area filled in by the BIOS, or ask the user). Then you use the standard to talk to the card.
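
A rough sketch of one common detection sequence (a simplification; it ignores Hercules, for instance): INT 10h function 1A00h answers on VGA-class BIOSes, INT 10h AH=12h with BL=10h answers on an EGA, and bits 4-5 of the INT 11h equipment word identify a monochrome (MDA-style) setup:

#include <dos.h>    /* int86(), union REGS */

enum video_card { CARD_MDA, CARD_CGA, CARD_EGA, CARD_VGA };

enum video_card detect_card(void)
{
    union REGS r;

    /* VGA and later: INT 10h AX=1A00h, "read display combination code".
       A VGA-class BIOS returns AL=1Ah. */
    r.x.ax = 0x1A00;
    int86(0x10, &r, &r);
    if (r.h.al == 0x1A)
        return CARD_VGA;

    /* EGA: INT 10h AH=12h, BL=10h, "return EGA information".
       An EGA BIOS changes BL; with no EGA present, BL comes back unchanged. */
    r.h.ah = 0x12;
    r.h.bl = 0x10;
    int86(0x10, &r, &r);
    if (r.h.bl != 0x10)
        return CARD_EGA;

    /* INT 11h equipment list: bits 4-5 set to 11b mean an 80x25 monochrome adapter. */
    int86(0x11, &r, &r);
    if ((r.x.ax & 0x0030) == 0x0030)
        return CARD_MDA;

    return CARD_CGA;   /* crude fallback -- real games often just asked the user */
}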

The VGA, for example, had a lot of internal registers. Developers wrote a value to an I/O port to select a register, then wrote the data for that register to a companion data port.

The card's memory was mapped, so you simply wrote data to some address (in some modes, some cards had several pages of memory that you could switch between).

But the memory was not always a simple flat array. There was a character mode (in which each pair of bytes represents a letter and its attributes, such as color and background color). There was mode 13h, where each byte represented the color of one pixel. And there were modes with several planes to speed up the card (see https://en.wikipedia.org/wiki/Planar_(computer_graphics)).
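
For the character mode, a minimal sketch: color text-mode memory sits at B800:0000, and each cell is a character byte followed by an attribute byte (low nibble = foreground color, high nibble = background/blink):

#include <dos.h>    /* MK_FP() */

/* Write a string directly into 80x25 color text-mode memory at B800:0000. */
void put_text(int row, int col, const char *s, unsigned char attr)
{
    unsigned char far *cell =
        (unsigned char far *)MK_FP(0xB800, (row * 80 + col) * 2);

    while (*s) {
        cell[0] = (unsigned char)*s++;  /* character code  */
        cell[1] = attr;                 /* color attribute */
        cell += 2;
    }
}

/* Example: put_text(0, 0, "Hello, direct video!", 0x1F); -- bright white on blue. */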

Video programming was not easy! Some articles to read:

There was also a high-level BIOS API, but it was too slow to be used by games.
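
For completeness, this is roughly what that slow BIOS path looked like (a rough sketch): INT 10h function 0Ch writes a single pixel per call, which is exactly why nobody used it in a game's inner loop:

#include <dos.h>    /* int86(), union REGS */

/* Plot one pixel through the video BIOS: INT 10h, AH=0Ch ("write graphics pixel").
   Far slower than poking display RAM directly, hence unusable for games. */
void bios_put_pixel(int x, int y, unsigned char color)
{
    union REGS r;

    r.h.ah = 0x0C;     /* write graphics pixel */
    r.h.al = color;    /* color value          */
    r.h.bh = 0;        /* display page 0       */
    r.x.cx = x;        /* column               */
    r.x.dx = y;        /* row                  */
    int86(0x10, &r, &r);
}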

You may ask: "But how do I render 3D with all of that?" The answer is: you can't.

In the '80s and early '90s you had to render everything on the CPU and then write the resulting 2D image to the video card.

I really suggest you read the book about how they did it for Wolfenstein 3D:

http://fabiensanglard.net/gebbwolf3d/

One of the first consumer video cards to offer a 3D API was the 3dfx Voodoo. Its API was called "Glide".

user996142
  • Hercules was a bit special regarding compatibility. It was the MDA -> CGA -> EGA -> VGA line that was backward compatible. Hercules was directly backward compatible with MDA, and with CGA through emulation, but I don't think any of the later cards (EGA/VGA) were able to emulate Hercules. – Artur Biesiadowski Jun 06 '19 at 14:05
0

Yes, pretty much. Those who needed to access the video hardware beyond what the ROM BIOS and MS-DOS provided had to supply their own driver code for any devices beyond the standard ones.

So all those drivers, or different versions of the game, sat in the directory on the floppy or HDD. There were different strategies for managing that. One was to ship multiple programs: the setup or loader program would ask you questions and then either create a batch file that launched the appropriate main EXE, or keep the various program versions backed up and overwrite the main file. Another was to ask at the beginning, keep different drivers or libraries in the directory, and have the program use the appropriate external files.

And newer standards were backward compatible. For instance, VGA cards offered CGA compatibility, so the analog outputs were made to produce the same intensities as the various CGA colors; of course, there was also MDA mode. Also, basic capabilities were defined by CMOS settings or jumpers, so programmers could read a "register" (perhaps in the BIOS data area) to get a rough idea of what was attached: mono, CGA, or unspecified (often EGA/VGA).

Often, folks ran into incompatible software. For instance, if you had an MDA monitor rather than a Hercules one and the program used Hercules graphics, the code could hang, or at least the monitor would lose sync. A Herc monitor could connect to an MDA board without any problem between the two; the monitor would just be underutilized and would never switch to the hi-res mode. So a Herc monitor could replace an MDA monitor without problems, but not the other way around. And of course, if the software supported Herc and the adapter didn't, then the code would either bail or hang the machine, no matter what monitor was attached.

Now, in coding, some coders did come up with tests for certain kinds of machine diversity and provided separate code paths. An example was testing for the CPU type. Before CPUID functionality existed in the CPU, coders tested for quirks unique to each CPU, for instance in how registers and flags behaved. Even in a program designed to be XT compatible, it was helpful to test for a V20 CPU: if you knew a V20 was installed, you could use some 186/V20-specific opcodes and speed up parts of the code. I tested for the CPU type when I wanted to manipulate the keyboard lights, because the XT and AT used different locations for that. You wouldn't want to attempt to change the keyboard lights and end up turning off the DMA refresh. That would trash memory for sure.

This sort of thing even tied in with video, since the framebuffer could be in a different location than expected, so reading the BIOS data area was a good way to avoid surprises. If in doubt, ask the computer if you can, and beyond that, ask the user. And users tried things by trial and error: if many video cards were listed and they had one that wasn't, or a generic one, they'd try all of the options until they found one that didn't hang the machine or try to put the monitor into a mode it couldn't support.

  • There are no separate Hercules monitors. MDA and Hercules cards can use the same monitors. Both cards have a text mode and only Hercules has a graphics mode. Both cards send (approximately) identical video resolutions to the monitor, 720 active pixels and 350 active lines, because each character is 9x14 pixels and there are 80x25 characters. – Justme Jan 27 '23 at 15:17
  • From what I remember, the Hercules ones were dual-sync monitors, or at least mine was. They not only operated at a sweep rate of 15750 Hz but also in the 18 kHz range. So a card entering the 18 kHz mode could not sync on a monitor that could only do 15.75 kHz. I remember reading the specs on one. It was neat that there were TSR programs like SimCGA that would enable Herc mode for CGA graphics programs. – Purple Lady Feb 08 '23 at 09:18
  • If so, it must have been some generic or multi-sync monitor. MDA and Hercules modes were fixed to one specific mode with an 18 kHz horizontal rate. They did not use 15.75 kHz, as that is a CGA rate. With SimCGA, the Hercules card stays in Hercules mode; it does not change to 15 kHz. – Justme Feb 08 '23 at 09:59
  • Nah, I was thinking the other way. SimCGA just remapped everything; I never said it changed anything sync-related. And I thought that MDA was modeled after a TV set: 15.75 kHz was the standard TV rate and applied to B/W sets too. And I don't remember if I could ever hear the horizontal sweep for MDA; that would have been a test. – Purple Lady Feb 08 '23 at 10:09
  • Also, there had to be a reason why there was the 40-column BIOS option. I'd think that would be for monitors that synced at the television sweep rate, since 80 columns would be difficult to render at that rate. – Purple Lady Feb 08 '23 at 10:21
  • Never mind. Everywhere I've looked says that MDA used 18.432 kHz, and likely a 16.257 MHz pixel clock. They may have planned to include 15.75 kHz at some point, but never used it. There was a bit you could toggle for a low-res MDA mode, but since another clock crystal was not included, it would only hang things. The character cell was 9x14, but if the unused low-res mode was selected you'd only have 8 pixel rows, and it would not work since the second clock was missing on those boards. So, thanks, you taught me something. – Purple Lady Feb 08 '23 at 10:49