I've heard the term 'DMA' used a lot in reference to older consoles. My very basic understanding is that it's a way for the console to access memory directly. But directly as opposed to what? Isn't RAM always accessed directly?
-
This is off-topic because it’s not specifically about retro-computing, but briefly put, DMA (direct memory access) allows peripherals to access memory directly. See Wikipedia for details. – Stephen Kitt Oct 24 '19 at 13:46
-
Why is this not about retro computers? I had no idea DMA is still used. I was asking specifically about retro usage of DMA; that's the only place I've seen it mentioned. – Badasahog Oct 24 '19 at 13:47
-
@JackKasbrack DMA is the BASE for any modern system. As Stephen says, there is nothing specifically retro about it - at least no more than with the terms RAM, CPU, or keyboard. – Raffzahn Oct 24 '19 at 13:51
-
You're not confusing it with DMA Design, the games developer company, are you by any chance? :) – Smock Oct 24 '19 at 14:37
-
I guess the term was thrown around much more in retro terms because back then its implementation was worth shouting about — e.g. the Atari ST using DMA for its floppy disk is a big step forwards from most of the previous home computers which required the CPU to go into a busy poll loop, to the exclusion of all other work. Nowadays it's unimaginable that media access would block all OS activity. – Tommy Oct 24 '19 at 14:45
-
@Smock nope, not a chance – Badasahog Oct 24 '19 at 14:54
-
@JackKasbrack Most modern computers use DMA so extensively it's just implicitly assumed to be in use. For example, each lane on a PCIe controller can do DMA independently, and that is how nearly all IO on a modern PC or server is implemented. – RETRAC Oct 24 '19 at 21:41
-
After some thought, I've decided I feel this question is on-topic for RCSE because you need to know the answer to know that DMA is not just a retrocomputing thing. I've made a meta post for detailed discussion of this, and voted to re-open. – cjs Oct 25 '19 at 06:54
-
@Tommy - Many if not most pre-microprocessor machines used DMA. Offhand, I think every computer I used before a PC had DMA, because a CPU is too expensive to waste on menial tasks. – dave Feb 08 '20 at 00:45
-
When it was a new marketable term it was used to market/differentiate some products, but now, like DRAM and ECC/EDAC and other things that came with time, it's not a marketable differentiator, so you won't necessarily see it on a poster or box or web page. Also understand that many DMA implementations do not free up the CPU; they simply move data faster and more efficiently, so you choose to have the CPU stall while the DMA happens. Only some designs are such that stuff can happen in parallel (then and now). – old_timer Feb 13 '20 at 13:02
-
The comment by @Tommy is good context. The Intel MCS-85 family, of which the CPU component is the 8085 (1976) processor, contained two DMA controllers, the 8237 and 8257. A total guess is that pre-8085 Intel systems like the Altair did not have DMA. The Apple II did have DMA, but I'm not sure about the Apple I and the other 1977 trinity members, the Commodore PET and TRS-80. – Single Malt Jun 13 '21 at 07:01
3 Answers
"The console" (or other computer) is made up of various parts, including the processor, the memory, and peripherals such as video display controllers, I/O chips to read from a keyboard or joystick, disk controllers, and so on.
When using "programmed I/O," the CPU reads data from or writes data to a peripheral and, if the data need to be stored in memory, the CPU does that as well. So the CPU might read a byte from the disk controller, store that byte into a buffer in memory, and repeat that to read an entire sector from disk.
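As a rough sketch of what that looks like in code (the register addresses, status bit, and sector size here are invented for illustration; real disk controllers differ):

```c
#include <stdint.h>

/* Hypothetical memory-mapped disk-controller registers;
   addresses, bit layout, and sector size invented for illustration. */
#define DISK_STATUS       (*(volatile uint8_t *)0xD000)
#define DISK_DATA         (*(volatile uint8_t *)0xD001)
#define STATUS_BYTE_READY 0x01
#define SECTOR_SIZE       256

/* Programmed I/O: the CPU polls the controller and copies every
   byte into the buffer itself. While this loop runs, the CPU can
   do nothing else. */
void read_sector_pio(uint8_t *buffer)
{
    for (int i = 0; i < SECTOR_SIZE; i++) {
        while (!(DISK_STATUS & STATUS_BYTE_READY))
            ;                   /* busy-wait for the next byte */
        buffer[i] = DISK_DATA;  /* read from device, write to memory */
    }
}
```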
DMA allows the CPU to be paused and/or disconnected from the memory bus so that peripherals can directly read data from or write data to the memory without the involvement of the CPU. Thus, instead of the example given above, a disk controller could write its bytes directly to memory while the CPU waits, which is faster because rather than "read byte then write byte" by the CPU there's just "write byte" by the peripheral.
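By contrast, with DMA the CPU only has to program the transfer and then step aside. A minimal sketch, again with invented registers, loosely in the spirit of programmable DMA controllers like the Intel 8257:

```c
#include <stdint.h>

/* Hypothetical DMA-controller registers, invented for illustration
   (real controllers such as the Intel 8257 have different layouts). */
#define DMA_ADDR_LO  (*(volatile uint8_t *)0xD010)
#define DMA_ADDR_HI  (*(volatile uint8_t *)0xD011)
#define DMA_COUNT    (*(volatile uint8_t *)0xD012)
#define DMA_CTRL     (*(volatile uint8_t *)0xD013)
#define DMA_STATUS   (*(volatile uint8_t *)0xD014)
#define DMA_START    0x01
#define DMA_COMPLETE 0x80

/* DMA: the CPU tells the controller where the data should go and
   how much to move; the controller then writes to memory itself.
   Each byte is a single "write" on the bus instead of the CPU's
   "read byte, then write byte". */
void read_sector_dma(uint8_t *buffer, uint8_t count)
{
    uintptr_t addr = (uintptr_t)buffer;

    DMA_ADDR_LO = (uint8_t)(addr & 0xFF);  /* destination, low byte  */
    DMA_ADDR_HI = (uint8_t)(addr >> 8);    /* destination, high byte */
    DMA_COUNT   = count;                   /* bytes to transfer      */
    DMA_CTRL    = DMA_START;               /* kick off the transfer  */

    /* Here the CPU just spins, but it could instead do unrelated
       work, or be paused while the controller owns the bus. */
    while (!(DMA_STATUS & DMA_COMPLETE))
        ;
}
```

Note that the final busy-wait is only one option: a program could also take an interrupt on completion, or simply keep computing while the transfer proceeds.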
-
Nice and simple, +1, but I just need to add: when copying with the CPU, there must also be instructions executed to do the copy, so the CPU is accessing program memory to fetch instructions while copying, lowering the speed much more. Here's a nice example of the difference: Breakeven number of bytes for programmable DMA – Spektre Oct 27 '19 at 08:07
-
"while the CPU waits" – Or while the CPU is running cycles that don't need the memory bus! – forest Mar 07 '21 at 09:10
-
@Spektre Actually, it depends on the form of DMA. It was not unusual for microcomputer video systems (such as the Apple II and the Commodore 64) to read video display data directly from the main system RAM; this was continuous and automatic, requiring no programming or setup except to do things like changing the display mode. – cjs Mar 07 '21 at 09:37
(Too long for a comment)
DMA is not in any way console- or retro-computing-related. It is current technology, and it is used today more than ever. Each of your disk reads and writes, and each transfer to your graphics or network card, will usually involve DMA.
DMA describes access to (a CPU's) memory by components other than the CPU. In general, it's the ability to run a multi-master bus system. Further information can be found on the corresponding Wikipedia page and in countless other resources on the web.
DMA is a technique where you can move memory around without involving the main CPU. This was a huge win for old gaming hardware, as an awful lot of time can be spent simply shuffling graphics data around in screen memory. Offloading this task to custom silicon meant the CPU had more time to do much more interesting and computationally expensive things. The actual copying itself can be done much more quickly too, meaning smoother scrolling, etc.
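As a sketch of the idea, here is what kicking off such a hardware copy might look like; the registers are invented for illustration, and real graphics DMA engines (the Amiga's blitter, say) are considerably more elaborate:

```c
#include <stdint.h>

/* Hypothetical memory-to-memory DMA ("blitter") registers,
   invented for illustration. */
#define BLIT_SRC  (*(volatile uint16_t *)0xE000)
#define BLIT_DST  (*(volatile uint16_t *)0xE002)
#define BLIT_LEN  (*(volatile uint16_t *)0xE004)
#define BLIT_CTRL (*(volatile uint8_t  *)0xE006)
#define BLIT_GO   0x01

/* Kick off a hardware copy of one line of graphics data. The CPU
   returns immediately and can run game logic while the copy
   proceeds at full bus speed. */
void copy_scanline(uint16_t src, uint16_t dst, uint16_t len)
{
    BLIT_SRC  = src;       /* where the pixel data currently lives  */
    BLIT_DST  = dst;       /* where it should end up                */
    BLIT_LEN  = len;       /* how many words to move                */
    BLIT_CTRL = BLIT_GO;   /* the copy runs in parallel with the CPU */
}
```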
-
It depends on the implementation. In many DMA implementations the CPU had to stall and wait for the DMA to complete; it wasn't free. In those implementations it was more of an efficiency thing: the CPU's instructions could move one or a few things at a time, whereas DMA could move chunks at bus speed. Only sometimes did it free up the CPU to do stuff in parallel. – old_timer Feb 13 '20 at 12:58