22

I was under the impression that "interlaced" video modes were only a thing for remote television content because it saved bandwidth to only send 50% of the data to the homes, so you could fit more TV channels in the same bandwidth (air or cable).

I thought that all video games, being local to the customer's home with a video cable between the console and the TV, used "progressive" mode, meaning "not interlaced", since there was a dedicated video cable between the two units. Why would it have to send "half the data" in that scenario?

But then I watched a video on YouTube where it was casually mentioned that some games used "480i" instead of "480p", without further explaining it. Apparently, this was done very late, so it was not some kind of early-1980s video game console technique either.

Why would any game console or individual game send an "interlaced" picture to the TV when it's right there above the console? I assume that the picture must suffer in some subtle way from this "only every other line of image data", but it must be far more noticeable for video game content compared to a movie or TV show.

Bucky o'Hare
  • 20
    TV had no such thing as progressive mode. – R.. GitHub STOP HELPING ICE Jul 16 '21 at 21:19
  • 5
    @R..GitHubSTOPHELPINGICE TV had no such thing as interlace mode either, as it has no modes to begin with. It will just show what is sent to it, and you can send progressive and interlaced video signals to TV. – Justme Jul 16 '21 at 22:30
  • 16
    @Justme: It had exactly one mode (for a particular TV standard) which is scanned in an interlaced manner. There is no non-interlaced scan pattern that's compatible with the video standards TV used (NTSC or PAL). – R.. GitHub STOP HELPING ICE Jul 17 '21 at 02:00
  • 3
    @R..GitHubSTOPHELPINGICE No, TV's don't have only one mode, they can lock on to any reasonable video signal as proven here. There are broadcast TV standards, and then there are also gaming devices and home computers which don't need to care much about broadcast TV standards as long as customers have picture on screen. Even if a progressive video standard does not exist, it did not stop gaming devices and home computers such as C64 or NES from outputting progressive signal, and it did not stop TVs from displaying progressive signal. – Justme Jul 17 '21 at 07:54
  • maybe worth noting in this context: there were PC games that had an explicit interlaced mode. To improve performance, they would halve the vertical resolution by drawing only the odd lines, leaving the even lines black! (Or vice versa.) It wasn't pretty. I remember Archimedean Dynasty having this option, but I could not find any screenshots. – kubi Jul 17 '21 at 11:33
  • 1
    @Justme: There is a specification and what particular models might be able to do. The latter is irrelevant. If you are making a consumer product, its output must meet the specification. The "progressive signal" you're talking about from some game consoles is not a progressive signal. It's just an interlaced signal derived from half the number of lines of source material. – R.. GitHub STOP HELPING ICE Jul 17 '21 at 11:51
  • 1
    @Justme I don't see that your link "proves" anything, except the trivial fact that you can build an interface between any two different standards if you want to. – alephzero Jul 17 '21 at 13:21
  • 4
    @R..GitHubSTOPHELPINGICE If you claim that, please provide evidence to back your claim up. I have provided the evidence for my claims in my answer, please read those links. Consumer products such as C64 and NES and IBM CGA card do not output video according to any standard or specification of interlaced signal, they output progressive signal, without interlacing. – Justme Jul 17 '21 at 14:06
  • 2
    @alephzero The point is that people claim consoles output interlaced signal and TVs require interlaced 480i signal, and the truth is, consoles made in 80s and 90s output a compatible non-interlaced 240p signal, and TVs will work with that non-interlaced 240p signal. – Justme Jul 17 '21 at 14:08
  • 2
    @Justme: By "a compatible non-interlaced 240p signal" do you mean one where the relative phase of vblank to hblank is fixed for each vertical refresh, rather than cycling +- 1/2 line time each, but otherwise in-spec? – R.. GitHub STOP HELPING ICE Jul 17 '21 at 14:26
  • 2
    @R..GitHubSTOPHELPINGICE Yes, a 240p signal where vsync and hsync have same alignment for all vertical refreshes, which means integer amount of 262 lines per vertical refresh. Which is in contrast to the completely standard interlaced 480i signal where vsync happens always after every 262 plus a half line, to draw the odd and even fields. – Justme Jul 17 '21 at 14:44
  • 1
    @Justme: Some devices which output 227.5 chroma clocks per line (e.g. the VIC-20) are designed to have chroma phase that alternates every field, making it necessary to use an odd number of scan lines, while others which use the same phase every field (something I've observed on many karaoke players) need to output an even number. There are thus three usefully different forms of non-interlaced video, as well as a number of variations that may have more or fewer lines to make electronic design easier. – supercat Jul 19 '21 at 15:17
  • 1
    @Justme: The common video chip used with the CDP-1802 (the 1861 I think) had a line period of 224 reference clocks, which would result in a slightly faster-than-standard horizontal rate unless used with a crystal that's slightly slower than a standard colorburst. My Stella's Stocking game uses 264 scan lines to allow the audio logic which runs once every four scan lines to stay synchronized with the frame. I suspect that color machines which output non-interlaced video all use 227.5 by 262, 227.5 by 263, or 228 by 262ish, and those with interlaced use 227.5 by 262.5. – supercat Jul 19 '21 at 15:22
  • 1
    @supercat Good point - not all devices use 262 lines, and inverting the subcarrier phase for each field does remove artifacts. Quite a bit of tolerance is OK in the sync timing parameters; the subcarrier is more important. E.g. the C64 has two NTSC chip versions: one uses 224 clocks per line (15.98 kHz) with 262 lines (60.99 Hz), the other uses 227.5 clocks per line (15.73 kHz) with 263 lines (59.82 Hz), and most likely the latter looks better. Both use the 14.31818 MHz crystal. The NES uses 227 and 1/3 cycles per line for 261 lines (60.3 Hz). IBM CGA is 228 clocks with 262 lines. Lots of variation here (see the sketch after these comments). – Justme Jul 19 '21 at 16:18
  • 1
    @Justme: NES is extra weird because at its normal line rate, every frame would be 1/3 of a chroma clock shorter than the nearest chroma multiple, but the last line of every other vblank is shortened by 1/3 chroma clock, making the resulting frame 1/3 chroma longer than the nearest chroma multiple. I have no idea why Nintendo did that rather than using a 227.5 by 263 format. – supercat Jul 19 '21 at 16:53
  • @Justme: BTW, my C64's original VIC-II chip would have been the 64 CPU cycles/line variant, and the replacement I bought because I didn't like the semi-broken sprites on the original was the 65/263 variant. While I replaced the video chip to fix the sprites, doing so also greatly improved the color quality. – supercat Nov 26 '22 at 18:15
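
The line and field rates quoted in the comments above all follow from the shared 14.31818 MHz crystal. A minimal sketch of that arithmetic, assuming the chroma reference is the crystal divided by four (the standard 3.579545 MHz NTSC colorburst) and using the clock and line counts exactly as quoted:

```python
# Reproduce the rates quoted in the comments from the 14.31818 MHz crystal.
CHROMA_HZ = 14_318_180 / 4   # 3.579545 MHz colorburst

# (chroma clocks per line, lines per field), as quoted above
chips = {
    "C64 VIC-II (NTSC, old)": (224,       262),
    "C64 VIC-II (NTSC, new)": (227.5,     263),
    "NES PPU":                (227 + 1/3, 261),
    "IBM CGA":                (228,       262),
}

for name, (clocks_per_line, lines_per_field) in chips.items():
    h_rate = CHROMA_HZ / clocks_per_line   # horizontal (line) rate
    v_rate = h_rate / lines_per_field      # vertical (field) rate
    print(f"{name}: {h_rate / 1000:.2f} kHz horizontal, {v_rate:.2f} Hz vertical")
```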

6 Answers

40

Indeed, early devices such as the C64, the NES, or an IBM PC with a CGA adapter did not use interlacing, but simply sent 240p to the TV. And later devices such as the Amiga could send either 480i or 240p.

But TVs were not 480p capable, only 480i or 240p. So it was not possible to use 480p.

For example, the Amiga 500 can send either interlaced 480i for hi-res graphics or progressive 240p for low-res graphics, and the game is free to select the mode at any time. This can be used, for instance, to show a static title screen in 480i for high resolution and the moving game action in 240p for lower resolution without flickering. The reason to switch to 240p, instead of sending the 240-line content as two 480i fields, is to avoid flicker: it would look as if the same 240-line content were jumping up and down by half a line on every 60 Hz field.

While early gaming devices used only a progressive 240p signal, some games on later devices used an interlaced signal to display even the gameplay itself at the full interlaced 480-line resolution.

Then some background:

TVs can only accept a video signal with a single horizontal frequency of about 15.734 kHz (or 15.625 kHz).

Standard TV programs are interlaced 480i (or 576i) to have a high screen update rate of 60 Hz (or 50 Hz) to prevent flickering and to achieve a high resolution of 480 lines for static images. But it also means that the image is really sent as two 240-line (or 288-line) fields, 60 (50) times per second, with a half-line offset, so basically a full frame of both interlaced fields is sent at 30 (25) Hz.

Gaming consoles are free to send any kind of signal they want, but the TV must be able to lock onto it. So gaming consoles must also use the same horizontal rate and vertical field rate to send the video.

It means that the progressive signal comparable to 480i is not 480p, but 240p. There is no flickering, as only one kind of field is sent, so the video lines drawn are on top of each other. The difference is that if a game screen has 240 lines but sends them as interlaced 480i, some flicker can be seen due to the same lines being drawn interlaced: the picture seems to jump up and down by one line of 480i (half a line of 240p).

(I used 480 and 60 Hz since that's what the question asked about. For 50 Hz video, replace 480 with 576, 240 with 288, and 60 Hz with 50 Hz.)
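
To make that arithmetic concrete, a rough sketch with nominal numbers (the NTSC field rate actually comes out as 59.94 Hz rather than exactly 60):

```python
# Nominal broadcast timing: total lines split into two interlaced fields.
standards = {
    "NTSC (480i)": {"total_lines": 525, "visible_lines": 480, "line_rate_hz": 15_734},
    "PAL (576i)":  {"total_lines": 625, "visible_lines": 576, "line_rate_hz": 15_625},
}

for name, s in standards.items():
    lines_per_field = s["total_lines"] / 2             # 262.5 or 312.5 -> interlaced
    field_rate = s["line_rate_hz"] / lines_per_field   # ~60 or 50 Hz
    frame_rate = field_rate / 2                        # ~30 or 25 Hz full frames
    print(f"{name}: {lines_per_field} lines/field, {field_rate:.2f} Hz fields, "
          f"{frame_rate:.2f} Hz full frames, {s['visible_lines'] // 2} visible lines per field")
```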

To rectify some misconceptions that seem to be very common:

  • Signals sent to TV do not need to be interlaced
  • Using an RF modulator does not mean the signal must be interlaced
  • There is a difference between interlaced and progressive signal formats, regardless of what content is sent
  • Interlacing means sending 480-line content as 240 even lines in one field and then the remaining 240 odd lines between them in another field, with a field rate of 60 Hz. The signal has 525 lines divided into two fields of 262.5 lines. A full frame is thus drawn at 30 Hz, but that is usually not important.
  • A progressive signal does not mean sending 240-line content twice, as two interlaced fields that are offset by half a line with respect to each other. That is of course possible, but it is still an interlaced signal, and it would look flickery.
  • A progressive signal means sending an integer number of lines per field, so there is no half-line offset: the lines of each field are drawn on top of each other, and nothing is drawn between those fields as in an interlaced signal (see the sketch after this list).
  • Devices like the Commodore 64, the NES, and the IBM CGA card all send out a progressive signal of 262 lines per field.
  • Devices like the Amiga or the PlayStation 1 can switch between sending 240p and 480i signals, i.e. 262 lines or 262.5 lines per field.
  • Link proving progressive 240p signal exists and devices use it
  • Link proving NES video timing is progressive
  • Link proving C64 video timing is progressive
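
Here is the sketch referenced in the list above, using the NTSC-style line counts quoted earlier: with an integer number of lines per field, vsync keeps the same phase relative to hsync on every field (progressive), while an extra half line makes that phase alternate, which is exactly what interlaces the two fields.

```python
def vsync_phase(lines_per_field, fields=4):
    """Fractional line position of vsync at the start of each successive field."""
    return [(n * lines_per_field) % 1.0 for n in range(fields)]

print("480i (262.5 lines/field):", vsync_phase(262.5))  # [0.0, 0.5, 0.0, 0.5]
print("240p (262 lines/field):  ", vsync_phase(262))    # [0.0, 0.0, 0.0, 0.0]
```
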
Justme
  • To give an example: Amiga games by Team 17 (Superfrog, Alien Breed, Project X) always began with the 'TEAM 17' purple logo interlaced, before using non-interlaced graphics (but still interlaced RF video!) for the gameplay. (original AB has a non-interlaced logo, but AB:SE has an interlaced one) – knol Jul 16 '21 at 06:30
  • What's the difference between an odd/even frame in PAL/NTSC terms? Different timings for vsync? – Dan Jul 16 '21 at 08:15
  • 1
    @DanSheppard Yes, one of the frames starts at the same time as a line starts (vsync happens at the same time as hsync happens) and the other frame starts exactly halfway between two lines (vsync happens halfway between two hsyncs). – Justme Jul 16 '21 at 09:55
  • They sent the 240p image as a 480 interlaced video, though, because TVs in those days could only handle interlaced signals at full resolution. – fishinear Jul 16 '21 at 17:34
  • 1
    Related : How does Atari 2600 TIA display multiple resolutions in Asteroids?. Interlacing buys you a whole extra scan line of compute time when you're racing the beam. – J... Jul 16 '21 at 18:22
  • 1
    @fishinear Once again, they send a 240-line image as a single non-interlaced 240p field, and no, they don't send a 240-line image as two interlaced 480i video fields, as that would cause the same 240 lines to jump up and down by half a line due to the interlacing. If you don't take my word for how devices like the C64 or NES generate video progressively in 240p, how about reading this blog entry from a respected source – Justme Jul 16 '21 at 20:29
  • @Justme You are confirming what I am saying. They can either send it with 240 resolution, non-interlaced, but then they don't use the full resolution of the TV: you see a pattern of black lines on the screen, like we all know from the old IBM PCs. Or you send the image interlaced at 480 resolution. You cannot get full resolution non-interlaced; the TV cannot handle that. – fishinear Jul 17 '21 at 09:33
  • 1
    @fishinear You specifically said "They sent the 240p image as a 480 interlaced video" which is not true because 240p image is sent as non-interlaced 240p video, that is the key difference between interlaced and progressive video, that there either is a half-line-offset or no half-line-offset between fields. – Justme Jul 17 '21 at 09:38
  • 1
    @Justme We are both saying the same thing, but you confuse the matter with statements like "Signals sent to TV do not need to be interlaced". The signals MUST be interlaced if you want to use the full resolution, and avoid the black line pattern. Unlike modern TVs, old TVs are built for interlacing, they cannot handle full resolution any other way. The question of the OP is about why 480p was not used. The straight and simple answer to that is: the old TVs could not handle that. By circling around that issue you are not clarifying things. – fishinear Jul 17 '21 at 09:45
  • 2
    I would argue the difference between 480i and 240p is that 480i alternates between odd and even scan lines, so the image shifts up and down a pixel each refresh. Proper 240p adjusts the signal so it renders to the same scan lines every refresh. Improper 240p is just 480i, and the image flickers up and down every refresh. – MichaelS Jul 17 '21 at 09:50
  • 1
    @fishinear You are confusing two things yourself, the interlacing vs progressive, and 240-line and 480-line resolutions. I am definitely not claiming you can get full 480 line resolution without interlacing as I know it is impossible. I am specifically claiming you can get 240 lines progressive without interlacing, and 480 lines interlaced with interlacing, as they use identical bandwidth. Old TVs accept 240 lines progressive signal just fine, they do not need interlaced signal. Please read the links in my answer to see how old consoles send 240p without interlacing, if you don't believe me. – Justme Jul 17 '21 at 09:53
  • 2
    @Justme "Old TVs accept 240 lines progressive signal just fine" - they DON'T accept 240 lines "just fine": you get a black line pattern; the used lines are individually clearly visible, because the intermediate ("interlace") lines are not used. I would not call that "just fine". – fishinear Jul 17 '21 at 09:58
  • 2
    @fishinear But that's how early consoles in the 80s and 90s worked with TVs. And it is true that there are more distinct scanlines because the signal was not interlaced but used progressive scanning. So the TVs are able to lock on and display the non-interlaced signal, even if it does not look "just fine" in your opinion. To most people it was just fine, as that's the way it was. Back then it was even part of the experience, if they even noticed it. And looking at the retro scene today, people are actually still seeking the original scanline experience via emulation. – Justme Jul 17 '21 at 14:17
  • How is 240p different from 480i where every other field is all black? – jamesdlin Jul 17 '21 at 19:44
  • 3
    @jamesdlin That is a good question to understand the signal. The 240p signal draws a full progressive frame of 240 lines at a rate of 60 Hz. The 480i signal draws each half of the lines per field, either the field with 240 odd lines or the field with 240 even lines, at a field rate of 60 Hz. Therefore, if you only use say the even field for content and the odd field is all black, you update the content every other field, or at a rate of 30 Hz. Though scanline-wise, the content will look like progressive 240p, but with half the frame rate and it would likely flicker quite a lot. – Justme Jul 17 '21 at 20:42
  • @MichaelS: Many early LCD screens were limited to updating one row of LCD pixels per scan line. If one were using a program that showed half of the objects on each field, alternating, such screens would often draw one set of objects a half line above or below the other set, chosen arbitrarily. Some such screens would periodically swap which set was above or below the other a few times per second, while others would assign an upper and lower field and stay with that selection. A few LCDs would end up only updating half the scan lines, while the others... – supercat Jul 19 '21 at 17:20
  • ...might continue to display "NO SIGNAL" or whatever had been shown before the display started showing the non-interlaced signal. – supercat Jul 19 '21 at 17:21
17

TL;DR:

It's not so much about what a console can deliver as what a TV can display. Classic (pre-digital) TV sets could only receive and display interlaced frames, so there was no sense in producing a non-interlaced one.

Similarly, console developers would have been unwise to create consoles and content that could not be displayed on whatever Joe Average had as a TV. Sales might not have been great.

(In addition, terms like 480i or 480p are only defined in hindsight from a digital POV, and have only a restricted meaning when it comes to analogue TV.)


In Detail

I was under the impression that "interlaced" video modes were only a thing for remote television content because it saved bandwidth to only send 50% of the data to the homes, so you could fit more TV channels in the same bandwidth (air or cable).

No. Well, not directly. Of course, everything in transmission is always about bandwidth, as that's the shared resource. But it wouldn't have been a big deal to double channel width in the early days.

Interlaced 50 (60) Hz delivers 25 (30) full pictures per second. In theory that is more than enough for moving pictures, as cinema does fine with 24 pictures per second. Except TV works completely differently from cinema. A cinema frame is displayed on the screen once, in all regions (almost) at the same time, but on a TV a dot paints the picture line by line on the screen.

Doing this without any additional measure would result in a quite flickery picture, despite a frame rate equal to or better than cinema's.

The chain of thoughts looked a bit like this:

  • First step: Make the screen disperse the beam energy over time (aka afterglow), so lit parts stay lit longer.

  • It should not stay lit too long, as that would make any motion blur.

  • With 25 (30) fps this would still result in an uneven, badly lit picture with a noticeable fade (*1).

This is, BTW, similar to what makes pure 24 fps cinema flickery, as known from old silents. To reduce the inter-frame flicker, each frame later got displayed two or three times (*2) before moving on to the next.

  • Drawing a received picture twice would have worked just as well on TV, except a CRT has no storage, so the picture had to be sent more often than 25 (30) times per second.

  • Thus doubling the transmitted frame rate was the way to go.

  • Doubling allowed the persistence to be reduced, which in turn reduced blur

  • Doubling improved fast motion as well

So far a nice solution. A picture with 200-300 horizontal lines is quite fine for TV and would satisfy all needs. Producing TV content at 50 (60) Hz frames would have been no issue. But then there were movies. They were produced at 24 Hz, so transmitting every picture twice would work quite as well as genuine TV production, but waste about half the bandwidth.

Adding interlace allowed to increase resolution without increasing bandwidth.

Thanks to the wonders of human sight, movies now used the 50 (60) fps to display their 24 fps in higher resolution, while genuine TV content got real 50 (60) fps in a kind of enhanced resolution.

I thought that all video games, being local to the customer's home with a video cable between the console and the TV, used "progressive" mode, meaning "not interlaced", since there was a dedicated video cable between the two units. Why would it have to send "half the data" in that scenario?

Because that's what the TV set expects as input and is able to display. A (classic, CRT) TV set does not combine two 'half' pictures into a single one to be displayed at double fps, but displays each 'half' on its own, the second offset by a line. That's simply the way CRT hardware works.

But then I watched a video on YouTube where it was casually mentioned that some games used "480i" instead of "480p", without further explaining it. Apparently, this was done very late, so it was not some kind of early-1980s video game console technique either.

Because games need to be compatible with the TV hardware (and standards) out in the field. It's not a great idea to produce a console that can only work with the very latest displays - at least not if one intends to sell more than a few.

I assume that the picture must suffer in some subtle way from this "only every other line of image data", but it must be far more noticeable for video game content compared to a movie or TV show.

Not really, rather the other way around, as a console game is able to adjust. Keep in mind, 480i or 480p only tells part of the story, as that number only names a frame size and structure, not the frame rate. Interlaced will usually create 50 (60) fps, while the same as progressive can be 25 (30) or 50 (60) fps.

A classic TV set expects 50(60) 'half'-frames per second, so a console can use this as

  • 50 frames per second with 'half' vertical resolution, or
  • 25 frames per second with 'full' vertical resolution.

So any game that is about fast action, like a racing game or a (complex) shooter, will create each frame separately and get a smooth 50 (60) fps at the cost of some resolution (288/240 horizontal lines). In contrast, a game that is more about beautiful pictures, or lots of content, may go ahead and do 25 (30) 'full' frames using the full 576 (480) horizontal lines (*3).

Oh, and of course games could go ahead and do fast content with high resolution - with the same drawback as TV content had with motion artefacts.
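
A back-of-the-envelope sketch of that trade-off, using the PAL numbers above (swap in 240/480 lines and 60 Hz for NTSC): either way the set draws the same number of visible lines per second; the game just chooses how to spend them.

```python
field_rate_hz = 50        # PAL field rate
lines_per_field = 288     # visible lines per field

budget = field_rate_hz * lines_per_field  # visible lines drawn per second

print("fast action:", field_rate_hz, "fps at", lines_per_field, "lines ->", budget, "lines/s")
print("high detail:", field_rate_hz // 2, "fps at", 2 * lines_per_field, "lines ->",
      (field_rate_hz // 2) * (2 * lines_per_field), "lines/s")
```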


*1 - Yes, that is in fact related to the wandering dark bar that is visible when an unsynchronized camera films a CRT TV.

*2 - Or more exactly, the douser that covers the film movement between frames also cuts in one (or two) more times within each frame, transposing the flicker to 48 or 72 Hz, where it gets way less noticeable.

*3 - In reality this wasn't used as often, due to memory limitations in generating and holding pictures that large.

Raffzahn
6

CRT TVs were designed to handle interlaced signals, where the TV alternates by receiving odd scanlines and even scanlines on alternating frames. The so-called progressive mode was invented when some hardware designer noticed you could start the even and odd frames on the same scanline, so the CRT's electron gun overwrites the previous frame's lines instead of drawing between them. This doubles the frame rate and halves the vertical resolution, both of which were useful for game machines (since there usually wasn't storage space for high-res graphics). This is also why old game consoles exhibit the scanline effect, where there are little gaps between rows of pixels.

(image: the scanline effect, with little gaps between rows of pixels)

Interlaced signals don't have this problem because those gaps get filled on the subsequent frame, at the expense of making the image more jittery (an effect known as interline twitter), which is usually smoothed out by applying a blur, but video games didn't generate such blur. So that's another reason games used progressive scan.
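
A small sketch of that difference, under the simplifying assumption of a ten-line frame: in a progressive ("240p") signal every field lands on the same CRT lines, so the lines in between stay dark (the scanline effect), while in 480i consecutive fields fill in the gaps.

```python
def lines_hit(interlaced, fields=2, lines_per_frame=10):
    """Which CRT lines (0..lines_per_frame-1) get drawn over the given number of fields."""
    hit = set()
    for f in range(fields):
        offset = f % 2 if interlaced else 0   # odd fields land one line lower
        hit |= {offset + 2 * n for n in range(lines_per_frame // 2)}
    return sorted(hit)

print("240p, two fields:", lines_hit(interlaced=False))  # gaps stay dark
print("480i, two fields:", lines_hit(interlaced=True))   # every line gets drawn
```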

Stephen Kitt
Kef Schecter
  • 1
    Otherwise this answer is correct, but technically, you can't send consecutively only odd frames, or only even frames, because the oddness or evenness would already mean that the signal has an interlaced structure where one field ends and the other field starts at a half-line boundary. To make both fields identical, each field is shortened by half a line, and then both fields are identical progressive fields with an equal integer number of lines per field. – Justme Jul 17 '21 at 17:50
  • I don't think the fact that scan lines on consecutive fields pass directly above each other was so much a design goal, but rather a design consequence of not including the circuitry necessary to produce an interlaced signal. – supercat Jul 19 '21 at 16:57
  • 1
    @supercat The SNES had an interlaced mode that was almost never used, so there were definitely reasons beyond circuitry that game consoles used progressive scan. – Kef Schecter Jul 19 '21 at 20:23
  • @KefSchecter: If displaying non-interlaced video were easier than displaying interlaced, many game consoles would have used the latter (because it was easier) even if the former would have looked slightly better. As it is, however, non-interlaced video is much easier to display in a manner that will look reasonable on most monitors. Among other things, when showing non-interlaced video, imperfect horizontal/vertical sync waveforms may cause some horizontal distortion near the top of the screen but this generally won't be too bad if it's identical on every field. – supercat Jul 19 '21 at 21:56
  • @KefSchecter: Many monitors may lose horizontal sync when fed crudely generated composite sync waveforms, but then lock onto it during the vertical blank interval. Often the sync lock won't be perfect, but if it's the same every frame having a little waviness in horizontal positioning near the top of the frame won't be too much of a problem. If an interlaced waveform is likewise defective, however, even and odd frames would likely be distorted noticeably differently. – supercat Jul 19 '21 at 22:02
5

To be exact: "Interlacing" is not just a method for bandwidth saving, but mainly for increasing vertical resolution.

EDIT: Bandwidth saving and increasing resolution are just different sides of the same coin; see the comment by Justme below.

E.g.: The TV screen (in Europe / 50 Hz) was divided into 625 lines. The first picture frame contains the odd lines (1, 3, 5, ...), starting from the upper left corner; the second picture frame contains the even lines (2, 4, 6, ...), starting from the top middle of the screen. Two frames give one full image, so European TV has 50 frames per second, or 25 full images per second.

Video games, as well as home computers, did not use interlacing; they generated only "odd" picture frames, respecting neither interlacing nor the half-line shift. Old TVs were quite robust and tolerant of the video signal, so they could handle such "non-standard" signals with no big problems. So video games got 50 pictures per second, but only 312 possible lines to display; in fact fewer, because some of them are outside the viewable area.
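
As a rough sketch with nominal PAL numbers (nominal values only; individual machines varied slightly, as the comments on the question discuss): an integer 312 lines per field instead of the broadcast 312.5 means every 50 Hz field lands on the same set of lines.

```python
NOMINAL_LINE_RATE_HZ = 15_625   # nominal PAL horizontal rate

lines_per_field = 312           # integer count -> no half-line offset, no interlace
field_rate = NOMINAL_LINE_RATE_HZ / lines_per_field
print(f"{lines_per_field} lines/field -> {field_rate:.2f} Hz refresh, "
      f"with roughly 288 of those lines in the viewable area")
```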

The main reason to use interlacing in a video game would be to increase the vertical resolution up to approximately 600 lines. On the other hand, interlacing has some drawbacks, such as image flickering, and you have to use more sophisticated video-generating circuits that handle the even/odd field distinction, etc.

N.B.: The USA, Japan and some other countries used 525 interlaced lines (262.5 per field, or 262 when used without interlacing) and a frame rate near 60 Hz (59.94 Hz in fact). Those 525 lines were not all visible; only about 480 lines were visible, hence "480i". In fact, there was no "480p" mode in the '80s, so technically it WAS 480i, but used as "240p"...

Martin Maly
  • Actually the interlacing really is a method for bandwidth saving. Because a certain vertical resolution is needed (amount of lines) to have an acceptable resolution, and certain refresh rate is needed to have acceptable temporal resolution (frame rate high enough not to be blurry, and to reduce screen flicker due to CRT phosphors), the interlacing actually was the method to reach acceptable temporal and vertical resolution and yet have acceptable bandwidth. But I guess you could say that bandwidth saving or increased horizontal resolution are just the different sides of the same coin. – Justme Jul 16 '21 at 06:39
  • 2
    If at all, it's for vertical resolution as it provides more lines in alternating pics. – Raffzahn Jul 16 '21 at 08:40
  • 1
    Well, there's also a third side to the coin. You could also say it's for having double the refresh rate while using the same amount of bandwidth and lines, as all these parameters are linked together. But technically, it really is for conserving bandwidth - if bandwidth had not been a limiting factor, they could have just used double the horizontal rate to double the bandwidth and send progressive scan, resulting in more expensive equipment. These specs are cost vs. performance tradeoffs made when TV systems were being developed, in the 1930s. An analog TV channel already uses 6 to 8 MHz. – Justme Jul 16 '21 at 10:40
  • @chux-ReinstateMonica I am sorry, my mistake... – Martin Maly Jul 17 '21 at 12:16
  • @Justme: Even if one wanted to use twice the bandwidth, that would then introduce the question of whether to use 60 non-interlaced fields per second with 480 visible lines each, 60 interlaced fields of 480 visible lines per second (yielding 960 lines of vertical resolution), 120 fields per second of 240 lines each staggered by 0.4 lines/frame (so every 24 Hz group of five fields would show 1280 visible lines, but scanning would hit or pass within each position 48 times per second), etc. Many options. – supercat Jul 27 '22 at 15:09
4

Two reasons come to mind:

  1. The image was sent to the display (the TV) using an RF modulator. This essentially acts as a low power TV station. Since the TV expects broadcast channels to be interlaced, the signal sent from the RF modulator must be interlaced as well.

  2. You alluded to this in your question: Reduced bandwidth. By only displaying every other line, the number of pixels that need to be drawn per frame is cut in half. Instead of displaying a 640×480 image 50 or 60 times per second, one can instead display a 640×240 image at the same rate.
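
A quick sketch of point 2, with an assumed 8 bits per pixel purely for illustration (not tied to any particular console): halving the vertical resolution halves both the pixels to be drawn per frame and the memory needed to hold the frame.

```python
BITS_PER_PIXEL = 8   # illustrative assumption

for width, height in [(640, 480), (640, 240)]:
    pixels = width * height
    kib = pixels * BITS_PER_PIXEL / 8 / 1024
    print(f"{width}x{height}: {pixels} pixels per frame, {kib:.0f} KiB of framebuffer")
```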

Alex Hajnal
  • Well, for point #1, weren't RF modulators only for the NES and earlier? AKA very early consoles. – Bucky o'Hare Jul 16 '21 at 06:00
  • If Wikipedia is to be believed it was up until the 4th-gen consoles (e.g. SNES). FWIW, the last machine I used that had an RF modulator was an Atari 1040STFM (released 1986). Technically I later (ca. 2006) used a Propeller demo board (pic, schematic) that does the modulation in software, but anyway... – Alex Hajnal Jul 16 '21 at 06:20
  • 1) Does not apply. Interlaced or progressive has nothing to do with RF modulation. For example, the C64 will only output progressive video, and Amiga video output can be progressive, and these can be sent over RF modulation just fine. 2) Does not apply. TVs were not able to show 640x480 images 50 or 60 times per second; due to the interlacing, 240 lines are sent 50 or 60 times per second for half a picture, and thus a full frame of two 240-line fields is sent 25 or 30 times per second. – Justme Jul 16 '21 at 06:22
  • 1
    @AlexHajnal We had an old TV and bought the optional RF modulator about the size of a USB phone charger for our Nintendo 64 (which just slotted into the corresponding hollow around the AV Multi Out port), so I can confirm they were available as OEM modules at least that late. (As opposed to generic Radio Shack boxes that would take any composite signal and modulate it to channel 3 or 4.) – ssokolow Jul 16 '21 at 07:15
  • 2
    It doesn't matter if it's RF-modulated or not; CVBS or even S-video have the same format restrictions. – hobbs Jul 16 '21 at 13:41
  • @Justme you are actually not contradicting the answer. In order to use the full resolution of TV, you need to send the signal interlaced, because that is how the TV's of those days work. What you call "progressive" is simply sending the same image twice, so the normal frame and interlaced frames are the same. It is still interlaced video, as far as the TV is concerned, though. – fishinear Jul 16 '21 at 17:17
  • @Justme 2) does apply as well: the computer only needs to draw half the number of scanlines in each frame. That reduces the capacity of the GPU required. – fishinear Jul 16 '21 at 17:20
  • 1
    @fishinear There is a difference between sending same 240 lines as interlaced signal, and sending the same 240 lines as progressive signal. 1) There is no requirement to send only interlaced signal over RF modulator, otherwise your C64 or NES would not work with your TV over RF modulator, as they do not send interlaced signals to begin with. 2) Interlaced or progressive, there are only 240 lines per field that is drawn at the rate of 60 Hz. So you can't draw 480 lines at 60Hz. From a 480 line source, you can only draw 240 odd or 240 even lines in a single field and fields are sent at 60Hz. – Justme Jul 16 '21 at 20:01
  • Justme, fishinear: IMO, you're both right. My answer's a bit simplified (seemed appropriate to the question). For the full details on 480i vs. 240p refer to this question. – Alex Hajnal Jul 16 '21 at 20:29