The tick rate of old console games

Hi all, since I'm not sure if my question belongs in a specific section, I'll ask here. Does anyone know the common tick rate of old console games (amiga64, psx)? I wonder because I recently watched several longplays (where people play games until they're finished) and noticed that some of the games were rendered at 60 fps. Holy moly. I thought the amiga64 < psx in terms of processing power? Also, some of the early car sim games (Gran Turismo, anyone? and some rally game named after some dude) were available back then on the psx, which was amazing, simulating all of that at what felt like 30 Hz. Also, does the console (specifically the psx) dictate that the tick is hardwired to 30 Hz? I know that this is not the case with the ps4, but since I never saw any psx games with a >30 fps render rate, I'm not sure.

They did less, but they operated in lockstep with the TV's scanning. For NTSC screens that means roughly 60 frames per second. Going back even further, that rate was chosen because the power grid delivers power at 60 Hz, and keeping the picture synchronized with it reduced visible flicker.

Quite a few games from the 1970s used individual registers or special bits to control what would be displayed. If you wanted to change what appeared on screen, you needed to change those values as the TV scan line swept across. Many games did part of their processing while the CRT beam was retracing from the right edge back to the left (the horizontal blank, or h-blank) and a bigger part of their processing while the beam was jumping back to the top of the screen (the vertical blank, or v-blank).

Some games would intentionally skip every other frame so they could use that time for processing. A few used the alternating fields for special effects. That made them effectively 30 Hz.
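
To make that timing concrete, here's a minimal C sketch of that kind of main loop. The function names are placeholders, not any real console's API; on actual hardware they would poll a video status register or hang off the vblank interrupt.

#include <stdbool.h>

/* Hypothetical stubs: on a real console these would poll a video status
 * register or run from the vblank interrupt; here they only mark where
 * that work happens so the sketch compiles and runs on a PC. */
static void wait_for_vblank(void) { }          /* returns when vertical blanking begins */
static void read_input(void) { }
static void update_game(void) { }
static void update_display_registers(void) { } /* only safe while the beam is blanked */

int main(void)
{
    const bool half_rate = false;   /* true = run the logic every other field (~30 Hz on NTSC) */

    for (unsigned field = 0; field < 600; field++) {   /* bounded so the stub terminates */
        wait_for_vblank();              /* the ~60 Hz (NTSC) / ~50 Hz (PAL) heartbeat */
        update_display_registers();     /* push scroll/sprite state while nothing is being drawn */

        if (!half_rate || (field & 1) == 0) {
            read_input();
            update_game();              /* game logic is tied directly to the display rate */
        }
    }
    return 0;
}

The point is that the display, not a timer, is the clock: one pass through the loop per field, so the game "ticks" at whatever rate the TV refreshes.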

The first thing to keep in mind is that most older consoles don't "render" graphics, which is to say that they don't have frame buffers that they write image data into before displaying it. Instead, they process graphics data based on the settings of the control registers to generate the video signal scan-line by scan-line in real time. This is why they have strict graphical limitations on things like the number of background layers or the number of sprites displayable on a single scan-line. However this also means that on these consoles, drawing time is basically irrelevant. (Some early consoles, like the Atari 2600, don't have any kind of video memory, and the graphics are controlled entirely by display register settings. Games on those systems have to constantly adjust those settings from scan-line to scan-line during the active display portion of the frame.)
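
As a rough sketch of what "adjusting those settings from scan-line to scan-line" can look like, here's an imagined h-blank handler that rewrites a horizontal-scroll value on every line. The register, the table, and the line count are stand-ins, not any real console's memory map.

#include <stdint.h>

#define VISIBLE_LINES 224   /* illustrative; the real count depends on console and region */

/* Stand-in for a memory-mapped horizontal-scroll register.  Every console
 * puts this somewhere different, so here it is just a variable. */
static volatile uint16_t hscroll_reg;

/* Per-line scroll offsets, e.g. a sine table for a "wavy" background. */
static int16_t wave_table[VISIBLE_LINES];

/* Imagined h-blank handler, called once per scanline.  Because the picture is
 * generated line by line, a value written here only affects the lines that
 * haven't been drawn yet -- that is the whole trick behind per-scanline
 * effects, and (taken to the extreme) the Atari 2600 style of reprogramming
 * the display registers for nearly every line of the frame. */
static void on_hblank(int line)
{
    hscroll_reg = (uint16_t)wave_table[line];
}

int main(void)
{
    /* Host-side stand-in for one frame: on a console this "loop" is the
     * display hardware sweeping down the screen. */
    for (int line = 0; line < VISIBLE_LINES; line++)
        on_hblank(line);
    return 0;
}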

The end result of this is that almost all games on pre-3D consoles run at the display's refresh rate (60 Hz in Japan/North America/Brazil, 50 Hz in Europe/Australia) - with slowdown if there's a lot to process - though there are exceptions.

In the console world, it's not until the rise of 3D (N64, PS1) that it becomes commonplace for games to not run at 60 Hz - starting with these consoles, render time is also a factor in frame rate. Many games, in the interest of graphical fidelity, start allocating larger chunks of time to rendering, and as a result most games run at 30 Hz or even 20 Hz (especially common on N64 games), dropping even lower under heavy graphical load. However there are exceptions here too. For example, F-Zero X on the N64 runs at 60 Hz - and note the visual sacrifices they had to make to maintain that frame rate - and almost all 2D games of that era, like Castlevania: Symphony of the Night, also run at 60 Hz.
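
In scheduling terms those rates are just integer divisors of the refresh: sync every field for 60 Hz, every second field for 30 Hz (60 / 2), every third for 20 Hz (60 / 3). A minimal sketch of that pacing, with all the hardware calls stubbed out as placeholders:

/* Sketch of 3D-era frame pacing.  On real hardware vblank_count would be
 * incremented by the vertical-blank interrupt; here it is just a stub so
 * the program compiles and runs on a PC. */
static unsigned vblank_count;
static void wait_for_vblank(void) { vblank_count++; }  /* placeholder: blocks until next vblank */

static void read_input(void)   { }
static void update_game(void)  { }
static void render_frame(void) { }   /* the expensive part: draw calls, waiting on the GPU, etc. */

#define FIELDS_PER_FRAME 2   /* 60/2 = 30 Hz; use 3 for 20 Hz, 1 for 60 Hz */

int main(void)
{
    for (unsigned frame = 0; frame < 600; frame++) {   /* bounded so the stub terminates */
        unsigned frame_start = vblank_count;

        read_input();
        update_game();
        render_frame();

        /* Hold the new image back until a whole number of fields has passed,
         * so the game runs at an even divisor of the display rate instead of
         * tearing or wobbling between rates; a heavy scene that overruns its
         * budget simply waits for one more field, which is the slowdown. */
        while (vblank_count - frame_start < FIELDS_PER_FRAME)
            wait_for_vblank();
    }
    return 0;
}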

Wow, that's a lot of information. Btw I made a typo. I meant to compare n64 vs psx (not amiga vs psx). But I got my question answered. Thanks.

Slightly newer, but when I worked in the Xbox/PS2 era it was considered a hard requirement (where I worked, anyway, and I think it came from the publisher) that THOU SHALT NEVER DROP A FRAME. So it had to run at 50/60 Hz no matter what, which is pretty hard to do.

I remember we used to have meetings where we'd watch new games to get ideas and keep up with the industry; one game was quite unusual in running at 'half speed' and garnered quite a lot of discussion.

The explanation above is all correct, but I think the distinction between "rendering" and what older systems did is a bit artificial, or at least blurry. For example, there are a fair number of SEGA Genesis games that actually do render to an off-screen buffer before displaying the image, although it's done in a fairly hacky way -- backgrounds are made of 8x8 tiles (probably because the background hardware is essentially a modified text mode), but if you ignore the tilemap completely and write directly into the tile data, you can even display software-rendered 3D graphics.
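
To make the "write directly into the tile data" idea concrete, here's a small C sketch that treats a linear run of 8x8, 4-bits-per-pixel tiles as a framebuffer. The 32-bytes-per-tile layout with the left pixel in the high nibble matches the Genesis VDP as far as I recall, but the buffer size and function names are just illustrative, and on real hardware the buffer would live in work RAM and be copied or DMA'd into VRAM during vblank rather than written to directly.

#include <stdint.h>
#include <string.h>

#define FB_W 256                     /* illustrative "screen" size in pixels */
#define FB_H 160
#define TILES_PER_ROW (FB_W / 8)

/* One 8x8 tile = 32 bytes at 4 bpp: 4 bytes per row, two pixels per byte. */
static uint8_t tile_buffer[(FB_W / 8) * (FB_H / 8) * 32];

/* Plot one pixel (palette index 0-15) by working out which tile, which row
 * inside the tile, and which nibble of that row the pixel lands in, assuming
 * the tilemap is laid out linearly across the drawing area. */
static void plot(int x, int y, uint8_t color)
{
    int tile   = (y / 8) * TILES_PER_ROW + (x / 8);
    int offset = tile * 32 + (y % 8) * 4 + (x % 8) / 2;

    if (x & 1)   /* right pixel of the pair: low nibble */
        tile_buffer[offset] = (uint8_t)((tile_buffer[offset] & 0xF0) | (color & 0x0F));
    else         /* left pixel of the pair: high nibble */
        tile_buffer[offset] = (uint8_t)((tile_buffer[offset] & 0x0F) | ((color & 0x0F) << 4));
}

int main(void)
{
    memset(tile_buffer, 0, sizeof tile_buffer);
    /* Draw a diagonal line -- the kind of per-pixel work a software renderer
     * would do before the whole buffer is shipped off to VRAM each frame. */
    for (int i = 0; i < FB_H; i++)
        plot(i, i, 15);
    return 0;
}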

I don't really see an essential difference between this and a modern GPU. In both cases you have dedicated video hardware that can perform certain operations natively (whether it's sprites, line scrolling, etc. on old systems or polygons, shaders, etc. on new ones) at the expense of some flexibility.

-~-The Cow of Darkness-~-

Well, yes, there are some games from that era that used some of their tile RAM as a low-resolution, limited-color-depth frame buffer, which I left out for the sake of simplicity.

This includes games that just use it in a limited manner for special effects (like The Legend of Zelda: A Link to the Past on the SNES, which renders the Triforce and maiden crystals at run-time using 3D models), as well as some that use it for the main gameplay, like Zero Tolerance (Genesis) or Wolfenstein 3D (SNES). The latter kind also often runs at lower frame rates, because the consoles don't have enough VRAM bandwidth to update a buffer like that at 60 fps even when the CPU itself isn't the bottleneck.
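
To put rough, purely illustrative numbers on that: a 256 x 160 window at 4 bits per pixel is 256 x 160 / 2 = 20,480 bytes, so refreshing all of it every field means pushing about 20 KB x 60 ≈ 1.2 MB/s into VRAM, much of which has to be squeezed into the blanking periods on these machines, on top of the CPU time spent actually rendering the image. Halving the frame rate or shrinking the window cuts that requirement proportionally, which is exactly the trade-off those games make.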

