What’s the Deal With Frames?
To ensure everyone’s on the same page, the frame rate of a video game is the number of unique sequential images displayed on-screen each second. These frames are “rendered” by the hardware running the game and sent to the display.
There are two main reasons that higher frame rates can improve the experience and presentation of video games. First, more frames equate to smoother motion on the screen. The lower the frame rate, the greater the gap in visible motion. Go low enough, and things look like a slide show; keep the frame rate high enough, and things look smooth.
How low is too low? Traditional hand-drawn animation is typically done at 12 drawings per second (with higher rates for action scenes). Cinematic films are shot at 24fps, and TV content tends to run at 30fps. Action camera footage is usually captured at 60fps since it’s likely to contain fast motion.
This may make it sound like 12fps or 24fps should be fine for a video game, but that’s only half the story. Since video games are interactive and respond to the player’s control inputs, it’s about more than what your eye can see. It’s also about how quickly what you see on screen reflects what you’re telling the game to do with the controls.
This diagram indicates how often the game world updates at different rates. Notice how large the intervals are at 30fps compared to 60fps.
At 60fps, the on-screen game state updates twice as often as at 30fps, which means it should take roughly half as long for your inputs and in-game events to be reflected back to you. It’s helpful to think of frame rate as a “temporal” resolution: the more frames rendered, the more information you get about what’s happening.
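If you like seeing the arithmetic, here’s a quick Python sketch (ours, for illustration only) that works out how long each frame persists on screen at a few common frame rates:

```python
# Frame interval: how long each rendered frame stays on screen.
# interval_ms = 1000 / fps

for fps in (30, 40, 60, 120):
    print(f"{fps:>3} fps -> a new frame every {1000 / fps:.1f} ms")

# Output:
#  30 fps -> a new frame every 33.3 ms
#  40 fps -> a new frame every 25.0 ms
#  60 fps -> a new frame every 16.7 ms
# 120 fps -> a new frame every 8.3 ms
```

Doubling the frame rate halves the gap between updates, which is exactly the “temporal resolution” idea above.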
Why 30fps or 60fps in Particular?
Frame rate numbers are arbitrary, and a console can (and does) produce frames at any rate up to its technological limits. So why focus on 30fps and 60fps? It has to do with the refresh rate of the displays we use and the broadcast standards for television.
Historically, in NTSC regions (such as the USA), televisions refreshed at roughly 60Hz, which means a maximum of 60fps could be displayed. In PAL territories, televisions ran at 50Hz, which equates to a maximum of 50fps.
Here you can see the RF (Radio Frequency) modulator for an old Sega Mega Drive marked as “PAL,” which means it expects a TV with a 50Hz refresh rate.
If a game could not run at the display’s full refresh rate, the next best solution was to target half of it, since each rendered frame could then persist for exactly two refreshes, keeping frame pacing even. If the frame rate does not divide evenly into the display’s refresh rate, the result is stutter and “torn” frames. This is still relevant today, since most modern TVs are limited to 60Hz.
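As a rough illustration of why even division matters, the sketch below (our own example, not code from any console) checks how many display refreshes each frame would cover at a given target:

```python
# Check whether a frame-rate target fits cleanly into a display's refresh rate.
# A clean fit means every frame persists for the same whole number of refreshes.

def pacing(refresh_hz: int, target_fps: int) -> str:
    refreshes_per_frame = refresh_hz / target_fps
    if refreshes_per_frame.is_integer():
        return (f"{target_fps} fps on {refresh_hz} Hz: each frame held for "
                f"{int(refreshes_per_frame)} refresh(es) -- even pacing")
    return (f"{target_fps} fps on {refresh_hz} Hz: {refreshes_per_frame:.2f} "
            f"refreshes per frame -- uneven pacing (stutter)")

for fps in (60, 30, 45, 40):
    print(pacing(60, fps))
```

Only 60fps and 30fps come out even on a 60Hz display; 45fps and 40fps leave fractional refreshes per frame, which is where the stutter comes from.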
You might think, “But what about 24fps films on TV?” That’s a problem that still exists today, and different TVs deal with the discrepancy in different ways. The most common approach, 3:2 pulldown, shows alternate film frames for two and then three refreshes, and that uneven cadence produces the choppy motion known as “pulldown judder.” It’s why panning shots look so jumpy when a 24fps movie is shown on a 60Hz TV.
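For the curious, here’s a toy sketch of the general 3:2 pulldown idea (an illustrative assumption, not any particular TV’s implementation) showing how 24 film frames are stretched across 60 refreshes:

```python
# 3:2 pulldown: alternate film frames are shown for 2 and 3 refreshes
# so that 24 film frames fill 60 display refreshes each second.

def pulldown_pattern(film_frames: int) -> list[int]:
    # Even-numbered frames get 2 refreshes, odd-numbered frames get 3.
    return [2 if i % 2 == 0 else 3 for i in range(film_frames)]

pattern = pulldown_pattern(24)
print(pattern[:8])   # [2, 3, 2, 3, 2, 3, 2, 3] -- the uneven cadence you see as judder
print(sum(pattern))  # 60 refreshes, i.e. exactly one second at 60 Hz
```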
These days, Variable Refresh Rate (VRR) TVs and monitors solve this issue by letting the display match its refresh rate to whatever frame rate it is being fed, but it will be some time before this feature becomes the norm.
Some console games now offer a 40fps mode for players with a 120Hz display, since 40 divides evenly into 120 but not into 60. This offers a good middle ground between the snappy response time of 60fps and the lower hardware requirements of 30fps.
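Reusing the same arithmetic as before, a quick sketch (again, just an illustration) shows why 40fps only paces evenly on a 120Hz panel, and why its frame time sits neatly between the 30fps and 60fps budgets:

```python
# 40 fps only fits evenly into a 120 Hz display (3 refreshes per frame),
# and its 25 ms frame time falls between the 60 fps and 30 fps budgets.

for refresh_hz in (60, 120):
    refreshes = refresh_hz / 40
    fit = "even" if refreshes.is_integer() else "uneven"
    print(f"40 fps on {refresh_hz} Hz: {refreshes:g} refreshes per frame ({fit})")

for fps in (60, 40, 30):
    print(f"{fps} fps frame time: {1000 / fps:.1f} ms")
```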
The Incentive for 30fps Is Powerful
The big question is: why are there games on current-generation consoles that only offer the option to run at 30fps (or 40fps), when those consoles are so much more powerful than their predecessors?
There are two factors at play here. The first is that a console represents a fixed pool of performance resources. This means facing an iron triangle of resolution, game complexity, and frame rate. Every game must balance these three aspects to produce a satisfactory final experience.
Since the vast majority of displays used by console owners are (for now) 60Hz displays, there are only two viable frame rate targets: 30fps and 60fps.
Note that, below 30fps, only 1, 2, 3, 4, 5, 6, 10, 12, 15, and 20 frames per second divide evenly into a 60Hz refresh rate. Unfortunately, no number between 30 and 60 divides evenly into 60, and none of the numbers below 30fps offers what most players would consider a playable experience. So even though a given game might manage a stable 35 or 45 frames per second, there’s no way to display that correctly on a typical television.
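If you want to sanity-check that claim, a one-liner like this (purely illustrative) lists every whole frame rate that fits evenly into a 60Hz refresh cycle:

```python
# Every whole-number frame rate that divides evenly into 60 Hz.
even_targets = [fps for fps in range(1, 61) if 60 % fps == 0]
print(even_targets)  # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
# Nothing between 30 and 60 -- hence the two viable console targets.
```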
So why not just target 60fps? This is where the fixed-platform nature of consoles comes into play. The 30fps target offers the largest “frame time” at the lowest frame rate that players will tolerate: roughly 33.3 milliseconds per frame, versus just 16.7 milliseconds at 60fps. The more time available to render each frame, the more detail and effects can be packed into it. It also gives the CPU more time to do calculations that may have nothing to do with graphics. So, for example, your open-world game can have more sophisticated AI or physics simulations because the CPU has more time between each frame to do the work.
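To make the trade-off concrete, here’s a toy frame-budget check; the per-frame millisecond costs are invented for illustration, not measurements from any real game:

```python
# A frame only ships on time if all per-frame work fits inside the frame budget.
# The costs below are made-up numbers for illustration.

frame_work_ms = {
    "rendering": 18.0,  # draw calls, lighting, post-processing
    "ai": 6.0,          # pathfinding, behaviour updates
    "physics": 5.0,     # collision and simulation
}

total_ms = sum(frame_work_ms.values())  # 29.0 ms of work per frame

for fps in (30, 60):
    budget_ms = 1000 / fps
    verdict = "fits" if total_ms <= budget_ms else "misses the target"
    print(f"{fps} fps budget {budget_ms:.1f} ms vs {total_ms:.1f} ms of work: {verdict}")
```

Under these made-up numbers, the same workload fits comfortably inside a 33.3ms budget but blows well past 16.7ms, which is exactly the incentive developers face.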
As each new game tries to outdo the last in visuals and features, having twice the frame time to achieve your goals is incredibly tempting. After all, screenshots and 30fps trailers don’t show off high frame rates!
People Don’t Change
Will there be a point where 30fps console games are a thing of the past for good? The problem here isn’t one of technology but one of people. As long as the average console player is happy to play a game at 30fps, then developers will be happy to take advantage of that large frame time to paint their canvas.
30fps will likely remain the lower acceptable bound in perpetuity. However, as TVs and monitors that can display at any arbitrary frame rate become widespread, we’re likely to see more games swing freely between 30 and 60 frames per second or run at fixed frame rates such as 45 or 50. Still, we suspect that even future consoles with ten times the power of today’s systems will gravitate to the lowest frame rates that players will accept in pursuit of the most dazzling visuals possible.