Sony’s PlayStation 5 and Microsoft’s Xbox Series X gaming consoles will deliver performance and capabilities previously seen only in high-end PC gaming rigs: We’re talking frame rates up to 120 frames per second (fps), resolution up to 8K (that’s 7,680 x 4,320 pixels), HDMI 2.1, and more.
While pretty much any TV outfitted with HDMI will be compatible with these new consoles, that just means you’ll get an image and be able to play games. Three factors are key to realizing all the visual excitement that this new hardware is capable of delivering: Your TV must have a high hardware refresh rate, it must exhibit low input lag, and it must be able to sync its frame rates with the graphics processor (GPU) in whichever next-gen console you decide to buy.
You can pretty much ignore 8K support as a requirement for these new consoles; frankly, we might never see a real-world game that supports that resolution. But let’s explore those three other factors now.
A TV’s true, aka hardware, refresh rate is the number of times per second it can completely redraw its entire screen. In other words, refresh rate indicates the number of times each second the TV can turn all its pixels on and off. And on an 8K TV, you’re talking 33 million pixels. Sadly, few vendors put this spec where it’s easy to find.
What you’ll get instead are proprietary metrics such as Motion Rate (Samsung), TruMotion (LG), MotionFlow (Sony), and so on. These specs reflect the hardware refresh rate, plus numbers representing the tricks used to remove judder, reduce motion blur, and compensate for other visual anomalies. Such numbers are indicative of the relative motion proficiency of a given company’s own TV models, but they’re useless for evaluating a TV’s potential performance with a next-gen gaming console.
What you want is a TV with a 120Hz hardware refresh rate. Dig around the user manual, manufacturer’s website, or other source until you find that spec. Why 120Hz? Hertz (Hz) is a unit of frequency equal to one cycle per second, so 120Hz equals 120 cycles per second. The new consoles will be able to render games at up to 120 frames per second—one frame per cycle—and if your TV can’t keep up, some of those frames will be wasted. Even worse, the TV has to deal with the excess in some way, deciding, for example—and if possible—which frames to display and which to drop. It’s a processing nightmare for the TV that generally leads to disconcerting visual artifacts that can undermine gameplay—especially in multiplayer games where other players are using equipment not bound by the same limitations. That said, not every game will render at 120 frames per second.
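The frame-pacing math behind that mismatch is simple. Here’s a rough Python sketch (the function name is mine, purely illustrative):

```python
# Interval between refreshes (or rendered frames), in milliseconds.
def frame_interval_ms(rate_hz: float) -> float:
    return 1000.0 / rate_hz

# A 120Hz panel redraws every ~8.3ms, matching a console rendering
# at 120fps one-for-one. A 60Hz panel redraws only every ~16.7ms,
# so at 120fps it receives two frames per refresh and must discard one.
print(round(frame_interval_ms(120), 1))  # 8.3
print(round(frame_interval_ms(60), 1))   # 16.7
```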
The other reason you want a 120Hz refresh rate is to reduce latency, aka input lag. This is the amount of time that elapses between the instant your console’s GPU generates a frame and the instant that frame appears on your TV. Generally speaking, a TV with a 60Hz refresh rate will lag a minimum of 9 to 10 milliseconds when in game mode, which can be noticeable if not obvious.
A TV with a 120Hz refresh rate, assuming it’s fed a 120Hz signal, can halve that latency to a barely discernible 5ms. Most dedicated gaming monitors are rated to have around 4ms of lag, so 5ms isn’t much of a deficit. Alas, variable refresh rates, depending on the type (described below), can increase lag by as much as 10ms. Note also that lag can soar to an unbearable 100ms on TVs that don’t have a game mode that’s designed to remove lag. 100ms is like trying to sprint in sand. You need game mode.
The good news is that while a 120Hz TV will cost more than a 60Hz set, the higher spec is no longer solely the province of high-end, multi-thousand-dollar TVs. There are a number of very good 120Hz TVs that cost less than $1,000, and nearly all of them feature some sort of game mode.
Variable refresh rate
The images/frames in a movie or TV show are fed to the display at a constant rate. A game’s images, by contrast, are constructed and rendered on the fly; as such, they are not delivered to the display at a constant rate. Scenes with large amounts of detail or high numbers of objects might arrive at 60 fps or less, while less complex ones might hit the TV at the full 120 fps. The resolution and detail settings you configure in the game will greatly affect the equation.
Without variable refresh rate (VRR), the TV will refresh at a constant rate of 120 times each second even if the console’s GPU is generating only 92 frames each second. That can lead to artifacts such as screen tearing, where the next frame starts drawing before the old one is finished.
VRR allows the TV to synchronize its refresh rate to the number of frames being delivered at any given moment in time. So, if the GPU is generating 92 fps, the TV will refresh its display 92 times each second. If the GPU drops to generating 43 fps, the TV will reduce its refresh rate to 43Hz, and so on. You can see the difference between VRR and no VRR using this handy online demo at testufo.com.
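As a toy model (not any vendor’s actual algorithm), VRR amounts to the panel tracking the GPU’s output, capped at the panel’s maximum:

```python
# Toy model of VRR: the panel matches its refresh rate to whatever
# frame rate the GPU is producing, up to the panel's 120Hz ceiling.
def vrr_refresh_hz(gpu_fps: float, panel_max_hz: float = 120.0) -> float:
    return min(gpu_fps, panel_max_hz)

print(vrr_refresh_hz(92))   # 92 -> panel refreshes 92 times/sec
print(vrr_refresh_hz(43))   # 43 -> panel slows to match
print(vrr_refresh_hz(144))  # 120 -> capped at the panel's maximum
```

Real implementations also enforce a lower bound on the VRR window (below which frames are repeated to stay in range), but the matching principle is the same.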
The concept of a variable refresh rate originated on the PC side of the business and is currently implemented in a number of different forms: There’s FreeSync and FreeSync 2, from AMD; G-Sync, G-Sync Ultimate, and G-Sync Compatible, from Nvidia; and there’s even an extension to the older HDMI spec. Both the Xbox Series X and PS5 are built around AMD processors, so if you see FreeSync on the TV box, you should be good. FreeSync is more commonly found in TVs, although LG has adopted both FreeSync and G-Sync on some of its OLED TVs.
The problem with VRR, as mentioned above, is that while it eliminates artifacts, it can add latency. How much depends on the implementation, but it’s around 10 to 11 milliseconds for the commonly used FreeSync. You’ll need to decide which is more important to your gaming experience—a few visual artifacts or better response time.
VRR is now part of the HDMI 2.1 spec, and any TV that supports HDMI 2.1 should be golden—emphasis on should. Do your research to be sure.
HDMI 2.1, implemented in both the PS5 and Xbox Series X, offers all the capabilities a TV needs to accommodate a next-gen console experience at full performance—including the bandwidth for fast 8K UHD and pristine 7.1-channel surround. There is, however, a catch.
Vendors are free to implement any subset of the specification they please. In other words, a TV marketed as supporting HDMI 2.1 doesn’t necessarily offer everything that the new Xbox and PS5 need. It probably does, but read the fine print just in case. Note that it’s also possible for a manufacturer to issue a firmware update for a TV that has HDMI 2.1 to add more HDMI 2.1 features, as long as the high-bandwidth chip is in place. A footnote from the LG webpage linked to earlier, for example, discloses that “FreeSync may not be available at the time of purchase of this product.” Samsung has updated its 2019 model-year TVs that support HDMI 2.1 at least once.
While an appropriate implementation of HDMI 2.1 is ideal, you don’t necessarily need it. HDMI 2.0 already supports 120Hz 4K UHD, albeit with half the color data because it’s bandwidth constrained. Your games will be less vibrant—how could they not be with half the color data present—but you won’t know what you’re missing until you see it on another TV. Bandwidth isn’t an issue with HDMI 2.1, so it’s able to deliver all the color data. Any 8K display will have HDMI 2.1 in place.
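The bandwidth argument checks out with back-of-the-envelope arithmetic. The sketch below ignores blanking and encoding overhead, so the figures are approximate:

```python
# Rough uncompressed video data rate in gigabits per second,
# ignoring blanking intervals and link-encoding overhead.
def data_rate_gbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    return width * height * fps * bits_per_pixel / 1e9

# 4K at 120Hz with full 4:4:4 color at 8 bits per channel (24 bpp).
full_color = data_rate_gbps(3840, 2160, 120, 24)       # ~23.9 Gbps
# 4:2:0 chroma subsampling halves the data to 12 bpp.
half_color = data_rate_gbps(3840, 2160, 120, 12)       # ~11.9 Gbps
# HDMI 2.0's payload tops out around 14.4 Gbps, so only the
# subsampled signal fits; HDMI 2.1's much larger budget carries both.
print(round(full_color, 1), round(half_color, 1))
```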
High dynamic range and wide color gamut
High dynamic range (HDR) imparts a more dramatic difference between light and dark areas, and wide color gamut (WCG, aka “deep color”) makes for a richer palette. These effects can be visually dramatic, making your journeys into virtual worlds more intense, for lack of a better word. In fact, once you grow accustomed to them, standard dynamic range and normal color seem bland and washed out in comparison.
But it’s not enough for a TV to simply have these features; the display needs to be bright enough to render them effectively. So, look for a TV that can produce at least 700 nits of peak brightness (for an LCD TV, that is; OLED TVs have self-emissive pixels and don’t need to be as bright). Both types of TV should offer 10-bit color with support for HDR10 and/or Dolby Vision. HDR10+, a Dolby Vision-like standard with dynamic metadata, is nice to have as well. It’s likely that any TV with the 120Hz refresh rate you want will support WCG and at least HDR10, but check to make sure.
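The jump from 8-bit to 10-bit color is bigger than it sounds. A quick calculation:

```python
# Number of distinct colors at a given per-channel bit depth
# (three channels: red, green, blue).
def color_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"{color_count(8):,}")   # 16,777,216 -> standard 8-bit color
print(f"{color_count(10):,}")  # 1,073,741,824 -> 10-bit, ~1.07 billion
```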
In a nutshell
To summarize: For the best gaming experience with your next-gen console, you’ll want a TV with a 120Hz hardware refresh rate, game mode, and variable refresh rate (FreeSync). HDR and WCG support is icing on the cake, and definitely recommended for the most enjoyable gaming experience.
Not that you can’t play on a 60Hz TV with gaming mode and no VRR, but if you’re ponying up $300 to $500 for a new console, matching it with the appropriate TV (or gaming monitor) will deliver the most gratifying experience and leave nothing on the table.
Going for an 8K TV will cost a lot more, and you’ll probably find yourself gaming at 4K or even lower resolution anyway just to achieve faster frame rates. An 8K UHD TV, however, is likely to have all the features you need. Also, the Samsung 8K TVs I’ve reviewed render 4K UHD content considerably better than native 4K UHD TVs, thanks to intelligent upscaling. That likely doesn’t apply to gaming; but hey, nobody buys a TV just to play games.