As we have come to expect, the picture quality of flat-panel TVs improves with each generation, while their performance differences diminish. This was clearly demonstrated at the 2019 Value Electronics TV Shootout, held June 12 during CE Week, a mid-year gathering of consumer-electronics companies and professionals held at the Javits Center in New York City.
The annual event was started in 2004 by Robert Zohn, owner of Scarsdale, New York-based retailer Value Electronics, and this year’s shootout was a real squeaker. Four flagship TVs were judged by eight video professionals in four categories, each with the same four performance attributes, and many of the scores were closer than I’ve ever seen.
The overall winner was the Sony XBR-65A9G OLED TV. The LG 65C9P OLED, however, came within five percent of the A9G’s scores in many attributes across the four categories.
The contenders for the 2019 Value Electronics King of TVs included four 65-inch 4K, HDR-capable models: the LG 65C9P OLED ($3,499.99), Samsung QN65Q90R LED-backlit LCD ($3,499.99), Sony XBR-65A9G OLED ($3,799.99), and Sony XBR-65Z9F LED-backlit LCD ($2,999.99). Both OLED TVs use a WRGB panel supplied by LG Display, but each company applies its own electronics and processing algorithms. The Samsung Q90R uses quantum dots in its backlight, while the Sony Z9F employs a conventional LED backlight. In addition, the Samsung has more than three times the number of local-dimming zones as the Sony.
If you’ve been following the Value Electronics TV shootout lately, you might recall that the Sony Z9F was a contender last year. So why include it again this year? Sony’s ultimate flagship LED-backlit LCD TV for 2019 is the Z9G, an 8K display available in screen sizes of 85 and 98 inches, so it would be unfair to include it alongside 65-inch 4K displays. The Z9F is Sony’s flagship 4K LED-backlit LCD TV. It’s available in a 65-inch size, and it will remain in Sony’s lineup throughout 2019.
Also in the lineup was a Sony BVM-X300 professional monitor, which is used in many color-grading and mastering facilities. In fact, many of the movies and TV shows you know and love were mastered on an X300. It’s a 30-inch RGB OLED monitor with a peak light output of 1,000 nits and a price tag of $42,000. It was not in contention with the other sets, of course; it was used as a reference. One primary goal of the shootout was to determine which consumer TV came closest to matching the image on the X300—in other words, which TV most closely reproduced the image intended by content creators.
After 200 hours of break-in, all four contenders and the X300 were fully calibrated by Kevin Miller, one of the top ISF-certified calibrators in the business, and John Reformato, Value Electronics’ in-house, ISF-certified calibrator. Their tools included the Jeti Spectraval 1511 and Colorimetry Research CR-250 spectroradiometers to profile Klein K10 colorimeters along with the latest version of SpectraCal’s CalMan software.
They started by calibrating the X300, applying the so-called Judd modification. For those who are unfamiliar with this modification, it came about when video professionals noticed that RGB OLED monitors had a greenish cast when calibrated to the standard D65 white point (CIE coordinates x=0.313, y=0.327) as the grayscale target. The greenish cast is a perceptual effect that occurs because the spectral power distribution of RGB OLED displays differs from that of other types of displays. Using research conducted by physicist Deane Judd in the 1950s and extensive perceptual testing, it was determined that a white point with coordinates x=0.307, y=0.318 yields the most neutral-looking white on RGB OLED displays.
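The xy chromaticity coordinates cited above can be converted to XYZ tristimulus values, the form most color math works in. A minimal sketch (standard CIE conversion, with luminance Y normalized to 1.0) comparing the two white-point targets:

```python
def xy_to_XYZ(x, y, Y=1.0):
    """Convert CIE xy chromaticity coordinates (plus luminance Y) to XYZ tristimulus values."""
    X = (x / y) * Y
    Z = ((1.0 - x - y) / y) * Y
    return (X, Y, Z)

# Standard D65 white point vs. the Judd-modified target used on the X300
d65  = xy_to_XYZ(0.313, 0.327)
judd = xy_to_XYZ(0.307, 0.318)
```

The two targets differ by only a few thousandths in xy, but that is enough for a trained eye to see a green tint on an RGB OLED calibrated to plain D65.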
Next, Kevin and John displayed a 25-percent white window at 100-percent luminance on all five screens and adjusted the grayscale-gain control on each consumer TV so that the color of white visually matched the X300 as closely as possible. This was done entirely by eye—and both calibrators have highly trained eyes.
Once the TVs all visually matched the X300, they measured the coordinates of the resulting white point of each TV, which were not necessarily the same for all of them. Finally, they performed a full grayscale calibration on each TV using its measured white point as the target. Fortunately, CalMan lets calibrators create a custom color space with any white-point coordinates.
They calibrated the Samsung Q90R with its 20-point manual controls. The LG and Sony sets offer automatic calibration with CalMan, which Kevin and John used on the A9G, C9P, and Z9F. They didn’t use autocal on the Samsung because it’s not yet implemented in CalMan for the 2019 Samsung TVs.
Kevin noticed that the Samsung’s HDR EOTF did not track the PQ (SMPTE ST 2084) target perfectly. (EOTF stands for Electro-Optical Transfer Function, the function by which a display converts a video signal into light.) So he lowered the set’s contrast control a bit, which brought the EOTF in line with the target. That reduced the set’s peak luminance slightly, but it also improved bright detail. After that adjustment, the Samsung’s image matched the X300’s beautifully in HDR.
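The PQ EOTF that Kevin was tracking against is fully defined in SMPTE ST 2084. As a minimal sketch, it maps a normalized 0–1 signal value to absolute luminance in nits (cd/m²), topping out at 10,000 nits; the constants below come from the standard:

```python
# SMPTE ST 2084 (PQ) EOTF constants
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Return absolute luminance in nits for a normalized PQ signal in [0, 1]."""
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
```

When a display’s measured luminance at each signal level deviates from this curve, calibrators say the EOTF “doesn’t track,” which is exactly what the contrast adjustment on the Samsung corrected.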
The distribution system used during calibration and the shootout itself was provided and operated by Matt Murray from AVPro. Source devices included AVPro’s new Murideo Seven test-pattern generator (this was its first public appearance) as well as a previous-generation Murideo Six-G pattern generator. Another source of test patterns was the new Spears & Munsil UHD Benchmark test disc (which is about to be released for sale to the public), played from an Oppo UDP-203 UHD Blu-ray player.
Real-world content was provided by a Kaleidescape Strato S server, which plays titles downloaded from its store that are at least as good as—and, in many cases, better than—the same content on UHD Blu-ray. This server does not support Dolby Vision, however, only HDR10.
Finally, two Panasonic DP-UB9000 UHD Blu-ray players were used to play clips from Aquaman in Dolby Vision on the Sony and LG TVs, and HDR10 on the Samsung and X300, which do not support Dolby Vision. I’ll discuss this in more detail shortly.
All the sources were connected to the inputs on an AVPro Edge AUHD 16x16 HDMI switcher. Outputs from the switcher were connected to the calibrated inputs of the TVs and X300. The cables were all 10-meter Metra Home Theater Velox passive (copper) HDMI cables, which can convey a maximum of 24Gbps—more than enough for HDMI 2.0 at 18Gbps.
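The 18Gbps figure for HDMI 2.0 can be sanity-checked with quick arithmetic, assuming the standard CTA-861 4K60 timing (4400 x 2250 pixels total, including blanking) and the 8b/10b overhead of TMDS encoding:

```python
# Back-of-the-envelope HDMI 2.0 bandwidth check (assumes 4K60 CTA-861 timing)
pixel_clock_hz = 4400 * 2250 * 60   # total timing incl. blanking -> 594 MHz
bits_per_channel = 10               # 8 data bits carried in 10 (TMDS 8b/10b)
channels = 3                        # three TMDS data channels
total_gbps = pixel_clock_hz * bits_per_channel * channels / 1e9
print(total_gbps)  # 17.82, marketed as "18Gbps"
```

Hence 24Gbps cables have comfortable headroom for any HDMI 2.0 signal.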
The content we watched during the shootout included test patterns from the Murideo pattern generators and the UHD Benchmark test disc. These included a 4x4 ANSI checkerboard, SMPTE color bars, motion-resolution patterns, and zone plates, among others.
For real-world content, we used clips from a variety of movies in UHD/HDR and HD/SDR. They included Mad Max: Fury Road and The Revenant for dynamic range; Aquaman, 2001: A Space Odyssey, and Planet Earth for color saturation and accuracy; Baby Driver and Mission: Impossible—Fallout for video processing and motion resolution; The Art of Flight (HD/SDR only) for color and dynamic range with the lights on; and Our Planet and Lucifer (UHD/HDR only) from Netflix for streaming.
Before the formal judging got underway, we wanted to take a look at the difference between Dolby Vision and HDR10. We cued up a clip from Aquaman in Dolby Vision from one of the Panasonic players to show on the LG C9P and the Sony A9G and Z9F, and the same clip in HDR10 from the other Panasonic player to show on the Samsung Q90R and Sony X300, neither of which supports Dolby Vision.
Interestingly, the two HDR formats looked very similar, with much less difference than we expected between them—especially since they use different display technologies. This makes some sense on the Samsung, because it synthesizes its own dynamic metadata, but the X300 does not, and it looked surprisingly similar to Dolby Vision as well. Clearly, the HDR10 version was well mastered on that disc.
Here come the judges
There were eight official judges at this year’s TV Shootout, all of whom are professionals in the video industry. They included two TV reviewers: Channa De Silva, an independent reviewer at 4KHomeTheaterReview.com; and Greg Tarr, managing editor at HDGuru.com. From the content-production side, we had David Mackenzie, founder and video compressionist at Fidelity in Motion; David Medina, manager of media operations at HBO; and Giles Sherwood, post production supervisor and color scientist at Criterion. The remaining judges included Michael Reuben, retired Blu-ray reviewer at Blu-ray.com; Bill Schindler, a long-time video-industry consultant; and Brandon Yates, a video engineer and co-owner of Yates & Parks Consulting.
Joel Silver, founder and president of the Imaging Science Foundation, was instrumental in organizing the entire event. Throughout his extensive career, Joel has tirelessly advocated for the importance of video calibration to reproduce the artistic vision of content creators as closely as possible. He has been deeply involved in the TV Shootout for many years to assure that it adheres to that ideal.
The judges were tasked with evaluating each contender in four main categories: SDR Day (lights on), SDR Reference (lights off), HDR Reference (lights off), and Streaming (lights on and off) using each set’s internal Netflix app. The Streaming category is new this year; it was added because Robert Zohn and the other staff members recognize that the image quality of streaming has improved over the last few years. In fact, Kevin Miller, Joel Silver, and other staffers admitted they now watch streamed content in their home theaters, which they hadn’t even a year ago.
Within each category, each set was evaluated for four perceptual attributes: dynamic range, color saturation, color accuracy, and motion resolution. These attributes were defined by Joel Silver to include several specific elements. For example, the dynamic-range attribute included peak brightness, quality of black, and perceived contrast ratio, while color accuracy included color tracking, decoding, and processing as well as EOTF, which determines how accurate colors are at all brightness levels. The motion-resolution attribute included 2K-to-4K upscaling, motion processing with “enhancements” disabled, and noise reduction.
Results of the TV judging
After five hours of evaluation, the judges’ ballots were tabulated, and the results are in: The average scores for each attribute within each category, along with the winner of each category, are shown in the tables below:
As you can see, the Sony XBR-65A9G won the SDR Day, SDR Reference, and HDR Reference categories, and it tied the LG 65C9P in the Streaming category. But in many cases, the difference in scores between the top two sets—the OLED TVs—was within 5 percent. As Joel Silver remarked, “I’m glad I’m not a judge!”
The two LED-backlit LCD TVs were certainly no slouches—the Sony Z9F got the highest score for dynamic range in the SDR Day category. Its blacks, however, were noticeably elevated compared with the Samsung Q90R’s. The Z9F also exhibited much more haloing around a small bright object on a dark background, which is not surprising, since it has far fewer local-dimming zones than the Q90R. The Samsung, on the other hand, tended to crush low-level detail.
In the end, the Sony XBR-65A9G wears the crown as King of TVs for 2019, according to this panel of judges. But if you’re shopping for a premium TV, you can’t go wrong with either the A9G or the LG C9P.
The entire 2019 Value Electronics TV Shootout was captured on video; check it out on YouTube here.