Somewhere along the line, thin and light supplanted picture quality as the secondary consideration for TV purchases; the primary consideration for most of us being price. Perhaps it was the memory of the CRT behemoths that preceded the LCD revolution, or maybe the difficulty in wall-mounting the first generation of heavier LCD TVs. But when the first lightweight, thin-panel LCD TVs showed up, they disappeared from store shelves in a hurry.
To create these thinner and lighter flat-panel TVs, as well as to conserve energy, TV manufacturers turned to LEDs, which produce an enormous amount of light for their size. They were deployed as distributed side-lighting at first, and then in arrays directly behind the panel, as shown below.
But there was a casualty: Color. Instead of relatively pure, saturated colors, you got light orange-ish reds, yellow-ish greens, and so on. If you’ve seen an LED flashlight, especially one from a few years ago, you probably have an inkling why. The light they produce is cold, harsh, icy… Pick your adjective. That’s because “white” LED light skews heavily toward the blue end of the spectrum.
There are of course red, green, and blue LEDs, but implementing a system using them is complex and expensive. It was far easier and cheaper to use the old method of shining a bright white light at a layer of color filters. When this was done using older wide-spectrum CCFLs (cold cathode fluorescent lamps), the results were pretty darn good. Reds were red, yellows yellow, etc. With white LEDs... Not so much. Some vendors held on to CCFL backlighting for their professional lines for that very reason.
Obviously, the problem isn’t bad enough to overwhelm thin-and-light’s appeal. We are talking about a phenomenon subtle enough that many people still don’t realize what they’re missing. But the whole “blue” thing did not go unnoticed by the industry. It has spent the last six or seven years gradually remedying the color situation, while of course charging more for the TVs with the fixes. This year they’ve even started to address another common LED/LCD problem: Contrast. This is being done with HDR (High Dynamic Range). But I digress.
The efforts towards better color might have moved a little faster if OLED hadn’t been hyped as the future of flat-panel TVs. Sadly, RGB and even wRGB OLED have remained extremely difficult to produce in large sizes, and are therefore quite expensive.
Getting better all the time
There are basically two methods vendors use to improve color accuracy in their TVs. One is to improve the balance of the emanated spectrum by combining different-colored LEDs with different-colored phosphors. Sony has gotten quite good at this.
The second is the use of tiny crystalline semiconductors called quantum dots that absorb light and re-emit it at a specific wavelength. The beauty of quantum dots is that implementing them requires only interdicting the light source with a bevy of the little beauties. That is, placing a layer of quantum dots between the backlight and the color filters and LCDs.
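The color a quantum dot re-emits is set by its bandgap energy, which in turn depends on the dot’s size: smaller dots have wider gaps and emit bluer light. The relationship between energy and wavelength is just E = hc/λ. Here is a minimal sketch in Python; the bandgap figures are illustrative placeholders, not measurements from any particular TV:

```python
# Illustrative sketch: a quantum dot's emission color follows from its
# bandgap energy (quantum confinement makes smaller dots emit bluer light).
# The bandgap values below are hypothetical examples, not measured data.

PLANCK_EV_NM = 1239.84  # h*c expressed in eV*nm, for energy-to-wavelength conversion

def emission_wavelength_nm(bandgap_ev: float) -> float:
    """Wavelength (nm) of light emitted by a dot with the given bandgap (eV)."""
    return PLANCK_EV_NM / bandgap_ev

# Larger dot -> narrower gap -> redder light; smaller dot -> bluer light.
for label, gap_ev in [("large dot", 1.95), ("medium dot", 2.3), ("small dot", 2.7)]:
    print(f"{label}: ~{emission_wavelength_nm(gap_ev):.0f} nm")
```

Running this prints roughly 636 nm (red), 539 nm (green), and 459 nm (blue), which is why a single blue backlight plus two sizes of dots can yield well-separated red, green, and blue primaries.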
Implementing quantum dots is said to be relatively inexpensive, but so far the industry has only rolled them out in pricier models, such as Samsung’s SUHD series and Vizio’s Reference Series.
Significantly better LED/LCD color and contrast are here. Thankfully. Improvements can be seen in the upper mid-range models, but mostly it remains an upper-crust feature. Hence, I recommend a large dose of patience for those of us who don’t number ourselves among the rich and famous, especially if you’re rocking an older CCFL LCD TV that’s not color-challenged.
Put succinctly: either pay the premium now or wait for the improvements to trickle down. Otherwise, expect a case of buyer’s remorse in a couple of years.