The New England Patriots weren’t the only ones in agony during the Super Bowl’s thrilling conclusion this week.
As the clock struck 10 p.m. on Sunday, some Hulu viewers received an error message saying there was nothing left to watch. Hulu hadn’t accounted for the game running long, which led to a scheduling glitch that caused viewers to miss Tom Brady’s final attempts at a comeback.
While the problem in this case was atypical, reliability issues are all too common in live streaming TV, which has many more potential points of failure compared to a traditional cable telecast. And unfortunately for consumers, the only way to learn this is the hard way. Services like Hulu and Sling TV will tell you all about the features, channels, and low prices they offer, but they won’t make any claims about how dependable they are.
It’s time for that to change. In addition to competing on features and pricing, streaming TV providers should open up on reliability. Similar to how wireless carriers jockey for “most reliable network” bragging rights, streaming providers should compete on their basic ability to carry a live TV feed.
A matter of measurement
The reliability issue was on my mind even before the Super Bowl, having just written a head-to-head comparison of all live TV streaming bundles. For those reviews, I compared channel lineups, features like DVR, and the overall user experience to determine which bundle was best.
Still, I couldn’t properly evaluate one crucial factor: How likely are users to experience buffering, dips in video quality, crashes, or other issues that affect video playback?
As an individual reviewer, there’s no holistic way to measure this. Personal experience or reader feedback can be affected by external factors, such as faulty Wi-Fi or problems at the internet service provider level. Issues affecting one part of the country could be nonexistent elsewhere, and results can even vary by channel or program. Scouring for outage reports on social media or in the press isn’t fair either, since the services with the most users are likely to draw more complaints.
What we really need is large-scale testing to find out how these services perform overall. But while third-party firms already offer this kind of service to streaming providers, they seldom publicize the results. Streaming providers can use the measurements to improve their service, but in the meantime, customers are in the dark.
There is a glimmer of hope that streaming providers might start holding themselves accountable: The Consumer Technology Association, a major tech industry trade group, is now developing “Quality of Experience” measurements for streaming video services, based on recommendations from another trade group called the Streaming Video Alliance. Within three to six months, they hope to have a standard way to measure things like startup speed, smoothness, video quality, and the ability to play an entire program without interruption.
“The idea is to support these common, agreed upon metrics so that everybody’s doing the same thing and reporting in the same way,” said the CTA’s senior VP of research and standards, Brian Markwalter.
On their own, these standards won’t solve the problem. They’re mainly designed for internal use, so that streaming providers can measure their service in a way that jibes with the rest of the industry. But perhaps with standardized measurements, streaming TV providers might feel encouraged to publicize more of their own results, especially if they’re positive.
“Until we can measure these consistently and reliably, it’s hard to help consumers,” Markwalter said. “This is the beginning of being able to do that.”
Incidentally, while working on this column, I received a memo that circulated internally at FuboTV, boasting of a “record low” 0.49 percent buffer rate for its Super Bowl stream. The methodology wasn’t explained, and there was no way to tell how it compared to a typical day’s service from Fubo or others. But when I asked the Streaming Video Alliance’s executive director Jason Thibeault if 0.49 percent was a good number, he said it was “incredible.”
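For readers curious what a figure like that usually means: a “buffer rate” is commonly defined as the share of total watch time spent rebuffering. Fubo didn’t explain its methodology, so the calculation below is a hypothetical illustration with invented numbers, not the company’s actual formula.

```python
# Hypothetical sketch of a common "buffer rate" definition:
# rebuffering time as a percentage of total watch time.
# These numbers are illustrative, not FuboTV's actual data.

def buffer_rate(rebuffer_seconds: float, watch_seconds: float) -> float:
    """Return rebuffering time as a percentage of total watch time."""
    if watch_seconds <= 0:
        raise ValueError("watch time must be positive")
    return 100.0 * rebuffer_seconds / watch_seconds

# A viewer watching a four-hour broadcast (14,400 seconds) who sees
# about 71 seconds of spinning wheels lands near a 0.49 percent rate.
print(round(buffer_rate(70.6, 14_400), 2))  # roughly 0.49
```

Under this definition, even a sub-one-percent rate translates to a minute or more of interruptions over a long live event, which is why standardized reporting matters.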
So here’s my hope: Once the industry settles on a way to measure the reliability of streaming TV services, we’ll start to see more bragging when those services are doing well, which in turn will make it easier for reviewers and other observers to ask how other services compare.
In the end, we might get something that resembles a “reliability score” that applies to all streaming TV bundles. Even if the occasional glitch still happens, choosing the best streaming bundle won’t feel like such a crapshoot.
Sign up for Jared’s Cord Cutter Weekly newsletter to get this column and other cord-cutting news, insights, and deals delivered to your inbox.