A few weeks back I turned on my television to find out it had stopped receiving two of the free-to-air channels I watched most often. All of the other channels still resolved with perfect, digital clarity, so I couldn’t work out why these two channels - out of four packaged in a multichannel broadcast signal - failed to display.
On a page buried deep in the broadcaster's website I learned the reason: overnight they’d changed the encoding format for the broadcast stream. In the earliest version of digital television (in Australia, that is; in the USA, YMMV), the broadcast signal used MPEG-2. That’s a format every digital set can receive and display.
But because this is digital television, and the broadcast signal is simply a stream of data, the folks behind the specification decided that a broadcaster could change the encoding for the stream. A decade ago, that encoding might represent the difference between a standard-definition and a high-definition picture. But over the last eighteen months, broadcasters have begun migrating away from the universally decodable MPEG-2 format to the newer, smarter and more efficient MPEG-4 format.
Make no mistake: MPEG-4 is much better than MPEG-2, because it gives broadcasters a much richer selection of encodings for their signals. MPEG-4 opens the door to H.264 encoding - much more efficient for high-definition broadcasting, giving a broadcaster more capacity within their limited bandwidth.
The broadcaster that disappeared from my television screen had done exactly that - taken a high-definition MPEG-2 channel, and replaced it with two high-definition MPEG-4 channels. A very neat trick – at least for televisions that could decode MPEG-4.
That morning I learned my television hadn’t been designed with any forward compatibility. Built to receive MPEG-2 broadcasts, it simply didn’t know how to decode the new signal.
That’s not something I’d seen before. In the era of analog television, new signals piggybacked on existing signals. Colour information lurked inside an unused bit of the video signal - so that it could be safely ignored by black & white televisions - while sound hid away on an FM sideband that easily grew to encompass stereo broadcasting without forcing a generation of televisions to go silent.
Before you think this is simply a middle-aged “back in my day…” rant, stop and consider: designing for backward compatibility was a basic goal of analog television broadcasters, one that has clearly been abandoned in the transition to digital.
Viewers always want the best picture and sound quality, while a broadcaster is always going to want to make the best use of a limited bandwidth resource. But stuck between these demands, should a $1500 television suddenly become obsolete?
Part of this problem rests with the manufacturers of televisions, who stubbornly persist in the belief that they’re making fixed appliances that never change. The digital television specification is a process, not a product, and any device embodying that specification must be upgradeable. My television has never had any firmware upgrades - because who upgrades the firmware on their television?
In the end, I made the decision to invest another $1500 in a brand new television. This time my money bought me a 4K UHD HDR television, with all the bells and whistles including MPEG-4 decoding.
This television, however, has a bit more of a future, because the manufacturer put Android TV onboard, embracing a fact that now seems absolutely obvious: a television is a tablet computer without a touchscreen.
A television with an operating system is both a bit ridiculous (taking a minute to go from cold boot to signal reception, it harkens back to times when tube televisions had to ‘warm up’) and absolutely essential, because it means that the television has an upgrade path. Even if the manufacturer never issues another firmware update, I’m probably good into the future, because it runs apps that will be updated.
Case in point: the Netflix app took a week to figure out that I now watched on a UHD device, and offered to upgrade my service to accommodate 4K streaming. Although I immediately agreed, I had no expectation that my pokey 10 Mbps broadband could handle anything that challenging.
Yet, after a few minutes of streaming House of Cards, my television picture tightened into eye-popping 4K clarity. Netflix found a way to squeeze all those political pixels into 8 Mbps - a feat only possible because its app can decode HEVC, the latest-and-greatest way to jam more picture into limited bandwidth.
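To put that feat in perspective, here’s some rough back-of-the-envelope arithmetic. The figures below are illustrative assumptions (8-bit 4:2:0 sampling, 30 frames per second, and the ~8 Mbps stream mentioned above), not measured values:

```python
# Rough arithmetic: how hard a codec like HEVC must squeeze a raw
# 4K picture to fit it down an 8 Mbps pipe.
# Assumes 8-bit 4:2:0 chroma subsampling (12 bits/pixel) at 30 fps.

width, height = 3840, 2160           # UHD "4K" resolution
bits_per_pixel = 12                  # 8-bit luma + subsampled chroma
frames_per_second = 30

raw_bps = width * height * bits_per_pixel * frames_per_second
stream_bps = 8_000_000               # ~8 Mbps broadband stream

print(f"Raw video: {raw_bps / 1e6:.0f} Mbps")
print(f"Compression needed: roughly {raw_bps / stream_bps:.0f}:1")
```

Under those assumptions the raw signal runs to nearly 3 Gbps, so the codec has to throw away all but about one part in 373 of the data while keeping the picture watchable - which is exactly the kind of ratio modern encoders are built to hit.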
As ultra-high definition televisions become more common, broadcasters will be tempted to migrate to HEVC, leaving millions with suddenly obsolete televisions. With 8K broadcasting coming at the next Summer Olympics, apps may be the only way forward that doesn’t leave us staring at a blank screen. ®