Google's new VP9 video compression codec is great, and a little astonishing. It promises identical video quality, in half the file size. This means you'll use half the wireless data to watch the same quantity of video. It means you'll wait half as long for a video to buffer and start playing, or you can maybe watch a higher quality version in the same time. It means 4K video streaming is really doable over home internet connections. And VP9 isn't some far-off promise: it's here right now, in your friendly neighborhood YouTube player.

According to Google's metrics from a couple weeks ago: "YouTube users have already watched more than 25 billion hours of VP9 video." VP9 is available in the Chrome browser, the Samsung Galaxy S6, and a few TVs.

But there's a tradeoff. VP9 is much harder to decode than Google's previous codec, VP8. In fact, there's often a direct correlation between how elaborately compressed a file is and how hard it is to unravel that compression and use the file.

This means more computation is required to watch VP9 YouTube videos, and if you're watching on a laptop, tablet, or phone, that extra computation usually means less battery life.
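To make that concrete, here's a toy back-of-the-envelope calculation. Every number in it is a made-up assumption rather than a measurement of any real device or codec, but it shows the shape of the tradeoff: a codec that needs a few extra watts to decode in software takes a real bite out of playback time on the same battery.

```python
# Toy battery arithmetic -- every number below is a made-up assumption,
# not a measurement of any real device or codec.

battery_wh = 40.0         # hypothetical laptop battery capacity, in watt-hours
baseline_watts = 8.0      # hypothetical system draw while playing an easy-to-decode codec
extra_decode_watts = 3.0  # hypothetical extra draw for decoding VP9 in software

hours_easy_codec = battery_wh / baseline_watts
hours_software_vp9 = battery_wh / (baseline_watts + extra_decode_watts)

print(f"Easy codec:   {hours_easy_codec:.1f} hours of playback")    # ~5.0 hours
print(f"Software VP9: {hours_software_vp9:.1f} hours of playback")  # ~3.6 hours
```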

I'm not trying to pick on VP9 here; I'm just using it as an illustration. Because we actually make these sorts of tradeoffs all the time.

Most of the time, we don't think of internet problems as computational. "The WiFi is down." "The internet's being slow." "The connection dropped." These sound like network issues, and they are. But a functioning internet is not unrelated to your CPU. Any piece of data that goes over a network has to be prepared, sent, received, and interpreted by silicon.


With VP9, there's a pretty simple formula for deciding whether it's a good idea: is YouTube slow for people because of their available computing resources, or because of their available network resources? As it turns out, most people are limited by their network resources, especially in emerging markets.
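As a rough sketch of that question, the snippet below compares how long it takes to get through a one-minute clip when downloading and decoding overlap, so whichever stage is slower sets the pace. Every figure here (file sizes, bandwidths, decode rates) is an illustrative assumption, not real YouTube data, but it shows why halving the file size is a big win on a slow connection and matters much less on a fast one.

```python
# Rough streaming model: download and decode overlap, so whichever
# stage is slower sets the pace. All figures are illustrative assumptions.

def seconds_to_play(file_size_mb, bandwidth_mbps, decode_mb_per_s):
    download_s = file_size_mb * 8 / bandwidth_mbps  # megabytes -> megabits
    decode_s = file_size_mb / decode_mb_per_s
    return max(download_s, decode_s)

# Hypothetical one-minute clip: 30 MB in an older codec, 15 MB in VP9,
# with VP9 assumed to need roughly twice the decode effort per megabyte.
for label, bandwidth_mbps in [("slow connection (2 Mbps)", 2.0),
                              ("fast connection (100 Mbps)", 100.0)]:
    old = seconds_to_play(30, bandwidth_mbps, decode_mb_per_s=20)
    vp9 = seconds_to_play(15, bandwidth_mbps, decode_mb_per_s=10)
    print(f"{label}: older codec {old:.1f}s, VP9 {vp9:.1f}s")

# slow connection: older codec 120.0s, VP9 60.0s -> network-bound; VP9 halves the wait
# fast connection: older codec 2.4s, VP9 1.5s    -> VP9 is now decode-bound; the gain shrinks
```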

In fact, the speed of our networks themselves is very much defined by available computational power. For instance, LTE requires highly complex, super-fast, power-hungry chips to decode messy radio waves into high-throughput data for your phone. LTE squeezes more internet out of the same radio spectrum, which was previously saturated by more "naive" cellular internet implementations, but it does so at the cost of extra computation.

And now for a word of warning: this tradeoff won't always work, or at least it might not always work. It's worked so far, certainly. At first there was MPEG-2, which could fit a whole movie onto a DVD. Then MPEG-4 Part 2, which enabled the emergence of online video. Then H.264, which heralded Blu-rays and YouTube alike. Each codec fit more quality into less space, and each codec was more computationally complex to decode. But, thankfully, our computers kept getting faster, and so we barely noticed.

"The Free Lunch Is Over" is a highly influential paper in computer science from 2005. It predicted (accurately) the inevitable tailing off of Moore's Law ("Light isn’t getting any faster," as the paper points out), and looked for a new way to design software to take advantage of the years Moore's Law has left.

Computers can still get faster, and they can still get more power efficient, but it's increasingly hard to improve both metrics simultaneously. The trend in laptops, exemplified by Apple's new MacBook, is to offer "good enough" computation with super-great battery life in a tiny package. As long as battery technology stays stagnant, this is likely to be a trend across all our mobile devices.


Video codecs like VP9 can benefit from specialized hardware designed to decode video without using too much power. VP9-optimized chipsets for phones, laptops, and TVs are already hitting the market. Existing hardware, however, will have to hobble along with warm CPUs and shortened battery life.

Maybe in a few years there will be a reckoning. We might have to choose between "good enough" computation speeds, "long enough" battery life, and "fast enough" internet experiences. Thankfully, improvements to anything — fatter pipes, more efficient chips, better designed software, miraculous batteries, or even a decreased enthusiasm for 4K cat videos — will mean an improvement to everything.

Tradeoffs, you know?

