Mar 21, 2019
Google Stadia is Google’s new games platform, built on Youtube. It was demoed on Tuesday and will launch later this year. It competes with Playstation, Xbox and Nintendo for the $67B console market and may also eat into the mobile gaming market, itself a $70B market. Yes, gaming is the biggest content market in the world. Stadia is a cloud-based gaming platform, done very differently from its Microsoft and Sony competitors, because it is built on Youtube and on Google’s position in the world’s broadband networks and peering/transit interconnection.
I wrote this because I wanted to understand what leverage Google has to break into this market and how it will affect telecom companies and internet service providers. My conclusion: most of this is built on Google’s datacenter play, and access providers will see a wave of traffic coming. This is the killer app for FTTH/Docsis 3.1. Sony and Nintendo can go home, and Microsoft needs to become a cloud player for real now.
What is Google Stadia?
What makes Stadia different from current gaming is that any device that can run Youtube in HD at 30/60 frames per second is good enough to game on. It just needs bandwidth (roughly 15–25Mbps). If it’s a laptop or PC, any controller connected to it will work. A Chromecast would work too, turning any display into a gamestation. The controls then have to come from a dedicated controller that looks just like what Sony or Microsoft offers, but is different, because it hooks up to your WiFi. Yes, the controller connects directly to the home network and the Internet! All the processing is done in Google’s cloud, which sends a 25Mbps Youtube video stream to your device. A Samsung or LG TV would know from the URL which clip of your gameplay to request and receive. So for the price of a $30-ish controller, your TV could display games that normally require a $300 console.
What sets Google apart, in my opinion, is that it treats games as Youtube video clips/web pages that you as a player or viewer can interact with in real time. Any game session that is played can be broadcast to, viewed and interacted with by others. Your friends could be playing a game, and if they send you the URL via Whatsapp you could jump straight in from a mobile device. Or your friend is happy she did the Kessel Run in 12 parsecs, and you could copy her starting point and try to beat her. Watching gaming via Youtube is already massive. Now you can join, broadcast, copy, play, monetize and create straight from watching someone else. You can embed the gameplay into any webpage or online video platform. It merges the two platforms and could create a massive interactive online experience. (And boy will we find some new forms of abuse.)
How does this affect Google’s infrastructure?
Online gaming has always put stress on broadband networks, not for its bandwidth demands but for its latency demands. The two are linked to some extent: if you need to send 1MB of data over a 1Mbps line, it will take a minimum of 8 seconds. Physics is what it is. Here, however, the player sends relatively small inputs, but many of them in rapid succession, and Google sends a stable stream of 15–25Mbps or more back. This puts strain on broadband networks. They have to find a way to deliver traffic to Google over the shortest path possible, and vice versa they have to deal with massive incoming streams of 15–25Mbps per player for HD and more for 4K.
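To make that link between bandwidth and latency concrete, here’s a minimal sketch of pure serialization delay (my own illustration; the payload sizes are made-up examples):

```python
# Serialization delay: the minimum time to push a payload onto the wire,
# ignoring propagation and queuing. 1 MB over a 1 Mbps line takes 8 seconds.

def serialization_delay_ms(payload_bytes: int, line_rate_mbps: float) -> float:
    bits = payload_bytes * 8
    return bits / (line_rate_mbps * 1_000_000) * 1000

print(serialization_delay_ms(1_000_000, 1))   # 8000.0 ms: the 1 MB / 1 Mbps example
print(serialization_delay_ms(64, 25))         # ~0.02 ms: a small controller input
print(serialization_delay_ms(100_000, 25))    # 32.0 ms: a chunk of the video stream
```

The asymmetry is clear: the inputs are practically free, it’s the constant downstream video that does the damage.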
The incoming streams are completely different from Netflix’s streams at similar resolutions. Netflix’s streams are optimally compressed to take as little bandwidth as possible while still giving a great image. They are quite robust too. If Netflix needs to pre-load 30 seconds, it doesn’t matter; your TV has enough memory to cache this. If bandwidth deteriorates, so does your image quality: they just send you a different version of the same movie at slightly lower quality. Plus, Netflix has OpenConnect, its CDN that sits deep in most metro-core locations of telcos. Traffic often has to travel less than 30km to your home. Each instance handles up to 35Gbps/~7000 streams per 1U rack unit (equivalent to 21K–42K customer premises) and saves telco CTOs thousands of dollars per node in uplinks, as traffic is handled locally.
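For what it’s worth, here’s the back-of-envelope arithmetic behind those OpenConnect figures (my own math, using only the numbers quoted above):

```python
# Back-of-envelope on the OpenConnect figures quoted above (my arithmetic, not Netflix's).
appliance_gbps = 35
concurrent_streams = 7000

per_stream_mbps = appliance_gbps * 1000 / concurrent_streams
print(f"{per_stream_mbps:.1f} Mbps per stream")  # 5.0 Mbps, matching the 4-5 Mbps HD figure

# If roughly 1 in 3 to 1 in 6 homes streams during peak, one 1U box covers:
for homes_per_stream in (3, 6):
    print(concurrent_streams * homes_per_stream)  # 21000 and 42000 premises
```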
All of this will be harder for Google. The player will expect a maximum round-trip time of 200ms, preferably less than 150ms. Each game and each player is different, and that messes everything up. You can forget about caching, pre-loading and compressing, because everything the fingers do now needs to be shown on the screen now. Google has the same infrastructure for Youtube and caching video as Netflix has; together with Akamai they are the Big Three of bits. Google says it is in over 7500 locations in the world. However, that infrastructure is based on the idea that we’re not that original and roughly watch what everyone around us is watching too. Amazon, Facebook and Microsoft have big heads in the cloud, but they aren’t in every metro-core location, because they push less data and don’t care as much about latency.
So, the big question is how Google will deliver on the promise that it can handle gaming of any kind. Will it actually put a number of gaming racks in each of its 7500 edge nodes? Will it deliver its connectivity from the over 100 peering and interconnection points it has in the world? Or will it serve everything from the datacenters it has around the world?
Latency is the killer, and it depends on a lot of factors. Eurogamer has an overview of some latency figures: they measured 166ms on a laptop playing a game on Stadia, which would put Google on par with or better than a local Xbox One X! If we break that 166ms down, we get the following:
- 10ms WiFi delay (a number Eurogamer uses, but one I don’t see on my own or my employer’s WiFi with FTTH; 9ms round-trip time to fast.com and 10ms to AMS-ix.net is fine)
- between 40ms and 80ms to process inputs and calculate all the game state, based on how much time Call of Duty and Doom need for it (apparently it takes the Xbox more time, so it acts as a nice extreme case)
- 20ms delay encoding the Youtube stream
- 20ms delay displaying the video on a TV screen
This would leave Google with roughly 36ms to 76ms of round-trip time from user to server and back, which is actually more than I expected. To reach Sweden’s Sunet.se I need 29ms round trip, and to reach Cern.ch it’s 28–35ms from the Netherlands. To travel that distance my data has passed a number of Google datacenters already. The Netherlands now has one Google datacenter and will soon have two, Denmark will soon have one, Belgium is expanding, and there are likely more builds across Europe.
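Here is that budget arithmetic as a quick sketch, using Eurogamer’s figures from the list above (the breakdown into a network budget is my own):

```python
# Latency budget for Stadia, working backwards from Eurogamer's 166 ms measurement.
# All inputs are the figures quoted above; the subtraction itself is my own arithmetic.

TOTAL_MS = 166          # Eurogamer's measured input-to-display latency
WIFI_MS = 10            # local WiFi hop
GAME_MS = (40, 80)      # input processing + game state calculation
ENCODE_MS = 20          # encoding the video stream
DISPLAY_MS = 20         # TV display latency

fixed = WIFI_MS + ENCODE_MS + DISPLAY_MS
low, high = (TOTAL_MS - fixed - g for g in reversed(GAME_MS))
print(f"Network round-trip budget: {low}-{high} ms")
# -> 36-76 ms: enough to reach Sunet.se (29 ms) or Cern.ch (28-35 ms) from the Netherlands
```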
The better the network performs, the more room Google has to let developers trade off latency against graphics. A developer with a slow-paced but scenic game will use the latency budget to squeeze out more image quality. A twitchy football game will likely sacrifice some image quality for reduced latency. A battle royale game with lots of simultaneous players will use it for intra-server latency and syncing of state.
However, this also shows that it is really unclear why Google mentioned its 7500 caches at all. Why would these devices have any use, unless maybe to serve the video content of people watching other gamers? For game developers this would be rather useless to know. So yes, it’s fantastic that Google has a massive cache system all around the world, but it’s the datacenters that appear to be the basis of its success.
BTW, I would love to see a guestimate of how many PS4 equivalents Google could stuff in one rack. A guestimate of how much it saves in packaging, transport and retail of boxes to consumers would be welcome too. My gut feeling is that just by cutting out retail and logistics it saves half of the cost of a PS4. That figure is then halved again, because the systems are shared resources. So roughly a PS5 equivalent for 100 dollars per user. I’m also fascinated to see whether developers can use TensorFlow and AlphaGo-based AI (and whether players want them to). A first stab at the cost arithmetic is sketched below.
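All inputs here are my own assumptions, nothing published by Google or Sony; it’s just the gut-feeling math made explicit:

```python
# Hypothetical cost guesstimate (every input is my assumption, not a published figure).
CONSOLE_RETAIL_USD = 400        # assumed retail price of a PS4-class console
RETAIL_LOGISTICS_SHARE = 0.5    # assumed share eaten by packaging, transport, retail
SHARING_FACTOR = 2              # assumed: each cloud rig is time-shared by ~2 users

hardware_cost = CONSOLE_RETAIL_USD * (1 - RETAIL_LOGISTICS_SHARE)
cost_per_user = hardware_cost / SHARING_FACTOR
print(f"~${cost_per_user:.0f} per user")  # ~$100: a console equivalent per user
```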
Why telcos/ISPs should be afraid
The impact of Stadia on networks shouldn’t be underestimated. If Google gets any decent uptake for this system, the amount of traffic it generates per home can be staggering. Currently most users don’t peak past 100mbps unless their Playstation has to download the latest updates. On average during peak hours (20hr–22hr) consumers use roughly 1mbps, of which a third is Netflix and another third is Google, all served locally. The average bandwidth for a Netflix series is 4–5mbps, slightly higher of course for those who watch 4K. Even with a couple of teens in the house, it’s likely consumers won’t hit more than 20–30mbps sustained traffic over longer periods.
Stadia could change all that. It promises 25mbps per device for HD-level play, and more for 4K. Stadia could make it fun and realistic to play with multiple players in the same home. It wouldn’t be weird or difficult to have 4–8 kids playing on iPads, TVs and laptops. That could well be 240mbps sustained for long periods. Even in normal circumstances, having many homes reach 30mbps sustained for hours would push up the averages.
On a larger scale, if Google can deliver, it wouldn’t be odd to expect 2.5% of all households to play a game between 20hr and 22hr. If you have 4 million customers, that’s 100K streams in parallel, or 2.5Tbps in additional traffic. BT was very proud of handling a new record of 12Tbps on its access network, and KPN said it peaked at 6.5Tbps over 2018. No matter how you look at it, this will require upgrades of core switches and of the private network interconnects between Google and telcos.
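The arithmetic behind those numbers, as a small sketch (the 2.5% uptake is my guess from above; 25Mbps is Stadia’s stated HD bitrate):

```python
# Aggregate Stadia traffic projection (the uptake figure is the guess from the text above).
SUBSCRIBERS = 4_000_000
CONCURRENT_SHARE = 0.025    # assumed: 2.5% of homes gaming during the evening peak
STREAM_MBPS = 25            # Stadia's HD bitrate; 4K would be higher

streams = int(SUBSCRIBERS * CONCURRENT_SHARE)
extra_tbps = streams * STREAM_MBPS / 1_000_000
print(f"{streams:,} parallel streams -> {extra_tbps:.1f} Tbps of additional traffic")
# -> 100,000 streams, 2.5 Tbps: compare BT's 12 Tbps record and KPN's 6.5 Tbps peak

# And a single gaming household is no small thing either:
print(f"{8 * 30} Mbps sustained for 8 players at 30 Mbps each")  # 240 Mbps
```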
Mobile networks would likely buckle under the load if every user really gets 25mbps. LTE was never designed for sustained delivery of a large number of concurrent streams. It’s fast because everyone gets their webpages quickly and then vacates the air to read the content they received. 5G is sweet on paper, but far from reality, and also not designed to give 10–40 people 25mbps sustained per cell.
Another troubling bit for networks is the surprise peaks during the day. Let’s say on a Saturday morning a famous Youtube star demos some great new game and challenges her viewers to defeat her in it. She posts the link and instantly 10k kids respond. That’s 250Gbps that just has to be handled. Mind you, that’s less than 1 percent of the following of the major Youtube stars in the Netherlands. Of course you should be able to handle it, but it is still 10–20% extra traffic out of nowhere.
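The flash-crowd math from that scenario, for completeness:

```python
# Flash-crowd peak from the scenario above: 10k viewers jump into a game at once.
RESPONDING_PLAYERS = 10_000
STREAM_MBPS = 25

peak_gbps = RESPONDING_PLAYERS * STREAM_MBPS / 1000
print(f"{peak_gbps:.0f} Gbps out of nowhere")  # 250 Gbps
```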
For both Google and telcos this means massive, massive upgrades to peering and private network interconnects. CTOs will likely have to upgrade links between major cities and the PNI locations.
Conclusion
Google Stadia looks very promising. Google’s mastery of the datacenter lies at the heart of its challenge to Sony, Microsoft and Nintendo. It may deliver a gaming environment at a quarter of the cost (a first guestimate). Neither Sony nor Nintendo is a datacenter company, and Microsoft would need to change quite a bit in its approach. The plan looked too ambitious at first, but it appears Google Stadia could well deliver on its promises, as latency is likely not an issue in modern broadband networks.
It is the networks themselves that will have to worry. FTTC/VDSL just doesn’t cut it in many instances: it may deliver one stream, but at the expense of all other traffic. Docsis 3.1 promises 10Gbps shared over a number of customers; that number of sharing customers may have to be a bit lower than the CFO expected. FTTH networks should be able to handle the load. All networks will have to work on uplinks, peering and interconnection. None of the traditional tricks of generating and keeping traffic local will work anymore. It is coming from Google, and it’s a flood.
Below: a more technical analysis that is the basis for Eurogamer’s review.