
How Google Stadia encodes big data for real-time use


Gaming uses a lot of data, especially when streaming in real time. Here’s how Google Stadia handles the data required in its cloud gaming service.

Dan Patterson, CNET and CBS News Senior Producer, spoke with Majd Bakar, VP of Engineering with Google Stadia, about how the company encodes and decodes data in real time. Stadia, which was publicly released on November 19, 2019, is Google’s cloud gaming service. This interview, conducted October 2, 2019, is part two of a three-part series. The following is an edited transcript of their conversation.

Majd Bakar: We use an encoder called VP9. It's an open source encoder, but it is developed by Google, and we published it. It basically looks at every frame. So when we talk about images that are displayed, a frame is basically a picture. And usually when we say 60 frames per second, it means you are changing the picture 60 times every second. So you get that smoothness of transition.
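For readers who want to experiment with the open source encoder Bakar mentions, here is a minimal sketch that pushes raw 1080p60 frames through ffmpeg's libvpx-vp9 encoder. The file names, resolution, and bitrate are illustrative placeholders, and this software path is not Stadia's hardware pipeline.

```python
# Minimal sketch: encode raw 1080p60 frames to VP9 with ffmpeg's libvpx-vp9
# encoder. File names, resolution, and bitrate are illustrative, not Stadia's
# actual pipeline, which uses specialized encoding hardware.
import subprocess

cmd = [
    "ffmpeg",
    "-f", "rawvideo",          # uncompressed input frames
    "-pix_fmt", "yuv420p",     # common chroma-subsampled pixel format
    "-s", "1920x1080",         # frame size
    "-r", "60",                # 60 frames per second
    "-i", "input.yuv",         # placeholder raw input file
    "-c:v", "libvpx-vp9",      # the open source VP9 encoder
    "-b:v", "20M",             # target bitrate (illustrative)
    "-deadline", "realtime",   # favor encoding speed over compression efficiency
    "-cpu-used", "8",          # fastest software preset
    "output.webm",             # VP9 in a WebM container
]
subprocess.run(cmd, check=True)
```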


We look at every frame, and we go and find the information in that frame and how to best compress it. And we use things like the discrete cosine transform (DCT) and macroblocks, where you go over every small piece of that frame; one by one, you identify where the information is and how you can combine it in a way that allows you to compress it. And usually these techniques allow you to compress anywhere between a factor of 10 to 1, or in some cases 100 to 1, depending on the image you are looking at.
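To make the block-transform idea concrete, here is a small illustrative sketch, not the actual VP9 code path: it applies a 2D DCT to an 8x8 block with SciPy, quantizes the coefficients, and counts how many survive, which is where most of the compression comes from.

```python
# Illustrative only: a 2D DCT on an 8x8 block followed by coarse quantization.
# Real VP9 uses integer transforms on variably sized blocks plus prediction
# and entropy coding, but the principle is the same.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
# A smooth synthetic 8x8 luma block (gradients compress well).
x, y = np.meshgrid(np.arange(8), np.arange(8))
block = (16 * x + 8 * y + rng.normal(0, 2, (8, 8))).astype(np.float64)

coeffs = dctn(block, norm="ortho")      # concentrate energy in a few coefficients
quantized = np.round(coeffs / 32)       # coarse quantization step (illustrative)
kept = np.count_nonzero(quantized)

print(f"nonzero coefficients after quantization: {kept} of 64")

# Reconstruct to see how little is lost despite discarding most coefficients.
recon = idctn(quantized * 32, norm="ortho")
print(f"max reconstruction error: {np.abs(recon - block).max():.1f}")
```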

The key here is how you can do that in real time. Google developed this technology to be very, very efficient at both compression and decompression. And then you have to decompress to display it to the user, but at Stadia, we've built specialized hardware that allows us to do this in real time, so we can take less than a millisecond to encode every frame and be able to send it over the public internet and decode it quickly.
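A rough back-of-the-envelope budget shows why a sub-millisecond encode matters. Only the roughly 1 ms encode figure and the 60 fps target come from the interview; the other stage numbers below are illustrative assumptions, not figures from Stadia.

```python
# Illustrative per-frame latency budget at 60 fps. Only the ~1 ms encode figure
# and the 60 fps target come from the interview; the other stages are assumptions.
frame_period_ms = 1000 / 60          # ~16.7 ms between frames at 60 fps

budget = {
    "render (GPU)": 8.0,                    # assumed
    "encode (VP9, custom hardware)": 1.0,   # "less than a millisecond" per Bakar
    "network one-way": 15.0,                # assumed public-internet latency
    "decode (client)": 2.0,                 # assumed
    "display": 4.0,                         # assumed
}

total_ms = sum(budget.values())
print(f"frame period: {frame_period_ms:.1f} ms")
print(f"glass-to-glass estimate: {total_ms:.1f} ms")

# The pipeline is overlapped: a new frame starts every frame period even though
# any single frame takes longer than one period to reach the screen.
for stage, ms in budget.items():
    print(f"  {stage}: {ms:.1f} ms")
```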

You're always displaying. When we say 60 frames per second, it means a new frame is being shown every 16.6 milliseconds. That's a very, very, very short amount of time. In order to do that, you need to be very, very efficient. Basically, you have the GPU; we've worked with AMD to build custom GPUs for Stadia. Our specialized hardware sits after the GPU. Think of it as a specialized ASIC.
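Here is a minimal sketch of that per-frame deadline. The render, encode, and send functions are hypothetical placeholders, not Stadia APIs; the point is simply how tight a 16.6 ms budget is when a new frame must go out every period.

```python
# Minimal sketch of a per-frame real-time loop with a 60 fps deadline.
# render_frame, encode_frame, and send_frame are hypothetical placeholders,
# not Stadia APIs; on Stadia, rendering happens on custom AMD GPUs and
# encoding on a dedicated ASIC that sits after the GPU.
import time

FRAME_PERIOD_S = 1 / 60  # ~16.6 ms per frame


def render_frame(n):          # placeholder for GPU rendering
    return f"frame-{n}"


def encode_frame(frame):      # placeholder for hardware VP9 encoding (< 1 ms)
    return frame.encode()


def send_frame(packet):       # placeholder for sending over the public internet
    pass


for n in range(300):          # 5 seconds of frames at 60 fps
    start = time.perf_counter()
    send_frame(encode_frame(render_frame(n)))
    elapsed = time.perf_counter() - start
    if elapsed > FRAME_PERIOD_S:
        print(f"frame {n} missed its 16.6 ms deadline ({elapsed * 1000:.1f} ms)")
    else:
        time.sleep(FRAME_PERIOD_S - elapsed)  # wait for the next frame slot
```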

