Cloud Gaming

Eyevinn Technology
Mar 20, 2019


In this post we explain what Cloud Gaming is, what advantages and disadvantages it has compared to traditional local gaming, and what obstacles must be overcome for it to become the norm.

If you are not familiar with the terms input-lag or response time, scroll down to the section “Differentiating between Input-Lag & Response Time” before continuing.

Written by: Thomas Demirian / Eyevinn Technology

Imagine a world where any “smart” device could play your favorite games at their highest graphics fidelity. Now imagine this being a subscription service where no upfront investment in an expensive gaming console or PC is required. Finally, visualize your gaming library being nearly infinite.

This scenario is no longer unattainable, and it's not far-fetched to think that Cloud Gaming could make it the norm in the not-too-distant future.

What is Cloud Gaming?

Up until now, the computational power needed to run your gaming session has depended on the investment you have made in dedicated hardware. The more you spend, the better performance you get. As time passes, newer and more demanding games are released, which in a sense makes your investment depreciate, since you are no longer able to play newer games with the desired graphics and performance.

Using a Cloud Gaming service, you no longer need to own the computational hardware since it will be owned and operated by the gaming service. All that is expected of you is to select the game you want to play and start pushing away on your buttons. Your input commands are sent to the server, calculated in the game, rendered to video & audio streams, and then streamed back to you.

Video quality and Latency

When playing on dedicated hardware connected to your monitor, video artifacts in the image and the input latency from a pushed button to the executed command on screen are kept to a minimum. The video quality correlates directly with your hardware's computational power and the game engine's ability to render a nice image. The latency, on the other hand, depends mostly on your in-game frame rate, the way the game engine handles game logic, and your monitor's signal processing time.

With that said, it's not as simple as assigning a general latency value to all games and hardware configurations. However, for the purpose of this comparison we will simplify the setup to better distinguish the difference between local gaming and cloud gaming.

Local Gaming Input Lag

One of the major contributors to latency is the frame rate the game is rendered at. In the example below we will use 60 fps, which makes every frame visible for 16.7 milliseconds. Note that many console games today run at 30 fps, which makes every frame visible for 33.3 milliseconds.

  1. The user is playing a game and wants to interact by pressing a button on the controller. The signal is sent to the console/computer. The 10 ms is an approximate number, since the signal latency differs depending on whether the controller is wired or wireless, among other factors.
  2. When the signal is received by the console/computer, the game logic needs to be calculated based on the input and then rendered out. In an optimized game engine playing a computationally hungry game, this usually takes 3 frames.
  3. The average input latency for a typical casual user's display is about 30 ms. Not to be confused with the display's refresh rate or response time, which are always faster and will be covered later in the article.

In other words, it will take 90 ms for your display to update according to your input command. That is about 5 frames, or almost a 10th of a second. Note that if we were rendering at 30 fps, we would need to add an extra 50 ms, since each frame stays on for double the time.
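The budget above can be summed in a short sketch. All figures are the article's approximations, not measurements:

```python
# Approximate local-gaming input-lag budget (values from the article).
FPS = 60
FRAME_TIME_MS = 1000 / FPS          # ~16.7 ms per frame at 60 fps

controller_ms = 10                  # controller -> console/PC (approximate)
render_frames = 3                   # game logic + rendering pipeline
display_ms = 30                     # typical display signal processing

render_ms = render_frames * FRAME_TIME_MS
total_ms = controller_ms + render_ms + display_ms

print(f"frame time:      {FRAME_TIME_MS:.1f} ms")
print(f"total input lag: {total_ms:.0f} ms")   # ~90 ms
```

Swapping FPS to 30 doubles the frame time and adds the extra 50 ms mentioned above.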

Cloud Gaming Input Lag

Input-lag in cloud gaming is a different story, since more processes need to be accounted for. As can be seen in the flowchart below, the process is not as straightforward anymore.

  1. First off, just as with local gaming, an input command needs to be sent by the user.
  2. It's received by the user's cloud gaming hardware and then passed along to the cloud servers where the actual game is being hosted.
    a) This step will be faster in situations where your controller is part of your gaming device, as on mobile phones.
  3. The game engine receives the input command, the game logic is calculated, and the frame is rendered out.
  4. The rendered signal is passed to an encoder that encodes the material with a suitable audio & video codec, and the resulting stream is passed back to the user's gaming device.
  5. The received audio & video stream is decoded and rendered out, then passed to your monitor.
  6. Your monitor does its internal signal processing and then displays the image.
Above is a simplified solution for Cloud Gaming showing accumulated latency
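The steps above can be sketched as a running total. Every figure below is an assumed illustrative value, not a measurement of any real service:

```python
# Hypothetical cloud-gaming latency budget; all values are illustrative
# assumptions, not measurements of a real cloud gaming service.
FRAME_TIME_MS = 1000 / 60    # rendering at 60 fps

pipeline = [
    ("controller -> client device",        10.0),
    ("client -> cloud server (network)",   15.0),
    ("game logic + rendering (3 frames)",  3 * FRAME_TIME_MS),
    ("video/audio encode",                  5.0),
    ("server -> client (network)",         15.0),
    ("decode + render on client",          10.0),
    ("display signal processing",          30.0),
]

total = 0.0
for step, ms in pipeline:
    total += ms
    print(f"{step:36s} +{ms:5.1f} ms  (running total {total:6.1f} ms)")
```

With these assumed numbers the cloud pipeline lands around 135 ms, against the 90 ms local budget; the two network hops and the encode/decode steps are the extra cost.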

Differentiating between Input-Lag & Response Time

Input-lag and response time can easily be mixed up. Below are three pictures that hopefully clear things up.

Input-lag comparison between 2 monitors, from executed command to update on display.
Response time for a pixel to change state. Left pixel simulates pixel response time. Right pixel simulates how it actually looks in slow motion.

As you might have figured out, input-lag determines how fast the game responds to the user's actions, while response time correlates to the presentation of the image. With a slower response time on your display, more ghosting artifacts will appear between frames, since it takes longer for one frame to transition to the next. This means that the clean image that is supposed to be displayed for 16.7 ms is actually only clean for part of that time.

Frames 1–3 illustrate optimal response time, going from green to red to green again. Frames 4–6 illustrate how it actually looks today, showing that the complete frame only stays on for part of the 16.7 milliseconds, since it takes time to transition between the two states.
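As a rough illustration of how much of each frame is shown clean, subtract the pixel transition time from the frame time. The 4 ms transition time here is an assumed example value, not a figure from any specific panel:

```python
# Fraction of a 60 fps frame shown "clean", given a pixel transition time.
FPS = 60
frame_time_ms = 1000 / FPS     # ~16.7 ms per frame
transition_ms = 4.0            # assumed pixel response time (illustrative)

clean_ms = frame_time_ms - transition_ms
clean_fraction = clean_ms / frame_time_ms
print(f"clean image shown for {clean_ms:.1f} ms "
      f"({clean_fraction:.0%} of the frame)")
```

A slower panel with, say, an 8 ms transition would leave the clean image on screen barely half the frame, which is exactly the ghosting effect described above.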

Video Quality

Just like input-lag, video quality is an ongoing battle for cloud gaming. Probably an even bigger one, and one that will take much longer to conquer, since the world is fragmented when it comes to hardware and bandwidth.

When playing on dedicated local hardware directly connected to your monitor, a high-bandwidth signal is sent, displaying a crisp picture. This would be ideal for cloud gaming, but due to the fragmented bandwidth situation around the world, and the limited availability of hardware able to decode high-resolution, high-frame-rate video, it is not attainable, at least not for now.

Pristine low latency video

In an ideal world, an uncompressed high-bitrate signal would be sent to the user, delivering a super-high-quality picture indistinguishable from the one you get playing on local hardware. But due to bandwidth restrictions around the world, the bitrate needs to be reduced while keeping latency low and quality high. As you have already figured out, this is not an easy task.

Keeping latency to a minimum places restrictions on the video stream. B-frames cannot, or at least should not, be used, since they reference future frames and therefore force the decoder to buffer, greatly increasing latency. Other parts of the stream also need to be of lower complexity, so the encoder and decoder can handle the high frame rate and resolution in near real time.

With these restrictions, increasing the bitrate solves the quality issue, but it worsens the delivery problem, since most homes around the world do not have access to cheap, stable, high-bandwidth internet connections. And we are not talking about Full HD 7–8 Mbit/s streams here. We are talking 30+ Mbit/s, and that figure scales up as fps and resolution are raised.
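A rough way to see how that figure scales is to hold bits-per-pixel constant. The 30 Mbit/s baseline for 1080p at 60 fps follows the article's figure; the linear scaling with pixel throughput is a back-of-the-envelope assumption, not an encoder model:

```python
# Back-of-the-envelope bitrate scaling: assume bitrate grows roughly
# linearly with pixels-per-second at constant perceived quality.
BASE_MBIT = 30                     # assumed baseline for 1080p @ 60 fps
BASE_PPS = 1920 * 1080 * 60        # pixel throughput at 1080p60

def estimate_mbit(width, height, fps):
    """Scale the baseline bitrate by pixel throughput (rough heuristic)."""
    return BASE_MBIT * (width * height * fps) / BASE_PPS

print(f"1080p @ 120 fps: ~{estimate_mbit(1920, 1080, 120):.0f} Mbit/s")
print(f"4K    @  60 fps: ~{estimate_mbit(3840, 2160, 60):.0f} Mbit/s")
```

Doubling the frame rate doubles the estimate, and stepping up to 4K quadruples it, which is why fps and resolution increases hit the delivery problem so hard.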

Another way to attack the problem is to use a higher-performance codec. Today h.264 is the most common codec, and therefore it is the way to go for cloud gaming services that do not rely on their own dedicated on-premise hardware for decoding the signal. The reasoning is that most devices today have chips able to decode certain profiles of h.264 on the fly. However, if the user has a device with newer chips that support decoding of higher-performance codecs, then quality can be raised dramatically using the same bandwidth.

Some thoughts

Does cloud gaming fit into the same bucket as most things that sound too good to be true? Well, yes and no, depending on who you ask, but the future looks promising, and here is why.

First off, most casual gamers will never notice the extra input-lag. The same group usually doesn't care much about pristine video quality either. And on top of that, casual gamers tend to play games that put less stress on the codec, so the stream reaches good-enough quality at a lower bitrate.

However, if the goal is to please the gamers playing more demanding games on consoles or PC, then quality must be increased and input-lag reduced before cloud gaming can reach the mainstream. And for competitive gaming, which is steadily growing, cloud gaming will remain a big no-no, since any added obstacle that can stand in the way of victory is not accepted.

For countries with widespread, stable and fast broadband connections, this might be attainable in the near future, since the video-quality hurdle can partly be cleared by increasing the bitrate and by using newer codecs like h.265.

The input-lag obstacle can partly be solved with brute force by increasing the frame rate the game runs at. This naturally increases the performance needed to encode and decode the streams, as well as the need for even higher bandwidth, since more data needs to be transferred.
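The effect of this brute-force approach on the rendering share of the latency is easy to see, assuming the engine still needs about 3 frames from input to output as in the earlier example:

```python
# How raising the frame rate shrinks the rendering share of input lag,
# assuming the engine still needs ~3 frames from input to output.
RENDER_FRAMES = 3

for fps in (30, 60, 120, 240):
    frame_ms = 1000 / fps
    render_ms = RENDER_FRAMES * frame_ms
    print(f"{fps:3d} fps: frame {frame_ms:5.1f} ms, "
          f"render pipeline {render_ms:6.1f} ms")
```

Going from 60 to 120 fps halves the rendering contribution from 50 ms to 25 ms, but the encoder, decoder and network then have to move twice as many frames per second.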

Buying a newer monitor with faster signal processing or a “gaming mode” will also mitigate the problem by shaving some milliseconds off the input-lag. But beware: “gaming mode” will reduce picture quality, since less signal processing is done.

It is important to note that this article covers far from all the obstacles left on the road to a pristine cloud gaming experience. We haven't even touched on the topic of avoiding screen tearing with methods like V-Sync, G-Sync, FreeSync or Adaptive-Sync, which will most likely add their share of extra input-lag.

Eyevinn Technology is the leading independent consultant firm specializing in video technology and media distribution, and proud organizer of the yearly Nordic conference Streaming Tech Sweden.
