For many years, a PC's gaming performance depended on how much you invested in it: the more expensive the hardware, the more demanding the games it could run. But since game system requirements grow every year, that hardware investment also depreciates rapidly. Moreover, hardware is in short supply, and the situation isn't improving; even when availability does improve, prices stay high.
But there is a solution. With cloud gaming, you don't have to worry about how weak or powerful your PC is. You choose a game from the service and run it. The commands from your controller are sent to the server, processed by the game, converted into video and audio streams, and sent back.
Seems like too much work, too long, and too complicated? Let's find out.
When analyzing this topic, I used the “Cloud Gaming” and “Cloud gaming don’t mean a thing if you ain’t got that ping” articles as inspiration.
When playing games on PCs and consoles, video artifacts in the image and input latency from pressing a button to executing a command on the screen are kept to a minimum. Video quality is directly related to your hardware's processing power and the capabilities of your game engine. Latency mainly depends on the game's frame rate, the way the game engine processes the logic, and the signal processing time of the monitor.
It is difficult to determine the total latency for every game and hardware configuration. However, I will make a few generalizations to better illustrate the difference between local gaming and gaming in the cloud.
One of the main factors affecting latency is the frame rate at which the game is rendered. At 60 FPS, each frame is displayed for 16.7 ms.
So, here's what happens:
The user presses a button on the controller, and a signal is sent to the console/PC. 10 ms is a rough estimate: the delay depends on the specific controller, whether it's wired or wireless, and so on.
Based on the received input, the game logic is calculated and the resulting frame is sent to the display. A well-optimized modern game engine usually needs about 3 frames for this.
The average input delay for the display is about 30 ms.
So your gaming device, whether a PC or a console, takes about 90 ms to respond to your input. That's over 5 frames, or almost a tenth of a second.
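Those estimates can be added up in a quick back-of-the-envelope calculation. The values below are the rough figures quoted above, not measurements:

```python
# Rough local input-latency model (values are the article's estimates).
FRAME_TIME_MS = 1000 / 60  # one frame at 60 FPS ~= 16.7 ms

controller_ms = 10                     # controller signal to console/PC
game_pipeline_ms = 3 * FRAME_TIME_MS   # ~3 frames of engine processing
display_ms = 30                        # average display input lag

total_ms = controller_ms + game_pipeline_ms + display_ms
print(f"Local latency: {total_ms:.0f} ms "
      f"(~{total_ms / FRAME_TIME_MS:.1f} frames at 60 FPS)")
# -> Local latency: 90 ms (~5.4 frames at 60 FPS)
```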
Input latency in cloud gaming is different because many more processes are involved: on top of the local steps, your input has to travel over the network to the server, and the rendered frame has to be encoded, streamed back, and decoded. It's not so simple anymore.
If you add up all the components, the delay will be about 130 ms.
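As a sketch, here is one way those components might break down. The network round-trip and encode/decode figures are illustrative assumptions chosen so the sum matches the ~130 ms total; real values vary by service and connection:

```python
FRAME_TIME_MS = 1000 / 60  # one frame at 60 FPS

# Illustrative latency budget for cloud gaming (ms).
# Network and codec numbers are assumptions, not measurements.
latency_ms = {
    "controller":                10,
    "network round trip":        20,                  # assumed
    "game pipeline (3 frames)":  3 * FRAME_TIME_MS,
    "video encoding (server)":   10,                  # assumed
    "video decoding (client)":   10,                  # assumed
    "display":                   30,
}

total = sum(latency_ms.values())
print(f"Cloud latency: ~{total:.0f} ms")  # -> Cloud latency: ~130 ms
```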
And now, let's discuss the results of the “Cloud Gaming: Architecture and Performance” study, whose authors argue that the maximum interaction delay for many games should be no more than 200 ms, and for games that require a quick response, no more than 100 ms.
Some conclusions from this study:
Shooters and fighting games should have a latency of less than 100 ms, as they are particularly demanding in terms of responsiveness.
Role-playing games, such as World of Warcraft, should have a latency of no more than 500 ms. They are less sensitive to responsiveness but still require timely, reliable reactions to game-world events, such as healing a character or casting spells.
RTS games are even more forgiving of latency, sometimes tolerating up to 1,000 ms. It is no longer crucial for such players to see every action executed immediately. For example, players can queue up construction without needing instant feedback from the game.
Considering this, an increase in latency by 1.5 times compared to the local computer does not look critical. But then why is the first impression of cloud gaming usually negative?
And it's most often about the user's Internet connection. And the fact that your 100 Mbps is not necessarily 100 Mbps.
Next, let’s look at what can prevent cloud gamers from achieving optimal input latency.
Generally speaking, default Internet and IP routing do not guarantee reliable data delivery or quality of service. They also have some other limitations that make maintaining a low ping at all times a challenge.
There are many ways latency can creep in. Game packets are small (around 55 bytes) compared to standard Internet packets (1,500 bytes), so to move the same amount of data, IP routers have to process roughly 27 times as many packets (1,500 / 55 ≈ 27). Smaller packets are also dropped more often, because buffer limits are usually set by packet count rather than by size.
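The ~27x figure follows directly from the packet sizes. A quick sketch (the 10 Mbit/s stream is an arbitrary example):

```python
# How many packets per second a router must handle to carry the same
# bitrate with small game packets vs full-size Internet packets.
BITRATE_BPS = 10_000_000  # example: a 10 Mbit/s stream

def packets_per_second(packet_bytes: int, bitrate_bps: int = BITRATE_BPS) -> float:
    return bitrate_bps / (packet_bytes * 8)

game = packets_per_second(55)    # small game packets
bulk = packets_per_second(1500)  # typical full-size packets

print(f"55-byte packets:   {game:,.0f} pkt/s")
print(f"1500-byte packets: {bulk:,.0f} pkt/s")
print(f"ratio: ~{game / bulk:.0f}x")  # -> ratio: ~27x
```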
Other latency issues arise from the way IP networks calculate packet routes. The primary routing protocol on the Internet (BGP) can create circuitous paths across the network with more hops than necessary, and different paths for incoming and outgoing traffic. In addition, when it comes to peering, i.e. handing traffic off to other networks, BGP does not consider the receiving network's bandwidth or its real-time performance. It may choose different peering points for incoming and outgoing traffic. This can lead to inconsistent performance and high ping, especially for gamers whose traffic crosses several providers' networks.
Peering requires an understanding of the end-to-end paths from the host server to the gamers who use it. Thus, real-time network analytics are required to provide the best gaming experience. By understanding gaming traffic patterns and how traffic flows across the Internet and other networks, gaming companies can optimize end-to-end traffic flows between their servers, content distribution networks (CDNs), the Internet, and end-users.
Increasingly, cloud gaming companies are building their backbone networks to connect edge and core data centers that host game servers. This allows for more granular end-to-end control, ensuring the best performance for gaming applications. The Internet then acts only as the "last mile" for gamers.
This interconnectivity structure between data centers typically consists of routers and fiber trunks. The network's IP and optical layers must be coordinated to ensure that these connections are deterministic. A central software controller performs this role in modern software-defined networks (SDN). Analytics are also built into the system. So unlike BGP, we know the end-to-end route's performance, including peering points. The controller configures the routers and optical links according to specific performance policies and SLAs.
What happens to the signal on the way from the server to our homes, we have roughly figured out. Now let's get back to the more mundane things - the Internet in our apartment.
The Internet speed indicated in the service contract is the maximum possible speed that the operator allocates per channel. It will be relevant for you if you are the only one in the whole house who uses the Internet, and there are no obstacles in the way of the signal. The situation seems pretty utopian. You can measure the actual speed of your Internet, for example, using speedtest.net.
However, GFN's system requirements call for an Internet connection of at least 15 Mbit/s for 720p at 60 FPS and at least 25 Mbit/s for 1080p at 60 FPS. As for MY.GAMES Cloud, it requires 10 Mbit/s to run games at 720p and 30 FPS, and 25 Mbit/s at 120 FPS.
In an ideal world where Internet speeds always meet our needs, an uncompressed, high bitrate signal would be transmitted to the user, providing an ultra-high-quality image indistinguishable from the one played on local equipment. But due to bandwidth constraints around the world, bitrates need to be lowered while maintaining low latency and high video quality. As you have probably already realized, this is not an easy task.
Keeping latency to a minimum imposes restrictions on the video stream. B-frames cannot be used, for example, because they reference future frames and would increase the delay significantly. The rest of the stream should also stay simple enough that the encoder and decoder can handle high frame rates and resolutions in real time.
With these limitations, increasing the bitrate solves the quality problem. However, it increases the delivery problem because most homes worldwide still don't have access to cheap, stable Internet connections that have high bandwidth. And it's not even about Full HD and 7-8 Mbps. It's about 30+ Mbps. And that number only increases with higher frame rates and resolutions.
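To see why compression is unavoidable in the first place, compare the raw bitrate of an uncompressed 1080p60 stream (assuming 24-bit RGB, no chroma subsampling) with the ~8 Mbit/s of a typical compressed Full HD stream:

```python
# Raw vs compressed bitrate for a 1080p60 video stream.
width, height, fps = 1920, 1080, 60
bits_per_pixel = 24  # 8 bits per RGB channel, no subsampling (assumption)

raw_mbps = width * height * bits_per_pixel * fps / 1_000_000
print(f"Uncompressed 1080p60: ~{raw_mbps:,.0f} Mbit/s")
# -> Uncompressed 1080p60: ~2,986 Mbit/s

print(f"Compression ratio vs an 8 Mbit/s stream: ~{raw_mbps / 8:.0f}:1")
# -> roughly 373:1
```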
Another way to tackle the problem is a more efficient codec. Today, H.264 is the most common one. It is the natural choice for cloud gaming services that cannot rely on dedicated client-side decoding hardware, because most modern devices ship with chips that can decode certain H.264 profiles on the fly. However, if the user's device has newer chips that support more efficient codecs (such as H.265/HEVC), quality can be significantly improved at the same bandwidth.
Given these bandwidth requirements, we can guess that if home Internet is not always suitable for cloud gaming, mobile Internet is often even less so due to its technical limitations. However, 5G is expected to improve the quality of cloud gaming as it spreads.
The Internet signal inevitably suffers losses, including on the "last mile": inside your apartment or house.
The optimal connection, which eliminates some of that loss, is a direct Ethernet cable to a PC that you use alone. However, not all laptops even have an Ethernet port. And in general, a WiFi router has long been the standard solution for homes and offices, letting multiple devices share one network without a tangle of wires.
However, there are a few nuances when using WiFi routers:
As we've said before, the more people connected to the network, the slower a particular person's connection speed will be.
Most Internet routers operate at 2.4 GHz, the same frequency used by Bluetooth devices such as game controllers and headsets. Even an ordinary microwave oven interferes at this frequency. One way out is to connect over a different WiFi frequency, such as 5 GHz. But not all devices support it, so first make sure both your PC and your router can use it.
The signal from the router inevitably "fades" when moving around the apartment and colliding with various obstacles: ceilings, walls, doors, and furniture. The signal will be much better if your PC is within the "line of sight" of the router.
Connection over Ethernet cable is better than wireless connection with WiFi.
So, high ping and high latency come down to poor connection quality between the PC and the game servers. And while some of the problems can only be solved by the Internet service provider or the cloud gaming service, there are still things the user can do to improve the experience.
For example, if the connection leaves much to be desired, you can restart the router or reconnect the cable through which you connect to the Internet. In the case of a router, you should also temporarily disconnect other devices from the network: phones, TVs, "smart" appliances, and devices connected via Bluetooth. Finally, disable file downloads, close unnecessary browser tabs, and generally reduce the load on the operating system as much as possible, since background activity eats into network bandwidth and takes speed away from cloud gameplay.
Input latency can also be partially reduced by increasing the frame rate at which the game runs. Naturally, this raises the encoding and decoding requirements of the streams and demands even more bandwidth, since more data has to be transmitted.
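Assuming the ~3-frame game pipeline from earlier, a short sketch of how frame rate changes that portion of the latency:

```python
# How the ~3-frame game-pipeline portion of input lag shrinks as FPS grows.
def pipeline_ms(fps: int, frames: int = 3) -> float:
    """Time (ms) spent in a `frames`-deep pipeline at a given frame rate."""
    return frames * 1000 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} FPS: ~{pipeline_ms(fps):.1f} ms of pipeline latency")
# ->  30 FPS: ~100.0 ms
# ->  60 FPS: ~50.0 ms
# -> 120 FPS: ~25.0 ms
```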
Buying a monitor with faster signal processing or a "game mode" will also help, saving a few milliseconds of input delay. But keep in mind that "game mode" reduces picture quality, since less signal processing is applied.
So what about cloud gaming - is it that bad with latency? It all depends on who you ask. But its future definitely looks promising, and here's why.
First, most gamers play games in which the extra input lag is barely noticeable. As for more demanding titles (competitive games or those with modern realistic graphics), cloud gaming still has some way to go.
However, the share of support requests related to input delays has decreased noticeably. This suggests that Internet quality is improving, and with its further growth and the development of 5G, there will be even fewer network problems.
The second factor is server geolocation, which the services decide themselves: the closer the server is to the player, the lower the ping.