
The road to console-class performance in cloud gaming

GameBench Staff

PIQ 22.09

How the Chrome OS and GameBench teams collaborated to test, benchmark and launch three revolutionary devices

On 11th October 2022, Google announced the launch of the world’s first cloud gaming laptops.

Built by Acer, ASUS and Lenovo, these devices build on Chromebooks’ well-established reputation as fast, secure and easy to use. The market also expects their high performance, at low price points, to dramatically accelerate cloud gaming adoption.

The reaction from leading business and tech publications such as Forbes, TechCrunch, VentureBeat, PCWorld and TheGamer has been both immediate and highly positive.

So, precisely how do technological innovation and performance benchmarking enable a superb gamer experience? We go behind the scenes to unpack the engineering, testing and analytics that have, for the very first time, brought cloud gaming within reach of console-class performance.

(Readers new to Game Performance Management – or in need of a refresh – will find our concise two-page Reference Guide to All The Metrics That Matter a useful departure point.)

Delivering on the ambition

Here’s John Maletis, Vice President, ChromeOS Product, Engineering and UX:

“These laptops come loaded with features that are great for gaming, including 120Hz+ high resolution screens for crystal clear visuals, RGB gaming keyboards (on select models) with anti-ghosting capabilities for added speed and excitement, WiFi 6 or 6E for seamless connectivity, and immersive audio to draw you into your game.

“All cloud gaming Chromebooks have been independently tested and verified by leading game performance measurement platform, GameBench, to consistently deliver a smooth, responsive gaming experience with 120 frames per second and console-class input latency of under 85ms.”

The critical metrics of gamer experience

These points emphasize two performance metrics essential to game performance over networks: FPS and input latency. Why do these matter so much to the quality of gamer experience? What can go wrong? And therefore, what needs to go right?

Performance problems related to these two metrics are colloquially known – both within the industry and by experienced gamers – as Jitter and Jank. Jitter denotes negative impacts on video and audio quality, caused by delays in data packet arrival, commonly due to network congestion and/or route changes. Jank is an isolated, long pause between two frames, usually caused by dropped frames. (“Janky” is also used, more generally, by gamers to describe poor game quality.)

Frame rate

In performance quality terms, what the gamer sees on-screen while playing depends largely on frame rendering. This is measured using Median FPS, FPS Stability and Variability. Frame rate (FPS) expresses the number of frames shown in a given second. Tied to frame rate, frame time expresses the number of milliseconds taken to draw a single frame; for example, one frame every 16.6667ms corresponds to 60 frames per second. Frame rate variability, the average jump between consecutive frame rate readings taken each second, reflects the amount of variation in visual fluidity that a gamer experiences.
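To make these definitions concrete, here is a minimal sketch of how such metrics can be derived from a list of per-frame timestamps. It is illustrative only, not GameBench’s tooling; the function name, bucketing approach and sample trace are our own.

```python
from statistics import median

def frame_metrics(timestamps_ms):
    """Derive basic fluidity metrics from per-frame timestamps (in ms).

    Illustrative only: real tooling captures timestamps at the compositor
    or display level and applies more robust statistics.
    """
    # Frame time: the gap between consecutive frames (16.6667 ms ~= 60 FPS).
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

    # Per-second FPS readings: count the frames landing in each 1 s bucket.
    t0 = timestamps_ms[0]
    n_seconds = int((timestamps_ms[-1] - t0) // 1000) + 1
    fps_readings = [0] * n_seconds
    for t in timestamps_ms:
        fps_readings[int((t - t0) // 1000)] += 1

    # Variability: average jump between consecutive per-second FPS readings.
    jumps = [abs(b - a) for a, b in zip(fps_readings, fps_readings[1:])]
    variability = sum(jumps) / len(jumps) if jumps else 0.0

    return {
        "median_fps": median(fps_readings),
        "max_frame_time_ms": max(frame_times),  # one very large gap = jank
        "fps_variability": variability,
    }

# A steady 60 FPS trace: one frame every 16.6667 ms for two seconds.
steady_trace = [i * 1000 / 60 for i in range(120)]
print(frame_metrics(steady_trace))
# median_fps 60, max_frame_time_ms ~16.67, fps_variability 0.0
```

A single very large frame time in an otherwise steady trace is exactly the isolated long pause described above as jank.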

Input latency

The term ‘latency’ is used in a variety of contexts when measuring performance. The primary instances are:

• Game latency – time taken for a game to process e.g. mouse input;

• Display latency – time taken by the display to scan and render the output frame buffer;

• Network latency – time taken for a network packet to get to the server and back;

• End-to-end latency – time taken for an input action performed by the user to be reflected on the display.

GameBench is primarily concerned with end-to-end latency measurement: for example, the time between a player clicking the mouse button and seeing the muzzle flash of an in-game gun. When a game’s latency exceeds a user-perceived threshold, or varies significantly, this is referred to as ‘lag’.
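To see how these latency types relate, the toy breakdown below sums hypothetical per-stage timings for one cloud-gaming input event; the component names and figures are invented for illustration and are not measurements from this project.

```python
# Hypothetical per-stage timings for one cloud-gaming input event (ms).
# End-to-end latency is roughly the sum of the stages between the mouse
# click and the corresponding pixels (e.g. a muzzle flash) appearing on screen.
components_ms = {
    "input_processing": 5,       # game/client handles the mouse click
    "network_uplink": 15,        # packet travels to the cloud server
    "server_render_encode": 25,  # server simulates, renders, encodes the frame
    "network_downlink": 15,      # encoded frame travels back to the client
    "decode": 10,                # client decodes the video frame
    "display": 8,                # display scans out the frame buffer
}

end_to_end_ms = sum(components_ms.values())
print(f"End-to-end latency: {end_to_end_ms} ms")  # 78 ms, under the 85 ms bar
```

Note how network time appears twice (uplink and downlink), which is why cloud gaming is far more sensitive to the network than local play.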

Players refer to a game being ‘laggy’ when the following situations occur:

• Player movement is not smooth, and jumps back (or sometimes forward): this is also referred to as ‘rubberbanding’;

• There’s a noticeable delay in item interaction, i.e., between the user interacting with an object in the game, and an action occurring (e.g., opening a door);

• A player appears to get shot after they have already moved to a safe area, e.g., behind a wall.

For this project, we analyzed the input latency under ideal network conditions, as we were focused here on profiling the devices, rather than the network.

Benchmarking the performance

Representative network conditions

Core to the overall approach was a conscious effort to benchmark device performance in ideal conditions. Network factors such as the congestion and route changes noted above can frequently degrade the gaming experience. To neutralize these variables, the GameBench team ensured:

• Good quality broadband (200 Mbps up and down), verified before each run (a simple check is sketched after this list);

• Testing during a range of off-peak times.
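As one way to enforce the broadband criterion, a pre-run check along these lines can gate each test session. The sketch uses the speedtest-cli Python package; the 200 Mbps threshold comes from the setup above, while everything else is our own illustration (GameBench has not published its actual harness).

```python
import speedtest  # pip install speedtest-cli; illustrative, not GameBench's harness

def connection_meets_bar(min_mbps: float = 200.0) -> bool:
    """Return True if the connection meets the 200 Mbps up/down criterion."""
    st = speedtest.Speedtest()
    st.get_best_server()             # pick the lowest-latency test server
    down_mbps = st.download() / 1e6  # results are reported in bits/second
    up_mbps = st.upload() / 1e6
    print(f"Down: {down_mbps:.0f} Mbps, up: {up_mbps:.0f} Mbps")
    return down_mbps >= min_mbps and up_mbps >= min_mbps

if __name__ == "__main__":
    if not connection_meets_bar():
        raise SystemExit("Connection below the test bar; skipping this run.")
```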

Game selection

For rigorous performance benchmarking, we also needed to select games that demand good responsiveness for a good gameplay experience. We chose the following:

Destiny 2 – the free-to-play, online-only multiplayer first-person shooter, developed by Bungie;

Fortnite – the free-to-play battle royale game, offering modes for every level of player, developed by Epic Games.

Gameplay automation

To reliably measure key gaming performance metrics, we automated gameplay for both titles; a simplified sketch of such a script follows the list below. Automation enabled us to:

• Reduce human error;

• Provide a similar workload to all the test devices;

• Collect adequate performance data (and rule out potential outliers);

• Collect data during off-peak times, when game servers are likely to be less loaded.
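GameBench has not published its automation scripts, so the following is only a sketch of what a scripted, repeatable input workload can look like, here using the widely available pyautogui input-injection library. The movement pattern and timings are invented for illustration.

```python
import time

import pyautogui  # generic input-injection library; not GameBench's tooling

def run_scripted_workload(duration_s: int = 60) -> None:
    """Replay a fixed input pattern so every device under test receives a
    similar workload. The pattern itself is arbitrary; what matters for
    benchmarking is that it is identical on every device and every run."""
    end = time.time() + duration_s
    while time.time() < end:
        # Strafe left and right to keep the scene changing.
        pyautogui.keyDown("a"); time.sleep(0.5); pyautogui.keyUp("a")
        pyautogui.keyDown("d"); time.sleep(0.5); pyautogui.keyUp("d")
        # Sweep the camera and fire, exercising both render and input paths.
        pyautogui.moveRel(200, 0, duration=0.25)
        pyautogui.click(button="left")
        pyautogui.moveRel(-200, 0, duration=0.25)

if __name__ == "__main__":
    time.sleep(5)  # a few seconds to bring the game window into focus
    run_scripted_workload(duration_s=60)
```

Running the identical script on every device, at the same off-peak times, is what makes the resulting metrics comparable across laptops.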

Collection of performance data

The GameBench team had already developed ChromeOS support for its powerful ProNet range of tools. These were used, in conjunction with the gameplay automation, to record the key performance metrics detailed above.

Badging the experience

Having collected and analyzed all the test data, the next critical step was to convert the results into gamer experience badges across two key areas, with an illustrative mapping sketched after this list:

• Responsiveness – which includes input latency;

• Fluidity – mainly derived from frame rate metrics.
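As a rough illustration of how measured metrics might feed such badges, the sketch below maps median latency and frame rate figures to tiers. Only the ‘Great’ and ‘Ultra’ names, the 85ms bar and the 120 FPS figure come from this article; the other tier names and thresholds are invented for illustration.

```python
def responsiveness_badge(median_latency_ms: float) -> str:
    """Map median input latency to a badge tier. Only the 85 ms
    console-class bar and the 'Great' name come from the article."""
    if median_latency_ms < 85:
        return "Great"
    if median_latency_ms < 120:
        return "Good"   # hypothetical tier
    return "Fair"       # hypothetical tier

def fluidity_badge(median_fps: float, fps_stability_pct: float) -> str:
    """Map frame rate metrics to a badge tier. Only the 120 FPS figure
    and the 'Ultra' name come from the article."""
    if median_fps >= 120 and fps_stability_pct >= 95:
        return "Ultra"
    if median_fps >= 60:
        return "Smooth"  # hypothetical tier
    return "Basic"       # hypothetical tier

print(responsiveness_badge(82))  # 'Great', e.g. Destiny 2 at 80-85 ms
print(fluidity_badge(120, 97))   # 'Ultra'
```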

Here’s Karthik Hariharakrishnan, GameBench founder and CTO:

“Using the methodology and tools outlined above, we concluded that these new cloud gaming Chromebooks achieved a ‘Great’ badge for latency, and an ‘Ultra’ badge for fluidity. Both were among the best badges we have so far seen in our work on cloud gaming.

“Perhaps most exciting, they were all able to match or exceed the latency we measured on the Xbox Series X (with an Xbox One Controller). Destiny 2 measured 80–85ms and Fortnite 95–100ms.”

Further reading

Readers looking to dig deeper into the specific performance management topics outlined above will find the following Performance IQ bulletins informative.

Lag and Online Multiplayer Gamer Experience
Network and Game Performance – Time To Close The Gap
Disruptors To Disrupted – Pressure On Networks Ramps Up
What Will It Take To Be The #1 Mobile Network For Cloud Gaming?
Network Performance and Gamer Experience (US edition)
Network Performance and Gamer Experience (UK edition)

You’ll also find streamable recordings of related GameBench PIQ webinars here.

Performance IQ by GameBench is a hassle-free, one-minute read that keeps you abreast of precisely what you need to know. Every two weeks, right to your inbox. Here’s a quick and easy sign up for your personal invite to the sharp end of great gamer experience.



And of course, get in touch anytime for an informal chat about your performance needs.

 

The intelligence behind outstanding gaming, streaming and network performance