Announcing GameBench SDK for Unity and Game Performance Monitoring (GPM)
There are many ways to test a mobile game, from compatibility testing through to subjective testing (i.e., determining whether it’s fun or not). But one aspect of the QA process that is still quite new, and hence doesn’t always get the attention it deserves, is performance testing.
The need for performance testing has arisen in response to recent demand for more premium mobile game experiences — in other words, games that deliver high levels of visual quality and fluidity, and which increasingly do so in combination with other intensive tasks (such as AR or VR, physics simulation or sophisticated AI).
Given the newness of this discipline, I think it’s worth pinning down an essential checklist of six common pain-points which we frequently encounter here at GameBench, and which any meaningful performance test should take into account.
When a studio sets a target frame rate for a game’s animation, usually at either 30 or 60 frames per second (fps), it’s essential that this target matches the game’s genre and that it is achievable on popular devices.
If a game relies on physics simulation (like Angry Birds 2) or rapid reflexes (like Crossy Road), it should probably stick very closely to 60fps, because 30fps won’t deliver the necessary realism or responsiveness. If it’s a slower-paced game, perhaps a platformer like CounterSpy or a turn-based game like Hearthstone, then 30fps might be the more sensible target, since 60fps will consume extra system resources (in some examples we’ve seen, close to 100 percent more GPU usage and 30 percent more CPU) without significantly improving UX.
Whichever target you set, it’s worth bearing in mind that the generally-accepted industry standard is 30fps or higher, and even momentary slow-downs below this threshold are cause for concern — although some types of game are more forgiving than others.
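Put in frame-time terms, a 30fps target leaves roughly twice the per-frame budget of a 60fps one, which is why the higher target consumes so much more GPU and CPU. A minimal sketch (the formula is standard arithmetic, not anything GameBench-specific):

```python
def frame_budget_ms(target_fps: float) -> float:
    """Per-frame time budget in milliseconds for a given fps target."""
    return 1000.0 / target_fps

# A 60fps game must finish every frame within ~16.7ms;
# a 30fps game has a roomier ~33.3ms budget.
print(round(frame_budget_ms(60), 1))  # 16.7
print(round(frame_budget_ms(30), 1))  # 33.3
```

Any single frame that blows its budget pushes the game below target for that moment, which is why the per-second average can hide trouble.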
Example: Lara Croft: GO may only be a puzzle game, but it’s excruciating at anything less than 20fps because Lara takes ages to complete the actions you’ve laid out for her. Some popular devices like the Google Nexus 7 and Tesco Hudl 2 render the game at just 14-18fps.
Practical tip: Profile the most popular game in the same category as yours and use it as a benchmark.
There’s little point in having a high average frame rate if it’s not stable. A game will feel better if it sticks closely to 30fps all the way through, rather than rendering some levels at 40fps and some at 20fps — and yet this pattern of instability is surprisingly common.
There are two easy ways to measure frame rate stability, and it’s good to pay attention to both of them.
Additionally, it helps to look at “janks,” which describe significant differences between the draw times of individual frames. Janks can indicate areas of a game where there is a risk of user-observable micro-stutter, even while the average frame rate (which is recorded at the level of a whole second) is comfortably high.
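One simple way to surface candidate janks from raw per-frame draw times is to flag frames that take far longer than their neighbours. The "more than twice the median" rule below is an illustrative heuristic of ours, not GameBench's exact definition:

```python
from statistics import median

def find_janks(frame_times_ms, factor=2.0):
    """Return indices of frames whose draw time exceeds `factor` times
    the session's median frame time: a rough proxy for micro-stutter."""
    baseline = median(frame_times_ms)
    return [i for i, t in enumerate(frame_times_ms) if t > factor * baseline]

# A steady ~16ms stream with one 50ms spike: the per-second average fps
# still looks healthy, but the spike is user-observable.
times = [16, 17, 16, 50, 16, 17, 16]
print(find_janks(times))  # [3]
```

This is exactly the case where averages mislead: one 50ms frame among 16ms frames barely moves the per-second fps figure but is visible to the player.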
Example: Vainglory is marketed as a 60fps game. On the LG G4 you’ll probably get an average frame rate of 50fps for a 15-minute session — which seems healthy enough. What’s actually happening, however, is that the game plays at 60fps at the start and then steadily drops to just 40fps, noticeably impacting UX. This is captured by the low stability score of just 43 percent.
Practical tip: Play games for realistic session durations, which vary according to the genre of the game. Problems like temperature-triggered chipset throttling (when a processor slows itself down to avoid overheating) only tend to become apparent after the five-minute mark.
Whenever we find games with bad frame rates, the graphics processor (GPU) is often the source of the bottleneck. On the other hand, we do sometimes encounter pure software optimisation issues where the GPU is barely being taxed and yet frame rate is still low.
In general, if you’re failing to meet frame rate targets and peak GPU usage is higher than 90 percent, you can be pretty sure that the device’s GPU is simply failing to keep up with the loads you’re putting on it — so you need to find a way to reduce that load. If peak GPU usage is lower than 90 percent, look for other possible causes, such as software bugs.
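That 90 percent rule of thumb is easy to encode as a first triage step. The thresholds come straight from the text; the function itself is a hypothetical illustration, not part of any GameBench API:

```python
def diagnose_low_fps(avg_fps: float, target_fps: float, peak_gpu_pct: float) -> str:
    """Rough triage for a session that misses its frame rate target,
    using the 90% peak-GPU rule of thumb."""
    if avg_fps >= target_fps:
        return "on target"
    if peak_gpu_pct > 90:
        return "likely GPU-bound: reduce rendering load"
    return "GPU has headroom: suspect software issues (e.g. driver or engine bugs)"

print(diagnose_low_fps(22, 30, 97))  # GPU-bound: cut the rendering load
print(diagnose_low_fps(15, 30, 20))  # headroom left: look for a software bug
```

The second case mirrors the GS7 Edge example below: low frame rate with an idle GPU pointed to a driver-level bug rather than rendering load.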
Example: When the Samsung GS7 Edge launched, it could only play some Unity games at 15fps — while the non-Edge version of the phone (with an identical chipset) delivered the same games at 30fps. Despite its slowness, the GS7 Edge showed very low levels of GPU activity, indicating the presence of a bug. As it turned out, this bug existed in the relationship between the Unity game engine and ARM’s Mali GPU drivers — Samsung promptly fixed it and now the affected games run perfectly.
Practical tip: When you see frame rate drops and janks clustered together on a session timeline, it’s right to suspect that there’s a genuine GPU bottleneck.
The vast majority of current mobile games use less than a third of available CPU capacity on average across a session — and anything much higher than this can increase the risks of temperature rises, excessive battery drain and performance throttling. On the other hand, quite a few games do cause short-lived surges in CPU usage which can cause temporary slowdowns in the UX, so a peak reading of anything over 90 percent can also be a red flag.
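Those two red flags, an average above roughly a third of capacity and a peak above 90 percent, can be checked directly against a session's CPU samples. The threshold values are taken from the text; the helper itself is an illustrative sketch:

```python
def cpu_red_flags(samples_pct):
    """Flag a session whose average CPU usage exceeds ~33% of capacity
    or whose peak exceeds 90%: both risk heat, battery drain and throttling."""
    avg = sum(samples_pct) / len(samples_pct)
    peak = max(samples_pct)
    flags = []
    if avg > 33:
        flags.append("high average CPU")
    if peak > 90:
        flags.append("CPU usage spike")
    return flags

# A mostly modest session with one loading-screen surge (the Minecraft pattern):
print(cpu_red_flags([15, 18, 20, 100, 19, 16]))  # ['CPU usage spike']
```

Note that a short surge can trip the peak check without moving the average much, which is exactly why both readings matter.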
Example: Minecraft tends to use less than 25 percent of available CPU on average, but it is very CPU-intensive when loading up a new level, hitting 100 percent even on high-powered devices like the OnePlus 3 and lengthening loading times significantly. Moreover, Minecraft sessions with temporary yet sustained bouts of heavy CPU usage often show serious frame rate drops at the same time.
Practical tip: CPU profiling should be done on release versions of games, not debug versions (where CPU usage will tend to be inflated). Also, middleware (or sub-optimal use of middleware) can often be a source of excessive and unnecessary CPU instructions that push average CPU usage too high.
A large memory footprint increases the chance of the game crashing or being shut down by the operating system. This is more commonly an issue on Android devices, where RAM requirements are invariably larger than on iOS, but fortunately newer Android devices tend to have larger RAM complements anyway. In general, you’re looking for a RAM footprint that is stable (i.e., not continually growing as the session progresses, such that longer sessions are more likely to crash) and appropriate to the list of devices with which your game claims to be compatible.
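A crude way to test that a footprint is "stable" in this sense is to compare memory samples from early and late in the session; leak-like behaviour shows up as sustained growth. The 10 percent threshold here is an arbitrary illustration, not a GameBench figure:

```python
def memory_looks_leaky(ram_mb_samples, growth_threshold=0.10):
    """Compare the average footprint of the last quarter of a session
    against the first quarter; sustained growth beyond the threshold
    suggests the footprint isn't stable."""
    n = len(ram_mb_samples)
    q = max(1, n // 4)
    early = sum(ram_mb_samples[:q]) / q
    late = sum(ram_mb_samples[-q:]) / q
    return (late - early) / early > growth_threshold

print(memory_looks_leaky([400, 410, 405, 412, 408, 415, 410, 412]))  # False
print(memory_looks_leaky([400, 450, 520, 600, 680, 760, 850, 940]))  # True
```

The second session is the dangerous pattern: the longer the player keeps going, the closer the game creeps towards an OS-enforced shutdown.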
Example: Vainglory uses in excess of 1.1GB of RAM on some Android devices, increasing the likelihood of a crash. If a crash happens, players are penalised twice — they’re ejected from their current match and then their “karma” score is deducted on the assumption that they quit deliberately. Interestingly, there are some Vainglory sessions in our database that consume much less than 1GB on Android, and less than 200MB on iOS — suggesting that huge memory usage may not be strictly necessary to this game.
Practical tip: It’s probably okay to have large RAM usage if your target audience mostly uses high-end devices. Then again, if background processes reduce available RAM and cause a crash, players will still probably blame your game rather than the background processes.
It’s worth looking at two types of power metric: user-perceived battery drain and instantaneous system power. The former is reported by the OS and affects whether the user believes your game is a battery hog — based on the battery percentage reported to them in the top right corner of their device. The latter is reported directly by Snapdragon and Exynos chipsets and is more accurate.
On a smartphone, your game will be more prone to bad reviews if it drains people’s batteries at more than 25 percent per hour, or if average instantaneous power is above 2W — simply because this will mean its power appetite is greater than what mobile gamers have come to expect. If your game is designed for lengthy sessions, you need to be more efficient.
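The two red lines, 25 percent of battery per hour and 2W of average instantaneous power, translate into a simple check. Thresholds are from the text above; the helper is a hypothetical sketch:

```python
def battery_red_flag(pct_drained: float, minutes: float, avg_power_w: float) -> bool:
    """True if the session drains battery faster than 25%/hour or
    averages more than 2W of instantaneous power."""
    drain_per_hour = pct_drained / (minutes / 60.0)
    return drain_per_hour > 25 or avg_power_w > 2.0

# The Pokemon Go pattern: ~30% in an hour at ~2.6W trips both limits.
print(battery_red_flag(30, 60, 2.6))  # True
print(battery_red_flag(10, 60, 1.5))  # False
```

Normalising to an hourly rate also catches short sessions: 15 percent drained in 30 minutes is the same 30%/hour pace, even though the raw percentage looks modest.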
Example: Pokemon Go burns 2.6W on some devices, easily consuming 30 percent of battery capacity in an hour — and hence it is widely regarded as a battery hog (even with Battery Saver mode enabled).
Practical tip: Ensure that all device and game settings are matched when comparing power consumption across sessions — small differences can have a big impact on consumption.