So far in my series, we’ve looked at mobile performance testing in general, as well as free tools for mobile performance testing. Now it’s time to discuss paid tools. Obviously, Appium is free and open source, and it’s nice to focus on and promote free tools. But at the end of the day, sometimes paid tools and services can be valuable, especially in a business setting where the budget is less important than the end result. I want to make sure you have some decent and reliable information about the whole ecosystem, which includes paid tools, so I will from time to time let you know my thoughts on this as well!
Disclaimers: This is not intended to be completely comprehensive, and even for the services I cover, I will not attempt to give a full overview of all their features. The point here is to drill down specifically into performance testing options. Some services might not be very complete when it comes to their performance testing offering, but that doesn’t mean they don’t have their strong points in other areas. For each tool, I’ll also note whether I’ve personally used it (and am therefore speaking from experience) or whether I’m merely using their publicly-available marketing information or docs as research. Finally, I’ll note for transparency that HeadSpin is an Appium Pro sponsor, and in the past I worked for Sauce Labs.
Most interesting for us as Appium users will be services that allow the collection of performance data at the same time as you run your Appium tests. This is, in my opinion, the ‘holy grail’ of performance testing: you don’t have to do anything in particular to get your performance test data! In my research I found four companies that in one way or another touch on some of the performance testing topics we’ve discussed as part of their Appium-related products.
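For contrast, with the plain open-source stack you do have to ask for performance data explicitly. On Android (with the UiAutomator2 driver), the Appium Python client exposes a `get_performance_data` call that returns a header row followed by value rows. A minimal sketch of reshaping that result into something usable (the server URL and package name below are placeholders, and the helper function is my own, not part of the Appium client):

```python
def rows_to_dict(perf_rows):
    """Appium's get_performance_data returns a header row followed by
    value rows; zip the first value row against the header."""
    header, values = perf_rows[0], perf_rows[1]
    return dict(zip(header, values))

# In a live session (requires a running Appium server and an Android device):
#   from appium import webdriver
#   driver = webdriver.Remote('http://localhost:4723', options=...)
#   cpu = rows_to_dict(driver.get_performance_data('com.example.app', 'cpuinfo', 5))

# Offline illustration of the return shape (a header row plus a value row;
# the exact columns vary by data type and device):
print(rows_to_dict([['user', 'kernel'], ['12.5', '3.1']]))
```

The point of services in this category is that you get equivalent data without sprinkling calls like this throughout your test code.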
I’ve used this service.
HeadSpin is the most performance-focused of all the service providers I’ve seen. Indeed, they began life with performance testing as their sole mission, so this is not surprising. Along the way, they added support for Appium and have been heavy contributors to Appium itself in recent years. HeadSpin supports real iOS and Android devices in various geographical locations around the world, connected to a variety of networks.
When you run any Appium test on HeadSpin, you get access to a report that looks something like the one here. Basically, HeadSpin tracks a ton of different metrics in what they call a performance session report. The linked doc highlights some of the metrics captured:
- Network traffic (in HAR and PCAP format)
- Device metrics, including CPU, memory, battery, signal strength, screen rotation, frame rate, and others
- Device logs
- Client-side method profiling
- Contextual information, such as Appium commands issued during the test, what app was in the foreground, and other contextual metadata
On top of this, they try to show warnings or make suggestions based on industry-wide targets for various metrics. And this is all without your having to instrument your app or issue specific commands to retrieve specific metrics. HeadSpin covers the most metrics out of any product or company I’m familiar with, including really difficult-to-measure quantities like the MOS (mean opinion score) of a video as a proxy for video quality.
One other feature useful for performance testing is what they call the ‘HeadSpin bridge’, which allows bridging remote devices to your local machine via their `hs` CLI tool. Concretely, this gives you the ability to run `adb` against the remote device as if it were local, so any Android-related performance testing tools that work on a local device will also work on their cloud devices, for example the CPU Profiler. The bridge works similarly for iOS, allowing step debugging or Instruments profiling on remote devices, as if they were local.
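Because bridged `adb` behaves like a local one, its output can be fed straight into ordinary scripts. As a minimal sketch (the sample output below is illustrative, and the parsing helper is my own, not part of any HeadSpin tooling), here is how you might turn `adb shell dumpsys battery` output into a Python dict:

```python
import re

def parse_dumpsys_battery(text):
    """Parse the indented 'key: value' lines from `adb shell dumpsys battery`."""
    stats = {}
    for line in text.splitlines():
        m = re.match(r'\s+([A-Za-z][\w ]*):\s*(\S+)', line)
        if m:
            key, val = m.group(1), m.group(2)
            # Convert plain integers (e.g. level, temperature) to ints
            stats[key] = int(val) if val.lstrip('-').isdigit() else val
    return stats

# Illustrative sample; in practice you would capture it with something like:
#   text = subprocess.check_output(['adb', 'shell', 'dumpsys', 'battery'], text=True)
sample = """Current Battery Service state:
  AC powered: false
  USB powered: true
  level: 87
  temperature: 250
"""
stats = parse_dumpsys_battery(sample)
print(stats['level'], stats['temperature'] / 10)  # temperature is in tenths of a degree C
```

The same idea applies to any other `dumpsys` service or `adb`-based tool you already use locally.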
In terms of network conditioning, HeadSpin offers a number of options that can be set on a per-test basis, including network shaping, DNS spoofing, header injection (which is super useful for tracing network calls all the way through backend infrastructure), and more. This is in addition to the ability to request devices in specific geographical regions.
Finally, it’s worth pointing out that HeadSpin tries to make it easy for teams to use the performance data captured on their service by spinning up a custom “Replica DB” for each team, into which performance data is automatically written. This “open data” approach means it’s easy to build custom dashboards and other tooling on top of the data without having to pre-process it or host it yourself.