Can You See My Screen?

Dial into a seamless video and audio conference experience on mobile and web

Special thanks to William Maio, Abhishek Dankar, Rathna Govi and Sahil Kapur for contributing to this report.

This is part one of a 2-part series on how the top video conferencing apps perform on iOS and Android devices around the world. This post covers the iOS apps; click here for the Android report!

In a typical week, our global teams at HeadSpin use nearly every video conferencing tool across web and app, depending on which ones our customers are most comfortable with. There is no doubt that dependency on these tools has grown rapidly, as everyone around the world adjusts to the new normal, namely #WFH, #StudyFromHome, #WorkoutFromHome, or just #StayHome. We decided to run the popular video conferencing apps (Zoom, Microsoft Teams, BlueJeans, Webex) through the HeadSpin platform and see which fared better from a performance standpoint.

These platforms, like many others, are racing to launch new capabilities to accommodate the rising needs of their consumers. They may not have been built to handle such massive demand, arriving in so short a period, from businesses of all shapes and sizes. Moreover, the actual user experiences differ widely from one app to another.

Our goal with this post is not to point fingers at the apps or hunt for issues. We work with some of the largest consumer- and enterprise-focused companies in the world, and our approach has always been to measure the user experience first. If companies find this useful, we can help identify the root causes and work with them to improve the user experience.

First, we’ll look at what real users have shared in the last couple of months about their experience with video conferencing apps. Then, we’ll review the performance metrics measured for a typical meeting on these platforms, such as time to join, screen-share quality, and battery drain. Finally, we outline next steps to get more detailed performance insights and peer comparisons on real devices and network conditions in locations of your choice.

All these apps have strong ratings on the App Store; however, here are some reviews that highlight concerns with core functionalities.

Zoom (4.6 stars, 157K ratings)       

“Terrible audio an video”
I use this app to do online dance classes, now that we are all quarantined, but I can’t make out what the teacher is saying or doing. The audio only works off and on and the video is always blurry.

Bluejeans (4.6 stars, 2.6K ratings)

“Can’t join the meeting from my phone. The app spins for 1-s then nothing happen”

Microsoft Teams (4.8 stars, 362K ratings)

“Last update is a fail.”

It usually cannot start, just displaying a big logo. You cannot switch back to the app screen without killing it. In video conference, you cannot see shared screen. In such time, Microsoft should really check the quality of its apps!

Webex (4.3 stars, 98K ratings)

“Unreliable.”
Our company acquired a WebEx license for firm employees to use among staff and with our clients. Our internal meetings in the late west coast afternoon work okay, not great, but mornings did not work at all. Also, video conference with clients did not work at all. Either they could not see any video, or they could see but not be seen, or they could not get sound….

Methodology

HeadSpin compared the latest versions of popular VC apps in the past week over WiFi, using an iPhone XS Max located in Palo Alto, US, and an iPhone 6 located in Tokyo, JP. Our tests were performed on the free versions of all the apps, except for Webex, which required a paid plan to create a meeting. No SDK is needed to run performance sessions. HeadSpin’s AI engine sifts through the network traffic, client-side metrics, and videos of the test execution to find areas of poor user experience and performance bottlenecks. On the HeadSpin platform, recommendations are provided for every issue that surfaced. You can collect performance data through the HeadSpin Remote Control or run automation tests.

Steps to recreate a typical user journey:

  1. The laptop starts the meeting and shares its screen
  2. Both iPhones, in the US and JP, join the meeting and view the screen share

Metrics captured during each session:

  - Video quality, as measured by the Mean Opinion Score (MOS) for video streaming, on a 1-to-5 scale
  - Bytes coming in while the two devices received the screen share from the laptop; all tests were run on WiFi to simulate the real user experience
  - Battery drain on each device over a 25-minute session
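As a sketch of how a "join to first frame" measurement can be derived, the snippet below computes it from timestamped session events. The event labels and timestamps are illustrative, not HeadSpin API names:

```python
from datetime import datetime

def join_to_first_frame(events):
    """Seconds from tapping 'Join' until the first screen-share
    frame renders, given (label, ISO-8601 timestamp) event pairs."""
    ts = {label: datetime.fromisoformat(stamp) for label, stamp in events}
    return (ts["first_frame_rendered"] - ts["join_tapped"]).total_seconds()

# Hypothetical event log from one session:
events = [
    ("join_tapped", "2020-04-20T10:00:00"),
    ("first_frame_rendered", "2020-04-20T10:00:07"),
]
print(join_to_first_frame(events))  # 7.0
```

In practice these timestamps would come from analyzing the session video and network capture; the arithmetic is the same either way.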

The download speeds, tested on fast.com, are shown below:

Palo Alto

[Screenshot: fast.com download speed, Palo Alto]

Tokyo 

[Screenshot: fast.com download speed, Tokyo]

Insights

Join to First Frame

BlueJeans had the quickest time to join the meeting and receive the screen share on both devices. The first two attempts to join the call on Teams failed, and it lagged slightly behind its peers, while Webex had by far the worst performance: it took a full minute for the Japan device to join the meeting.
[Chart: Join to first frame]

Best performer: BlueJeans
Worst performer: Webex

Energy Impact

We ran a 25-minute meeting on the iPhone XS Max, with the laptop sharing its screen throughout, to measure battery drain. Webex was once again the worst performer in this category, draining nearly 6% of the battery over the 25-minute session. Zoom was the best performer of the bunch.

[Chart: Energy impact]

Best performer: Zoom
Worst performer: Webex
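Battery drain figures from sessions of different lengths are easiest to compare when normalized to a common rate. A minimal sketch, where the 6% figure echoes the Webex result above and the rest is plain arithmetic:

```python
def drain_per_hour(percent_drained, session_minutes):
    """Normalize observed battery drain to percent-per-hour so
    sessions of different lengths are comparable."""
    return percent_drained * 60.0 / session_minutes

# ~6% drained over a 25-minute session, as observed for Webex:
print(drain_per_hour(6.0, 25))  # 14.4
```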

MOS Score — Video Quality

The MOS, or Mean Opinion Score, is a holistic subjective quality score that represents the perceptual quality of a video as perceived by an end user. HeadSpin’s mean opinion score ranges from 1 to 5, with 1 being very poor and 5 being excellent. The HeadSpin AI engine can measure how streaming video quality evolves over the course of a test and flag regions of poor user experience without any reference to the source video content. Without AI, a team would either have to show a video to a pool of users and aggregate their feedback, or curate a high-quality reference video for each video to be evaluated. Not only are full-reference video quality metrics expensive and difficult to maintain, but many rich media applications, such as live video and game streaming, have no reference to compare to. You can learn more about our reference-free MOS here.

For the MOS score scale of 1 to 5 given to each frame of the video, we analyzed 30 seconds of video on both the Japan and US devices as the laptop was sharing its screen. Below are the summary statistics:

US and JP device MOS score summary over 30 seconds while the laptop shared its screen:

[Table: MOS summary, US device]
[Table: MOS summary, Japan device]

Below are the same stats when the US device shared its screen, as viewed by the Japan iPhone:

[Table: Japan device MOS during US screen share]
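Summary statistics like those in the tables above can be derived from per-frame MOS values. A minimal sketch using Python's statistics module, with made-up frame scores for illustration:

```python
from statistics import mean, pstdev

def mos_summary(frame_scores):
    """Summarize per-frame MOS values (1-5 scale) over a capture window."""
    return {
        "mean": round(mean(frame_scores), 2),
        "std": round(pstdev(frame_scores), 2),
        "min": min(frame_scores),
        "max": max(frame_scores),
    }

scores = [4.1, 3.8, 4.4, 2.9, 4.0, 3.6]  # hypothetical per-frame MOS
print(mos_summary(scores))
```

A high standard deviation here is itself a signal: it means quality is fluctuating during the session, which users notice even when the average looks acceptable.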

Zoom had the highest variation in MOS score on iOS devices, along with the lowest average score. We have also experienced audio/video sync issues, and often resort to dialing into a Zoom call from our mobile phones while using the laptop to view the screen share.

Best performer: Teams
Worst performer: Zoom

Bytes In When Receiving Screenshare

[Chart: Bytes in while receiving screen share]

This chart shows the bytes coming in during a 30-second period in which each device received the screen share from the laptop. It is interesting to note that Zoom had the lowest values here, which is in line with what we have seen from the app so far: Zoom’s low bandwidth usage is likely one of the ways it keeps energy consumption down.
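The windowed aggregation behind a chart like this can be sketched as follows. The packet tuples are a simplified stand-in for parsed capture data, not the HeadSpin API:

```python
def bytes_in_window(packets, start, end):
    """Sum inbound byte counts whose timestamp (in seconds) falls
    within the half-open window [start, end)."""
    return sum(size for ts, size in packets if start <= ts < end)

# (timestamp_seconds, inbound_bytes) pairs from a hypothetical capture:
packets = [(0.5, 1200), (10.2, 950), (29.9, 1400), (31.0, 800)]
print(bytes_in_window(packets, 0, 30))  # 3550
```

Note the packet at 31.0 s is excluded; a half-open window avoids double-counting when adjacent windows are tiled across a session.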

The above metrics are just a few of the key performance indicators (KPIs) we looked at. These apps also care about other comprehensive video and audio streaming KPIs, as well as coverage across devices, OS versions, network conditions, locations, and server-side load testing. The HeadSpin platform can automatically diagnose server-side issues that arise due to infrastructure deployment, poor performance, or API errors. These tests can run on a 24/7 basis, with thousands of daily tests on real devices around the world, to ensure your users continue to experience uninterrupted video conferencing.

Reach out to us if you are interested in discussing more with our team, or check out the other comparison reports we’ve done here!
