This is part two of a series on how the top video conferencing apps perform on iOS and Android devices around the world. We cover Android apps in this post; click here for the iOS report. The HeadSpin Waterfall UI can evaluate performance across many types of devices, from desktop browsers to Amazon Fire Sticks.
Since we last wrote about the technical performance comparison on iOS devices, these companies have all released many new features to keep up with user needs and stay competitive. Even Google Meet, late to the game, launched some much-needed features.
At HeadSpin, we launched HeadSpin Pulse, a live performance monitoring feed that measures launch time for essential services, apps, and sites on iOS and Android across top US carriers. Check out the feed for the top video conferencing apps.
Our goal with this post is not to point fingers at any of the apps or to hunt for issues. We work with some of the largest consumer- and enterprise-focused companies in the world, and our approach has always been to measure the user experience first. If companies find this useful, we can help identify root causes and work with those businesses to improve the user experience.
First, we’ll look at what real users have shared in the last couple of months about their experience with video conferencing apps. Then, we’ll review the performance metrics measured for a typical meeting on these platforms, such as time to join and screen share. Finally, we provide next steps to get more detailed performance insights and peer comparisons on real devices and network conditions in locations of your choice.
We noticed that user reviews are more positive on iOS than Android.
Play Store Reviews
“It lags behind and the video is not as clear as when I go through the website. However, recently I have not been able to utilize the website and I’ve had to use the app. The website did an update that is apparently not compatible with my Pixel Slate. When I’ve contacted customer service I have been told to just keep uninstalling and reinstalling the app. I’ve done that at least a dozen times and it is still no better.”
Microsoft Teams (4.4 stars, 553K ratings)
“App is somewhat lagging with bugs..when join the team and opening other app through Microsoft teams inside…please fix it. Everyone facing these. Later after using this for a month there are some issues like 1. After we start recording a video the label is automatically invisible when I am trying to stop . 2. This buttons or selection bar will automatically closes when we touch to open that. 3. Check the issues regarding every buttons in the app pages ..mainly in video call and chatting page.”
Google Meet (3.8 stars, 50K ratings)
“Always disconnect all of sudden. The quality of the video also sometimes can be very bad eventho all of the person in the video conference have a high speed internet at the same time. Also, i had encountered problem where loud noise such as ping sound all of a sudden which forcing me to close the app. If they can fix these, maybe i can consider giving them 5 star as the app already has a good features.”
BlueJeans (4.4 stars, 5.2K ratings)
“Galaxy S10 user here, with a strong 5Ghz WiFi connection and Gigabit fiber. I could not maintain a stable connection, even with only one other participant. Video constantly froze, and occasionally had audio issues as well. We would have had a better experience with just a phone call. To add insult to injury, it drained a significant portion of my battery, while the phone was plugged in! If work requires this, don’t use the mobile version. Heck, just do a conference call.”
Webex (4.3 stars, 419K ratings)
“Really Bad app. Whenever I try to join a meeting, It always shows that meeting has not started. I had to miss important meetings. The audio is really wobbly and the video is not clear. Sometimes the audio ‘s does not come. And whenever I start a meeting,It does not even start. Please fix this issue.”
HeadSpin compared the latest versions of popular VC apps over the past week on WiFi, using a Pixel 3XL located in Mountain View, US and a Galaxy S10 located in New Delhi, India. No SDK is needed to run performance sessions. HeadSpin’s AI engine sifts through network traffic, client-side metrics, and videos of the test execution to find areas of poor user experience and performance bottlenecks. On the HeadSpin platform, recommendations are provided for every issue surfaced. You can collect performance data through HeadSpin Remote Control or run automation tests.
Steps to recreate a typical user journey include:
- Laptop starts the meeting and shares screen
- Both Android phones, in the US and IN, join the meeting and see the screen share
- Video quality is measured by the Mean Opinion Score (MOS) for 30 seconds of video streaming, with each device in the meeting (laptop, US device, IN device) sharing video. We score this on a 1-to-5 scale.
- Battery drain was measured over a 25-minute session on the Pixel 3XL
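As an illustration of the battery step above, drain for a session can be computed from device battery readings taken before and after the run. The snippet below is a hypothetical sketch that parses `adb shell dumpsys battery` output; it is not the HeadSpin API, and the sample readings are made up:

```python
import re

def parse_battery_level(dumpsys_output: str) -> int:
    """Extract the battery level (0-100) from `adb shell dumpsys battery` output."""
    match = re.search(r"level:\s*(\d+)", dumpsys_output)
    if match is None:
        raise ValueError("no battery level found in dumpsys output")
    return int(match.group(1))

def battery_drain(level_before: int, level_after: int) -> int:
    """Percentage points of battery consumed during the session."""
    return level_before - level_after

# Hypothetical readings captured before and after a 25-minute meeting.
before = parse_battery_level("Current Battery Service state:\n  level: 94\n  scale: 100")
after = parse_battery_level("Current Battery Service state:\n  level: 87\n  scale: 100")
print(battery_drain(before, after))  # → 7
```

In practice, each reading would come from running the `dumpsys` command over adb immediately before the meeting starts and again after it ends, with the same charging state on both readings.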
The download speed tested on fast.com is shown below:
Join to First Frame
Webex has the fastest join time for the US device, but the second-slowest for India. Zoom took by far the longest to join on the US device. Meet performs best on the India device, yet doesn’t match the speed WebEx and BlueJeans achieved in the US.
Battery Drain
We ran a 25-minute meeting on a Pixel 3XL in Mountain View and measured battery drain. In this test, the phone viewed a YouTube video being shared from the laptop. Meet had the best performance here.
MOS Score — Video Quality
The MOS, or Mean Opinion Score, is a holistic subjective quality score that represents the perceptual quality of a video as perceived by an end user. The HeadSpin AI engine measures how streaming video quality evolves over the course of a test and flags regions of poor user experience without any reference to the source video content. Not only are full-reference video quality metrics expensive and difficult to maintain, but many rich media applications, such as live video and game streaming, have no reference to compare against. You can learn more about our reference-free MOS here.
With a MOS score of 1 to 5 assigned to each frame of the video, we analyzed 30 seconds of video on both the India and US devices while the laptop shared its screen. Overall, we noticed that average MOS scores were better on iOS than Android.
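To make the per-frame scoring concrete, here is a minimal sketch of how a series of per-frame MOS values might be summarized into an average and a set of flagged poor-quality regions. The scores and the 3.5 threshold are hypothetical illustrations, not HeadSpin’s actual MOS model:

```python
from statistics import mean

def summarize_mos(frame_scores, poor_threshold=3.5):
    """Average per-frame MOS (1-5 scale) and flag frames below a quality threshold."""
    avg = mean(frame_scores)
    poor_frames = [i for i, score in enumerate(frame_scores) if score < poor_threshold]
    return avg, poor_frames

# Hypothetical per-frame MOS values over a short clip.
scores = [4.2, 4.0, 3.1, 4.5, 2.8, 4.4]
avg, poor = summarize_mos(scores)
print(round(avg, 2), poor)  # → 3.83 [2, 4]
```

A real pipeline would score every frame of the 30-second capture and report contiguous runs of flagged frames as regions of poor user experience, rather than individual frame indices.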
US and IN Device MOS score summary over 30 seconds while the laptop shared screen
Meet is seeing the best performance here, in terms of average MOS. BlueJeans and WebEx have an average MOS below 4 on the US device, at 3.69 and 3.84, respectively.
Interestingly, BlueJeans is showing the highest average MOS for the India device. Google Meet has some room for improvement, despite being the best on US. Zoom and Teams did not see much change in their performance.
MOS Score Summary as detected on India device when US device was screen sharing
Due to a bug on Microsoft Teams, we were unable to view the screen share in landscape mode, and as such could not capture MOS accurately. Among the others, Meet saw the best average performance, though BlueJeans showed similar performance with less variation.
MOS Score Summary as detected on US device when India device was screen sharing
Here, all apps are competitive, with Meet and Zoom slightly edging out the others. WebEx comes in last, but remains close.
Overall, Google Meet delivered the best performance. The only time Google Meet struggled compared to its peers was when viewing the laptop’s screen share from the India device. Though Zoom has become synonymous with video conferencing over the past few months, it showed a poor time-to-join on the US device. WebEx has the most room for improvement in video quality, and BlueJeans in energy consumption. The Teams bug meant that even in landscape mode, the screen share was presented in portrait.
The metrics above are just a few of the key performance indicators (KPIs) we looked at. There are other comprehensive video and audio streaming KPIs these apps care about, as well as other dimensions: different devices, OS versions, network conditions, locations, server-side load testing, and more. The HeadSpin platform can automatically diagnose server-side issues that arise from infrastructure deployment, poor performance, or API errors. Tests can run 24/7, thousands per day, on real devices around the world, to ensure your users continue to experience uninterrupted video conferencing.
Reach out to us if you are interested in discussing more with our team!