Mean opinion score (MOS) is a measure of quality of experience, commonly used to profile the quality of audio and video content. HeadSpin's AI engine continuously measures the video MOS of a mobile application's video during a performance session. The following is a screenshot of Video Quality MOS on HeadSpin's waterfall UI timeline, highlighted with a red outline.
Video content appears in many types of mobile applications: live-streamed sports and events, live news, explainer videos, in-game cutscenes, introduction/walkthrough videos, live video communication, downloaded video files, and more. A poor video experience can drag down a user's overall experience, so it is crucial for mobile developers and QA teams to understand how well their video performs on real devices under real network conditions.
HeadSpin’s AI Engine MOS
HeadSpin's mean opinion score ranges from 0 to 4, with 0 being very poor and 4 being excellent, as shown in the following scale.
Score range:
0: very poor
1: poor
2: fair
3: good
4: excellent
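As a rough illustration, a continuous score on this scale can be mapped to its qualitative label. The function below is a sketch with an assumed rounding rule, not part of HeadSpin's API:

```python
def mos_label(score: float) -> str:
    """Map a continuous MOS value on the 0-4 scale to its label.

    Illustrative only: we simply round to the nearest anchor point
    and clamp to the scale; HeadSpin's engine reports the continuous
    score itself.
    """
    labels = ["very poor", "poor", "fair", "good", "excellent"]
    anchor = min(max(round(score), 0), 4)
    return labels[anchor]
```

Under this rounding rule, a score of 3.54 lands in the "excellent" band, while a score of 0.57 falls to "poor".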
iPhone X Video Examples

MOS: 3.54 [YouTube video]
Comments: Although the Chinese characters and the QR code are not crystal clear, no visible blockiness can be observed.

MOS: 2 [NBA video with blur and blockiness]
Comments: Notice that the player in the video screen capture is blocky and the scoreboard is blurry.

MOS: 0.57 [YouTube video with severe blockiness]
Comments: Notice that the video is so blocky that the person's face is indistinguishable.
Improve your video and delight your customers
On real devices and real networks, many factors can affect the delivery of video content through mobile applications, including:
The device model and OS version running the mobile application
Network bandwidth, latency, jitter, and packet loss
Misconfiguration of the content delivery network (CDN)
Client-side playback issues in the mobile application
A corrupted video download
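Several of these network factors can be quantified directly from captured traffic. As one example, interarrival jitter can be estimated from per-packet transit times using the smoothed estimator described in RFC 3550; the function below is a sketch, not HeadSpin's implementation:

```python
def interarrival_jitter(transit_times_ms: list) -> float:
    """Smoothed interarrival jitter estimate per RFC 3550.

    transit_times_ms: per-packet transit times (arrival timestamp
    minus send timestamp) in milliseconds. Each absolute difference
    between consecutive transit times nudges the running estimate
    by 1/16, per the RFC's formula.
    """
    jitter = 0.0
    for prev, cur in zip(transit_times_ms, transit_times_ms[1:]):
        d = abs(cur - prev)
        jitter += (d - jitter) / 16.0
    return jitter
```

A perfectly steady stream (constant transit times) yields zero jitter; any variation in packet spacing pushes the estimate up.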
Using the HeadSpin Platform's video mean opinion score (MOS) and network performance capture, developers and QA teams can diagnose and resolve these issues and improve the mobile video viewing experience, ultimately delighting customers.
FAQs
1. Which key performance indicators (KPIs) for video are measured during the stream test?
Ans: Launch time, load time, stalled time, stalled count, total video playtime, video resolution, and video performance rating.
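The stall-related KPIs in that list can be derived from a timeline of playback-state events. Below is a minimal sketch; the event format and function name are assumptions for illustration, not the HeadSpin API:

```python
def stall_stats(events):
    """Compute (stall_count, total_stall_seconds) from a list of
    (timestamp_s, state) events, where state is 'playing' or
    'stalled'.

    Illustrative: assumes events are time-ordered and each event's
    state holds until the next event's timestamp.
    """
    count = 0
    total = 0.0
    prev_state = None
    for (t0, state), (t1, _next_state) in zip(events, events[1:]):
        if state == "stalled":
            total += t1 - t0
            if prev_state != "stalled":
                count += 1  # new stall interval begins
        prev_state = state
    return count, total
```

For a session that stalls from 5.0-6.5 s and again from 10.0-10.5 s, this returns a stalled count of 2 and a total stalled time of 2.0 seconds.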
2. What is web real-time communication (WebRTC)?
Ans: Web real-time communication (WebRTC) is an open-source project that enables real-time voice, text, and video communications between web browsers and devices.
3. Which metrics are shown in the Waterfall UI of the HeadSpin Platform while testing a video?
Ans: The metrics available in the Waterfall UI are Blockiness, Blurriness, Contrast, Colorfulness, and Brightness, which show how video quality evolves alongside the network traffic, device metrics, and the video recorded from the device screen.
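As a rough illustration of how a metric like blurriness can be computed, one common proxy is the variance of a frame's Laplacian: blurred frames have weaker edges and therefore lower variance. The sketch below uses a plain NumPy stencil and is a stand-in for illustration, not HeadSpin's actual metric:

```python
import numpy as np

def laplacian_variance(frame: np.ndarray) -> float:
    """Blurriness proxy: variance of the 3x3 Laplacian of a
    grayscale frame. Lower values indicate a blurrier frame.
    """
    interior = frame[1:-1, 1:-1]
    # 4-neighbor Laplacian stencil applied to the frame interior
    lap = (
        frame[:-2, 1:-1] + frame[2:, 1:-1]
        + frame[1:-1, :-2] + frame[1:-1, 2:]
        - 4.0 * interior
    )
    return float(lap.var())
```

Blurring a frame (for example, with a box filter) reliably lowers this score relative to the original, which is what makes it usable as a per-frame blurriness signal.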
4. Why is the content delivery network (CDN) important for video streaming?
Ans: A content delivery network (CDN) is vital for video streaming because its edge servers sit closer to viewers than the origin server. Serving the stream from the cache cuts down the round-trip time (RTT) to and from the origin, and using a CDN also reduces the chance that bandwidth bottlenecks will slow down the live stream for viewers.
Leverage the all-in-one AV platform to securely test audio and video quality, including DRM content.
This HeadSpin solution enables:
Testing video apps across media, entertainment, gaming, and video conferencing
Running tests on OTT media devices
Testing DRM-protected content
Capturing QoE and streaming performance KPIs
Running and recording tests easily with smart TV remote control