
Waterfall UI

Overview

The Waterfall UI is part of HeadSpin's proprietary data capture toolset, designed to help you dig deep into the wide range of data produced by your testing efforts and synthesize that information into actionable development goals and tasks. Our comprehensive suite of metrics helps you set baseline numbers, refine your tests, and, most importantly, identify and flag problem areas and bottlenecks in the applications you are developing.

You can think of the data in the Waterfall UI as a living document: you can interact with it, add to it, ask questions of it, and extrapolate from it. All of this can be done at any time once a session capture has been completed.

Getting Started Using the Waterfall UI

The Waterfall UI gathers and prepares data based on captured sessions, or performance sessions, that record your testing experience. For more information on the process of capturing a session, visit our documentation on Remote Functional Testing here. For the purposes of this document, it is enough to know that capturing a session requires three things:

  1. A HeadSpin user account
  2. Access to at least one device
  3. A version of your app under test

From the Devices List in the Remote Control tab, click the Start button beside your target device to begin a session on it. To start recording/capturing the session, click the Start Recording Session button in the upper-right region of the remote control window. Depending on what kind of test you are conducting, you may want to ensure your app under test is already installed, or you may want to install it using the Remote Control toolkit. For more info on that, see our App Management documentation.
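If you script your tests, captures can also be started and stopped programmatically instead of through the remote control window. The sketch below is illustrative only: the endpoint paths, payload fields, and device address format are assumptions based on typical usage of the performance sessions API, so verify them against the API documentation for your deployment.

```python
# Illustrative sketch: start and stop a capture session via the API.
# The base URL, endpoint paths, and payload fields are assumptions;
# consult the HeadSpin API documentation for the exact contract.
import requests

API_TOKEN = "your-api-token"             # hypothetical placeholder
BASE = "https://api-dev.headspin.io/v0"  # assumed API base URL
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

# Start a capture on a specific device (address format is assumed).
resp = requests.post(
    f"{BASE}/sessions",
    headers=HEADERS,
    json={"session_type": "capture", "device_address": "<device_address>"},
)
session_id = resp.json()["session_id"]

# ... drive your app on the device here ...

# Stop the capture; the session then becomes available for analysis.
requests.patch(
    f"{BASE}/sessions/{session_id}",
    headers=HEADERS,
    json={"active": False},
)
```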

Once you have completed a session capture, HeadSpin will automatically present you with a quick navigational pop-up that can take you to either the Waterfall UI or Burst UI pages for your session.

You can also access the Waterfall UI from the Performance Sessions tab (also called the ‘P’ tab). Clicking this tab brings you to the Performance Sessions List, a list of all sessions undertaken within an organization. You can search for sessions within a time frame or tagged with a specific tag key, and the list view also displays basic information about each session, such as its location, the device used, and the number of issues detected. To open a session’s Waterfall UI content, click the Waterfall UI button beside your selected performance session (the button is located under the Data Analysis UI heading).


Note that anything you can do with the user interface you can also do using our API. You can learn more about that here.
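As a small taste of what that looks like, the script below fetches the list of recent performance sessions that backs the Performance Sessions tab. This is a minimal sketch: the endpoint path, query parameter, and response fields are assumptions, so check the API reference for the exact names.

```python
# Minimal sketch: list recent performance sessions for your org.
# Endpoint path, parameters, and response fields are assumptions.
import requests

API_TOKEN = "your-api-token"  # hypothetical placeholder
resp = requests.get(
    "https://api-dev.headspin.io/v0/sessions",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    params={"num_sessions": 10},
)
for session in resp.json().get("sessions", []):
    print(session.get("session_id"), session.get("state"))
```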

The Waterfall UI essentially breaks down into seven panes or sections of the screen. In approximate order from left to right and top to bottom, those panes are the following:

  1. Title Pane
  2. Metrics Pane/Graphs
  3. Screen Recording Pane
  4. Timeseries Pane
  5. Session Labels Pane/Network Traffic Timeline
  6. Details Pane
  7. Issue Cards Pane

We will go over all of these sections in detail below.

Waterfall UI Areas

Title Pane (information palette)

[Image: Title pane]

The title pane appears in the upper-left corner of the Waterfall UI window. This is where you can find general information about the session you are viewing. In this pane you can modify the title and description of your session (this will also be reflected in the Sessions List). You can add the session to a user flow, which is a manual or automated user journey (or group of journeys) focused on a particular user intention or facet of user experience. Since user flows are designed to help you group and follow tests performed multiple times, they can be hugely helpful for tracking and refining issues for improved ROI. You can add a session to an existing user flow or create a new one from the User Flow pop-up pane.

[Image: User flow]

If your session is included in a user flow, the user flow’s name will appear here; you can also assign the session a status to reflect its position among the user flow’s collective sessions: whether the test in this session passed, failed, or is excluded from conclusions regarding your test’s pass/fail requirements.

The title pane also includes information about the device used during the session, including the device’s location, its unique ID (composed of the device ID and host ID), the network connection type and provider if one is attached to the device, and any tags added to the session. From here you can download various collections of data, including primary data as well as timeseries data (explained below). The file type in which the available data will be downloaded is included in the filename displayed in the list.

You can also add tags to a session in this pane. Tags are custom key/value pairs that you define; a tag must contain both a key and a value to be valid. Tags will also appear within the session’s information listed in the Sessions List.
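Tags can also be attached outside the UI. The sketch below assumes a tag endpoint that accepts a list of key/value pairs; the path and payload shape are hypothetical, so confirm them in the API documentation.

```python
# Hypothetical sketch: attach key/value tags to an existing session.
# The endpoint path and payload shape are assumptions, not a documented API.
import requests

API_TOKEN = "your-api-token"  # hypothetical placeholder
SESSION_ID = "<session_id>"   # the session to tag

requests.post(
    f"https://api-dev.headspin.io/v0/sessions/tags/{SESSION_ID}",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    # Each tag must contain both a key and a value to be valid.
    json=[{"key": "build", "value": "1.4.2"}],
)
```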

You may also see an ‘Analysis in Progress’ indicator, particularly if you navigated to the Waterfall UI immediately after completing a session recording that contains significant video data. This indicator lets you know that your session data is still processing and uploading to the UI, although you can still interact with the data in the Waterfall UI while this message is displayed.

Metrics Pane/Graphs (metrics graphs)

[Image: metrics-pane]

The metrics pane takes up most of the upper-center of the Waterfall UI; this area displays breakdowns of network traffic, specifically HTTP traffic, in four categories. These categories are shown as pie charts, and include the following:

  • Session-wide Metrics - This chart includes all network traffic within the session.
  • Domain Metrics - This chart covers metrics for each domain within the session; depending on the steps of your test, you may have only one or two domains or several. For example, traveling to several different web pages may mean lots of different domain interactions. The dropdown window in the upper-left corner of this metrics section allows you to select which domain’s data/charts are on display.
  • Host Metrics - This chart includes data for each host or subdomain in the session. The dropdown window in the upper-left corner of this metrics section allows you to select which host’s data/charts are on display.
  • Burst Metrics - A burst is a network transmission concept: during a burst, network activity increases sharply within a small window of time. Mobile traffic sees frequent bursts, since several network connections may be established and several network calls made just to perform a single step in an application. Often a mobile app will produce at least one network burst to establish a connection with a PoP or WiFi signal, and depending on the app’s network usage there may be additional bursts. Bursts are a common area of interest for mobile app issues, so this data can drive performance optimizations such as reducing battery drain or network lag in an app that makes multiple large-content network transactions.

All categories display the traffic content chart by default; this chart shows how the chart’s total data is divided among content types (for example, how much is JSON, how much is images, how much is packet headers, and so on). You can click the ‘More’ tab at the bottom of the pane to expand all metric categories, revealing additional pie charts containing more specific data. To return the metrics pane to its default display, click the ‘Less’ button (this replaces the ‘More’ button when the pie charts are expanded). Also note the question mark icon in the upper-right corner of the metrics pane; click it for more in-depth explanations and helpful tips regarding the metrics pane and its contents.

Screen Recording Pane

[Image: screen-recording-pane]

The screen recording pane is located in the upper-right area of the Waterfall UI window. It displays a second-by-second recording of what happened on the device screen during your session. Its contents are precisely tied to the timelines in the Timeseries Pane: as you scroll through any given metric’s timeline, the screen recording updates in real time to synchronize with the position/highlighted time of your cursor.

For your convenience, this pane can be moved in various ways. You can pop it out of the UI to give it its own window, which can be resized or rotated (often useful for apps transitioning between portrait and landscape modes on a mobile device). If you pop the screen recording back into the UI, it will be fixed to the upper-right corner again.

[Image: pop-out]
An example of how the screen appears when the screen recording pane is popped out.

Timeseries Pane

timeseries

The timeseries pane is, arguably, the most important and dense section of the Waterfall UI. The goal of this pane is to illuminate, in an interactive visual form, how a variety of metrics are evolving throughout the entirety of your test session. The timeseries section displays these metrics as a collection of individual timelines, each tracking a different metric. There are quite a number of metrics tracked in the timeseries pane; you can find more information on each of them in the Timeseries Metrics section. Note that all timeseries data is rendered on a linear scale.

Which metrics appear within your session analysis depends largely on how your organization's capture configuration is set up. Some metrics require permission to be enabled, which can only be granted by an admin for your org. Others require a specific physical configuration of a device or may not be available for your selected devices (e.g., audio setup on a device, limitations due to mobile OS version). The Timeseries Metrics section is an overview rather than an exhaustive list; if there is a metric you wish to measure in your captured session that you do not see listed there or in your Waterfall UI, we encourage you to work with your organization's admins and your HeadSpin administrator to customize your options in the way that works best for you.

Data for each timeline is displayed within a bar, with instances of the measured metric appearing as a line within that bar. To interpret a timeline, scroll along the data line or hover your cursor over a particular time point to view which instances of the metric (if any) occurred at that point in time. On the right side of each timeline, the maximum possible value for the metric is displayed, along with its units of measurement.

The timeseries pane has a highly customizable view; click the ‘Collapse Timeseries’ button to shrink all timelines down (a timeline measurement bar will remain to allow tracking of the labels data, described below, but all timelines will disappear). Click the same button again, now labeled ‘Time Series Available’, to restore all timelines. You can also click the dropdown arrows contained within the Time Series Available button to select specific timelines to display. You can also click the white ‘X’ icon in the upper-right corner of the details pane to shift the details and screen recording panes and minimize them on the side of your screen, providing more space for the timeseries section to expand.

[Image: collapse]
A view of the default appearance of the timeseries pane.
[Image: select-timeseries]
A view of the timeseries pane collapsed, revealing the session labels/network traffic timelines. This image also shows the dropdown menu for selecting specific timeseries for display.

You can restore the details and screen recording panes by clicking on the small slice of them left in the minimized view. You can also modify your view of a specific timeline by utilizing the zoom controls; zoom in or out using the zoom buttons in the toolbar at the top of the timeseries pane, or use the fit button to reset the timeline to a view of the entire session (starting from 0 seconds on the left and going through the end of the session on the right). You can also double-click on the timeline to zoom in. If your hardware allows for it (such as with a laptop trackpad), you can scroll horizontally through a zoomed-in timeline to see the entire timeline’s contents despite the zoomed-in view. You can also hold the Shift key and then click and drag to highlight a specific region of a timeline and zoom in on it.

Within any timeline, you may see an orange highlight above the data line.

[Image: impact]

These highlights show an impact incident, a HeadSpin-generated flag for a possible issue you may want to investigate in your session. Impact time is a common metric we use across all issues; we project it onto session timelines to show you where and for how long a possible issue occurred. Impact time is always some portion of session time: the longer an issue persists, the bigger the impact time (and therefore the larger the orange highlight). An impact flag always has an associated issue card, found in the issue cards pane to the left; you can identify the issue card by matching its title to the timeline in which you found the impact flag.

[Image: impact-highlight]

Note that the timeseries pane also has a question mark icon in the upper-right corner that will take you to a help window with additional details about the pane’s contents.

Session Labels Pane/Network Traffic Timeline

[Image: labels]

The session labels and network traffic timelines may at first appear to be a continuation of the timeseries pane; in a way, that’s true! Labels and network traffic still utilize the timeline measurement and tie into the screen recording view, but this section provides different kinds of data. (Note: the easiest way to view this section is to collapse the timeseries pane above it.)

Session labels are intended to provide additional context for events in your session’s timeline. During session capture analysis, HeadSpin automatically parses some events and creates session labels to make it easier for you to locate and identify certain events. Labeled events appear in one of two ways on the timeline: instantaneous, represented by a dot at the specific moment of the event, and continuous, represented by a horizontal bar extending over the duration of the event. Common session labels include foreground activity, foreground application, and log warnings/exceptions. HeadSpin will also track and label sent and received Appium/Selenium commands so you can see exactly where you are in your automation down to the second.

You can also create your own labels by clicking and dragging on a section of the timeline area of the session labels section; this will trigger the Create Label pop-up pane. The Create Label window will automatically fill in any information it can from the section of timeline you highlighted, but you can fill in or change the information yourself to create your own custom label. To create a label that will describe an instantaneous event, use the exact same timestamp value in both the start and end time boxes. Make sure to click the ‘Submit’ button at the bottom of the Create Label window for your custom label to be saved.

[Image: create-label]
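Labels can likewise be created programmatically. In the sketch below, the endpoint path and field names are assumptions; the one behavior mirrored from the UI is that an instantaneous label uses the same timestamp for both its start and end times.

```python
# Hypothetical sketch: create session labels programmatically.
# Endpoint path and field names are assumptions; check the API reference.
import requests

API_TOKEN = "your-api-token"  # hypothetical placeholder
SESSION_ID = "<session_id>"
URL = f"https://api-dev.headspin.io/v0/sessions/{SESSION_ID}/label"
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

# A continuous label spans a duration (seconds from session start).
requests.post(URL, headers=HEADERS, json={
    "name": "checkout flow",
    "start_time": 12.0,
    "end_time": 18.5,
})

# An instantaneous label uses the same timestamp for start and end,
# matching the behavior described for the Create Label window.
requests.post(URL, headers=HEADERS, json={
    "name": "tap: Buy button",
    "start_time": 12.0,
    "end_time": 12.0,
})
```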

The network traffic timeline sits beneath the session labels list, and it is exactly what it sounds like: a timeline graph of the network traffic during the session, with each event listed and labeled separately (you may have to zoom in quite far to see the different sections of a single network transmission). Traffic is denoted using colored bars; these colors correspond to the colors used in the network traffic pie charts in the metrics pane discussed earlier. These blocks of color are grouped into connections, which are organized by hostname. HeadSpin captures network data using a man-in-the-middle proxy, covering DNS, TCP, TLS, and HTTP traffic along with WebSocket and UDP traffic. Note that UDP traffic, due to its nature and volume, is not displayed in the Waterfall UI, but it can be seen in a session’s PCAP file, which can be accessed or downloaded using our API.
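As a sketch of what retrieving that PCAP might look like (the exact path is an assumption; consult the API documentation for the real download endpoint):

```python
# Illustrative sketch: download a session's PCAP file for offline analysis.
# The ".../network.pcap" path is an assumption; verify it in the API docs.
import requests

API_TOKEN = "your-api-token"  # hypothetical placeholder
SESSION_ID = "<session_id>"

resp = requests.get(
    f"https://api-dev.headspin.io/v0/sessions/{SESSION_ID}/network.pcap",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)
with open("session.pcap", "wb") as f:
    f.write(resp.content)  # open in a packet analyzer to inspect UDP traffic
```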

Within the network traffic timeline you can highlight all traffic from a particular URL, port, or DNS lookup by clicking on an event. Any other events that share that URL, port, or domain will also be highlighted, while the rest of the network traffic is faded out. You can exit any view of specific events by clicking the white ‘X’ icon in the grey box to the upper-right of the session labels/network timeline pane, which says ‘Currently viewing: [event name]’.

[Image: network]
A view of the default appearance of the network traffic pane.
[Image: selected-network]
A view of the network traffic pane with a specific DNS call selected, with similar domains also isolated and highlighted.

Any session label or network event, when clicked, will load its details into the details pane, located in the bottom-right corner of the Waterfall UI.

[Image: network-highlighted]

In the zoom toolbar, there are two buttons specifically for use in this area: Group by Category and Group by Host, which relate to the session labels and network traffic timelines respectively. Both buttons are active by default. Clicking the Group by Category button will deselect it and collapse the labels for session events; clicking the Group by Host button will do the same for network event labels. The events remain on the timeline and can still be interacted with (their details will appear in the details pane when clicked), but the labels and sorting of the events will be removed.

[Image: before-category]
A view of the default appearance of the session labels and network traffic pane, with the session's grouped labels circled.
[Image: after-category]
A view of the session labels pane after the group by category option is deselected.

Details Pane

[Image: details]

The details pane, as previously stated, is located in the bottom-right corner of the Waterfall UI, beneath the screen recording pane. In this pane, specific details for any individual event in your session timeline are displayed. Details for any event are loaded into this pane in a self-contained box format, rather like an issue card. If you click an event and load its details, then click a different event, the first event’s details will simply be pushed down the details pane’s column, rather than replaced entirely. In this way you can have multiple events’ details loaded simultaneously and you can view them simply by scrolling up and down the details pane. An event can be removed from the column by clicking the ‘X’ icon in the upper-right corner of an event’s details box. You can also click the ‘Clear’ button to remove all details cards from the pane. If you hover over one event’s details, the associated event will be highlighted in the timeline display.

[Image: details-hover]

The details pane includes a variety of information depending on the kind of event currently selected. Network traffic details may include things like the connection type, the number of calls and responses, and the DNS lookup time. Session label details may include things like the event type, category, and start time. Every session label includes a unique Label ID in its details card; this Label ID can be used to query the data of this specific event via the API or other tools to gain further insights.
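For instance, you might pull a session's labels and pick out one by its Label ID. This is a rough sketch: the listing endpoint and field names are assumptions, not the documented API.

```python
# Rough sketch: look up one label's data by its Label ID.
# The listing endpoint and field names are assumptions.
import requests

API_TOKEN = "your-api-token"  # hypothetical placeholder
SESSION_ID = "<session_id>"
LABEL_ID = "<label_id>"       # copied from the label's details card

resp = requests.get(
    f"https://api-dev.headspin.io/v0/sessions/{SESSION_ID}/labels",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)
label = next(
    (lbl for lbl in resp.json().get("labels", []) if lbl.get("id") == LABEL_ID),
    None,
)
print(label)
```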

[Image: details-compare]

Issue Cards Pane

[Image: issues-pane]

The final information pane in the Waterfall UI is the issue cards pane (or issues pane), located in the bottom-left section of the UI beneath the title pane (you may have to scroll down to see it). Issue cards are a value-added analysis HeadSpin provides: they draw not only on metric data but also on user experience and performance research to identify known UX issues so they can be resolved early.

Issues are always based on a performance benchmark expectation. Our benchmark standards are set using studies published by major companies in the IT research space, such as Google. Specific benchmarks vary depending on the issue and its typical metrics, but the bottom line is always a calculation of how much additional time an issue added to the session, prolonging the user’s experience of the app in a negative way.

The issues pane displays your session’s impact score at the very top of the pane. This score is calculated from the session’s total impact time and the session’s total duration. Beneath the overall score is a card for each potential issue flagged during your session. Issue cards are prioritized by impact time, with the highest impact time at the top and the rest listed in descending order. Each card displays the total amount of time the issue occurred during the session, the number of times it occurred, and a fast navigation bar indicating each separate instance of the issue, with the highest-impact instances first (to the left) and descending along the bar to the right. Clicking a specific issue instance within the navigation bar will select and highlight it within the timeline section.

[Image: issue-highlight]
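The section above names the inputs to the impact score (total impact time and total session duration) but not the exact formula. As a back-of-the-envelope reading only, you can think of the score as tracking the fraction of the session that was impacted:

```python
# Back-of-the-envelope sketch only: the exact impact score formula is not
# specified here; this assumes the score tracks the impacted fraction of
# the session. Both input values below are hypothetical.
total_impact_time_s = 12.4  # summed impact time across issue instances
session_duration_s = 95.0   # total session length

impacted_fraction = total_impact_time_s / session_duration_s
print(f"{impacted_fraction:.1%} of the session was impacted")  # 13.1%
```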

Each issue card contains a description of the issue and suggestions for how to debug the issue. Clicking on the issue card will expand it, revealing the View Table and .csv buttons. The former will trigger a pop-up window containing a table of raw data for that issue. The latter will automatically begin a download of the issue card’s data in .csv format.

[Image: issue-buttons]
[Image: table]

It’s important to note that not all instances within an issue card may indicate a true problem with your app’s user experience. These are suggestions and candidates for possible issues HeadSpin identified using common metric benchmarks; depending on your app’s design and purpose, these benchmarks may not be appropriate for your application’s troubleshooting goals. Keep this in mind while you debug!

Remember that the issue cards pane also has a question mark icon in its upper-right corner. Click it to find more information about the pane's contents.

Timeseries Metrics

The following is a list of the metrics viewable in the timeseries pane, in order of appearance, along with a brief description of the metric.

  • Impact Time - The estimated performance impact of active issue cards affecting a given time.
  • Impact Count - The number of active HeadSpin issue cards affecting a given time.
  • Video Quality MOS - A mean opinion score quantifying the subjective perception of video content as a single number, computed by averaging the subjective scores of a large number of users (accounting for user bias).
  • Blurriness - Reference-free measure of blurriness in video frames, measuring the spread of an image’s edges.
  • Blockiness - Reference-free measure of blockiness in video frames, measuring the intensity of the periodic appearance of block structure in an image.
  • Contrast - Reference-free measure of contrast in video frames, measuring the perceptual level of contrast of a frame. Variation in the level of contrast can indicate issues in the original video recording.
  • Brightness - Reference-free measure of brightness in video frames, measuring the perceptual level of brightness of a frame. Low levels of brightness, particularly when coupled with low levels of contrast, can affect the perception of video quality.
  • Colorfulness - Reference-free measure of colorfulness or "chroma" in video frames, measuring the amount of color usage in a frame.
  • Downsampling Index - Reference-free measure of downsampling in video frames, measuring the fraction of pixels used to convey information on small scales. Video that has been degraded (e.g., due to downsampling) will in general have a higher Downsampling Index, while high-fidelity video will have a low Downsampling Index.
  • Page Content - The amount of information displayed on the page. At any point in time, this is the average information entropy of the image intensity in a video frame, in bytes.
  • Frame Rate - The frame rate as measured from the device.
  • Screen Change - The amount of visual change between video frames.
  • Screen Rotation - The orientation of the screen.
  • Audio Volume (Mono) - Root-mean-square energy of the audio signal as a measure of volume. Applicable only if the session video contains one audio channel (mono).
  • Audio Volume (Left) - Root-mean-square energy of the audio signal in the left channel as a measure of volume. Applicable only if the session video contains two audio channels (stereo).
  • Audio Volume (Right) - Root-mean-square energy of the audio signal in the right channel as a measure of volume. Applicable only if the session video contains two audio channels (stereo).
  • Audio Momentary Loudness - Perceptual loudness of the audio signal as defined in ITU-R BS.1770-4. Applicable only if the session video contains audio (mono or stereo). Momentary loudness (in LUFS) provides a loudness measurement for every sliding window of 400 ms.
  • Concurrency - The number of simultaneous request/response blocks.
  • Connections - The number of open TCP connections.
  • Download Speed - The total rate of data transfer from the cell network to the cell PoP (point of presence). This metric measures only the download speeds of receive sections in concurrent request/response blocks.
  • HTTP Throughput - The total rate of data transfer to and from the cell PoP.
  • Network In Bytes - The number of bytes received from the network.
  • Network Out Bytes - The number of bytes sent out on the network.
  • Network In Packets - The number of packets received from the network.
  • Network Out Packets - The number of packets sent out on the network.
  • Network In Total Bytes - Cumulative number of bytes received from the network.
  • Network Out Total Bytes - Cumulative number of bytes sent out on the network.
  • Application CPU Usage - The CPU utilization, as a percentage, of the application under test on the host device. The value is normalized by the number of CPU threads that can run the program on the host device; 100% indicates that the application and its child processes occupy all available CPU resources.
  • Net CPU - The net CPU usage on the device; this value is system-wide CPU utilization as a percentage.
  • Single Thread CPU - The usage of a single thread on the device. The value is the CPU utilization of the thread with the highest utilization at that time.
  • Net CPU Frequency - The net CPU frequency in MHz on the device.
  • Single Thread CPU Frequency - The CPU frequency of a single thread in MHz on the device.
  • Application Memory Used - The memory used by processes corresponding to the application under test on the host device.
  • Application Memory Used Percent - The percentage of Application Memory Used. The value is normalized; 100% means the application processes use all available memory on the device.
  • Memory Used - The amount of memory used on the device.
  • Memory Used Percent - The percentage of memory used on the device.
  • I/O Read Syscalls - The number of read I/O operations, via syscalls such as read() and pread().
  • I/O Bytes Read - The number of bytes fetched from the storage layer.
  • I/O Write Syscalls - The number of write I/O operations, via syscalls such as write() and pwrite().
  • I/O Bytes Written - The number of bytes sent to the storage layer.
  • Battery Temperature - The temperature of the device battery in degrees Celsius.
  • Battery Voltage - The voltage of the device battery in volts (V).
  • Battery Current - The electrical current discharging from the device battery in milliamperes (mA).
  • Battery Energy Drain - The battery energy drain in millijoules (mJ) since capture start.
  • Battery Energy Drain Percent - The percentage of battery energy drained on the device since capture start.
  • Janky Frame Count - The number of dropped or delayed frames.
  • High Input Latency - The number of input events that took more than 24 ms.
  • Slow UI Thread - The number of times the UI thread took more than 8 ms to complete.
  • Frame Deadline Missed - The number of frames that missed the 16 ms deadline.
  • Total Views - The number of Views in the layout.
  • Frame Render Time p50 - Median frame render time (ms).
  • Frame Render Time p90 - 90th percentile frame render time (ms).
  • Frame Render Time p95 - 95th percentile frame render time (ms).
  • Frame Render Time p99 - 99th percentile frame render time (ms).
  • GSM RSSI - The GSM (Global System for Mobile Communications) received signal strength indication, rescaled to the range 0 to 1. A higher value means a stronger signal.
  • GSM Bit Error Rate - The bits that have errors, as a percentage of the total number of bits received. This quantity is reported by the device at 8 discrete levels, the lowest being 0.14% and the highest being 18.1%.
  • CDMA RSSI - The CDMA (Code-Division Multiple Access) received signal strength indication, rescaled to the range 0 to 1. A higher value means a stronger signal.
  • CDMA EC/IO - The quality and cleanliness of the CDMA signal, rescaled to the range 0 to 1. A higher value means higher quality.
  • EVDO RSSI - The EVDO (Evolution-Data Optimized) received signal strength indication, rescaled to the range 0 to 1. A higher value means a stronger signal.
  • EVDO EC/IO - The quality and cleanliness of the EVDO signal, rescaled to the range 0 to 1. A higher value means higher quality.
  • EVDO SNR - The EVDO signal-to-noise ratio, rescaled to the range 0 to 1; this range corresponds to device-reported values of 0 and 8, respectively. A higher value means a higher signal-to-noise ratio, and a value of 1 corresponds to the highest signal-to-noise ratio the device can report.
  • LTE RSSI - The LTE (Long-Term Evolution) received signal strength indication, rescaled to the range 0 to 1. A higher value means a stronger signal.
  • LTE RSRP - The LTE reference signal received power, rescaled to the range 0 to 1. A higher value means a stronger signal.
  • LTE RSRQ - The LTE reference signal received quality, rescaled to the range 0 to 1. A higher value means higher quality.
  • LTE RSSNR - The LTE reference signal signal-to-noise ratio, rescaled to the range 0 to 1. A higher value means a higher signal-to-noise ratio, and a value of 1 corresponds to the highest signal-to-noise ratio the device can report.
  • LTE CQI - The current LTE channel quality indicator, rescaled to the range 0 to 1. A higher value means higher quality.
  • WCDMA Signal Strength - The WCDMA received signal strength indication, rescaled to the range 0 to 1. A higher value means a stronger signal.
  • 5G NR CSI-RSRP - The 5G NR channel state information reference signal received power, rescaled to the range 0 to 1. A higher value means a stronger signal.
  • 5G NR CSI-RSRQ - The 5G NR channel state information reference signal received quality, rescaled to the range 0 to 1. A higher value means higher quality.
  • 5G NR CSI-SINR - The 5G NR channel state information signal-to-interference-plus-noise ratio, rescaled to the range 0 to 1. A higher value means a higher signal-to-noise ratio.
  • 5G NR SS-RSRP - The 5G NR synchronization signal reference signal received power, rescaled to the range 0 to 1. A higher value means a stronger signal.
  • 5G NR SS-RSRQ - The 5G NR synchronization signal reference signal received quality, rescaled to the range 0 to 1. A higher value means higher quality.
  • 5G NR SS-SINR - The 5G NR synchronization signal signal-to-interference-plus-noise ratio, rescaled to the range 0 to 1. A higher value means a higher signal-to-noise ratio.
  • WiFi Signal Strength - The WiFi received signal strength indication of the device, rescaled to the range 0 to 1. A higher value means a stronger signal.
  • Latitude - The device's latitude in degrees.
  • Longitude - The device's longitude in degrees.
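Many of the radio metrics above are described as "rescaled to the range 0 to 1." As an illustration of what such a linear rescaling means (the raw bounds below are hypothetical; the actual bounds depend on what each device reports for each metric):

```python
# Illustration of a linear "rescale to 0..1" as used by the radio metrics
# above. The raw bounds are hypothetical; real bounds are device-dependent.
def rescale(value: float, lo: float, hi: float) -> float:
    """Clamp a raw reading to [lo, hi] and map it linearly onto [0, 1]."""
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

# Example: an LTE RSRP reading of -100 dBm against assumed bounds of
# -140 dBm (weakest) to -44 dBm (strongest).
print(rescale(-100.0, -140.0, -44.0))  # ~0.42, a middling signal strength
```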

Conclusion

As you’ve seen, the Waterfall UI contains a huge amount of information to help you leverage your tests for better business. It’s a lot to dive into, so hopefully this guide has given you some assistance in navigating and understanding the tools at your disposal. If you have any questions, you’re always welcome to reach out to your HeadSpin administrator for more help.