HeadSpin Documentation

Performance Monitoring UI

Whereas the HeadSpin Waterfall UI and Burst UI let you explore the data and analysis results for a single HeadSpin session, the HeadSpin Performance Monitoring UI lets you monitor key session measurements over time as new sessions are recorded. Measurements for related sessions are grouped into a User Flow, and the Performance Monitoring UI lets you explore all the measurement time series in a User Flow.

A measurement is a single value extracted from a session's recorded data, analysis results, or another aspect of a session. One example of a built-in measurement is "Average Wait", which represents the average wait time across all HTTP exchanges in the session. Using HeadSpin Performance Monitoring, one can monitor how measurements like Average Wait change over time as new sessions are recorded and inserted into the User Flow.
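The idea can be sketched in a few lines of Python: a measurement like Average Wait reduces a session's recorded data to one value, and each new session contributes one point to the User Flow's time series. The function and data shapes below are illustrative only, not HeadSpin code or its data model.

```python
from statistics import mean

def average_wait(wait_times_ms):
    """Reduce a session's per-HTTP-exchange wait times to a single measurement."""
    return mean(wait_times_ms)

# Each recorded session contributes one point to the User Flow time series.
recorded_sessions = ([120.0, 80.0, 100.0], [90.0, 110.0])  # illustrative data
time_series = [average_wait(waits) for waits in recorded_sessions]

print(time_series)  # one "Average Wait" value per session
```

As more sessions are recorded and added to the User Flow, the series grows, which is what the trend graph in the Performance Monitoring UI visualizes.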

The Performance Monitoring UI

The Performance Monitoring UI is accessed from the Performance Monitoring icon in the sidebar:

Performance Monitoring UI

Clicking the icon opens a list of Performance Monitoring User Flows. To create a new user flow, click the "+ New User Flow" button in the top right. Clicking a user flow in the list opens the Performance Monitoring UI for that user flow. To view the inline help, which explains the UI visualization and controls, click the help icon in the bottom left of the sidebar:

Help Icon in HeadSpin

The Performance Monitoring UI renders a measurement trend graph in which each dot represents a measurement extracted from an individual HeadSpin session, recorded either in the Remote Control UI or through automation. The orange line represents the currently selected measurement ("Total TLS" in the screenshot below) extracted from sessions recorded on the selected device ("MOTOROLA XT1053" in the screenshot below). The gray dots represent the selected measurement extracted from all sessions recorded at the "Locations to Compare Against". This makes it easy to visually compare the performance of sessions recorded on a particular device against other sessions in the User Flow.

comparing the performance of sessions recorded

Clicking on a dot opens a menu with a link to the session from which the measurement was extracted:

Go to session

Attaching Sessions to User Flows

This section explains how to attach a HeadSpin session to a user flow from the UI. For equivalent API functionality, see Performance Monitoring API.

To manually attach a session to a User Flow from the HeadSpin UI, open the session in the Waterfall UI or Burst UI and click the "Add to User Flow" button in the top left:

Adding sessions to user flow

From the dialog, you can choose to add this session to an existing user flow by selecting from the dropdown, or add it to a new user flow by clicking the "Create new user flow" button:

Add a session to a user flow

Each session in a user flow is associated with a status. Currently, the available statuses are Passed, Failed, and Excluded. Note that only sessions marked Passed are included in the Performance Monitoring UI time series; if a session is subsequently marked Failed or Excluded, its measurements are removed from the time series.
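The status rule can be sketched as a simple filter: only sessions currently marked Passed contribute points, and re-marking a session later removes its point. The tuple shape below is an illustrative model, not HeadSpin's actual data structures.

```python
def build_time_series(sessions):
    """Keep measurements only from sessions currently marked Passed.

    `sessions` is a list of (timestamp, measurement, status) tuples;
    this shape is illustrative, not the HeadSpin data model.
    """
    return [(ts, value) for ts, value, status in sessions if status == "Passed"]

sessions = [
    (1, 220.5, "Passed"),
    (2, 310.0, "Failed"),    # never shown in the trend graph
    (3, 198.2, "Passed"),
]
print(build_time_series(sessions))  # [(1, 220.5), (3, 198.2)]

# Re-marking session 3 as Excluded drops its point from the series.
sessions[2] = (3, 198.2, "Excluded")
print(build_time_series(sessions))  # [(1, 220.5)]
```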

Once you choose a User Flow and a status and click "Add Session to User Flow", HeadSpin analysis processes the session and adds its measurements to the user flow. Analysis may take some time, so it can take up to a minute for the session to appear in the user flow.
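For scripted workflows, the same attach operation is available through the Performance Monitoring API. The sketch below only illustrates the general request shape with Python's standard library; the endpoint path, payload field names, and token are placeholders, not HeadSpin's actual API, so consult the Performance Monitoring API reference for the real route and parameters.

```python
import json
from urllib.request import Request

API_TOKEN = "<your-headspin-api-token>"  # placeholder

def build_attach_request(userflow_id, session_id, status="Passed"):
    """Build (but do not send) a hypothetical 'attach session' request.

    The URL and payload fields are illustrative stand-ins; see the
    Performance Monitoring API documentation for the real endpoint.
    """
    url = f"https://api-dev.headspin.io/v0/userflows/{userflow_id}/sessions"  # hypothetical route
    payload = {"session_id": session_id, "status": status}
    return Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_attach_request("my-userflow-id", "my-session-id")
print(req.get_method(), req.full_url)
# Sending would be: urllib.request.urlopen(req)
```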

Exporting User Flows to the Replica DB

See Replica DB API.