HeadSpin Documentation

Performance Monitoring UI

Whereas the HeadSpin Waterfall UI and Issue UI let you explore the data and analysis results for a single HeadSpin session, the HeadSpin Performance Monitoring UI lets you monitor key session measurements over time as new sessions are recorded. Measurements for related sessions are grouped into a User Flow, and the Performance Monitoring UI lets you explore all the measurement time series in a User Flow.

A measurement is a single value extracted from a session's recorded data, analysis results, or another aspect of a session. One example of a built-in measurement is "Average Wait", which represents the average wait time across all HTTP exchanges in the session. Using HeadSpin Performance Monitoring, one can monitor how measurements like Average Wait change over time as new sessions are recorded and inserted into the User Flow.
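To make the idea of a measurement concrete, a metric like "Average Wait" can be sketched as a simple aggregation over a session's HTTP exchanges. This is an illustrative sketch only; the `wait_ms` field name below is a stand-in, not the actual session data schema.

```python
def average_wait_ms(exchanges):
    """Mean wait time in milliseconds across a session's HTTP exchanges.

    `exchanges` is a list of dicts; the `wait_ms` key is a stand-in for
    however wait time is represented in the real recorded data.
    """
    waits = [e["wait_ms"] for e in exchanges]
    if not waits:
        return None  # a session with no HTTP exchanges has no Average Wait
    return sum(waits) / len(waits)

# Example: three exchanges with waits of 120 ms, 80 ms, and 100 ms
print(average_wait_ms([{"wait_ms": 120}, {"wait_ms": 80}, {"wait_ms": 100}]))  # → 100.0
```

Each new session contributes one such value, which is what becomes a dot on the User Flow's measurement time series.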

The Performance Monitoring UI

The Performance Monitoring UI is accessed from the Performance Monitoring icon in the sidebar:

Performance Monitoring UI

Clicking the icon opens a list of Performance Monitoring User Flows. To create a new user flow, click the "+ New User Flow" button in the top right. Clicking a user flow in the list opens the Performance Monitoring UI for that user flow. To view the inline help, which explains the UI visualization and controls, click the help icon in the bottom left of the sidebar:

Help Icon in HeadSpin

The Performance Monitoring UI renders a measurement trend graph in which each dot represents a measurement extracted from an individual HeadSpin session, recorded either in the Remote Control UI or through automation. The orange line represents the currently selected measurement (in the screenshot below, "Total TLS") extracted from sessions recorded on the selected device (in the screenshot below, "MOTOROLA XT1053"). The gray dots represent the selected measurement extracted from all sessions recorded at the "Locations to Compare Against". This lets you visually compare the performance of sessions recorded on a particular device against the other sessions in the User Flow.

comparing the performance of sessions recorded

Clicking on a dot opens a menu with a link to the session from which the measurement was extracted:

Go to session

When choosing the data you wish to view within a User Flow, you can set several parameters to customize your view: a specific dataset/issue, a specific device, and a specific location to compare against, each chosen from the dropdown selections at the top of the screen. All three options also display a square icon; clicking it displays one chart per dataset, giving you several charts side by side for easy comparison.

Choosing a data set

You can also modify the date range of your dataset by clicking the date range displayed beside the calendar icon in the upper-right area of the page.

Modify date range

Clicking this date range opens a calendar pop-out box, with the range of dates your dataset covers displayed in bold. By default, Performance Monitoring displays the full range of your dataset, from its first date through the current date; if your tests span several weeks or months, the calendar will open on the first month of your dataset.

Selecting date range

Clicking the arrows at the top of the calendar box navigates through the calendar months. You can click a date within the calendar view to select the date from which your dataset will start; the change is reflected in the range box at the bottom of the calendar pop-out. Note that you can only change the 'from' date, that is, the starting date of the displayed dataset; the 'to' date is always the current date, because Performance Monitoring always fetches the most recent data as part of your dataset view. To apply your changes to the data comparison view, click "Apply".

navigating date range

Attaching Sessions to User Flows

This section explains how to attach a HeadSpin session to a user flow from the UI. For the equivalent API functionality, see Performance Monitoring API.
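As a rough, non-authoritative sketch of what an API-side attach call might look like, the request could be assembled as below. The host, endpoint path, and JSON field names here are assumptions made for illustration only; consult the Performance Monitoring API documentation for the real contract.

```python
import json

# Assumption: a typical HeadSpin API base URL; substitute your instance's host.
API_BASE = "https://api-dev.headspin.io/v0"

def build_attach_request(session_id, user_flow_id, status):
    """Assemble a hypothetical 'attach session to user flow' request.

    Returns method/url/headers/data so it can be handed to any HTTP client.
    The path and field names are illustrative guesses, not the documented API.
    """
    if status not in ("Passed", "Failed", "Excluded"):
        raise ValueError("status must be Passed, Failed, or Excluded")
    return {
        "method": "POST",
        "url": f"{API_BASE}/userflows/{user_flow_id}/sessions",  # assumed path
        "headers": {
            "Authorization": "Bearer <YOUR_API_TOKEN>",
            "Content-Type": "application/json",
        },
        "data": json.dumps({"session_id": session_id, "status": status}),
    }
```

Keeping request construction separate from the HTTP client makes it easy to swap in the documented endpoint once confirmed against the API reference.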

To manually attach a session to a User Flow from the HeadSpin UI, open the session in the Waterfall UI or Issue UI and click the "Add to User Flow" button in the top left:

Adding sessions to user flow

From the dialog, you can choose to add this session to an existing user flow by selecting from the dropdown, or add it to a new user flow by clicking the "Create new user flow" button:

Add a session to a user flow

Each session in a user flow is associated with a status. Currently, the available statuses are Passed, Failed, and Excluded. Note that only sessions marked Passed are included in the Performance Monitoring UI time series. If a session is subsequently marked Failed or Excluded, its measurements are removed from the time series.

Once you choose a User Flow and a status and click "Add Session to User Flow", HeadSpin analysis will process the session and add its measurements to the user flow. Analyzing the session may take some time, so you may not see it in the user flow for up to a minute.
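Because analysis is asynchronous, automation that adds a session and then immediately reads the user flow may need to wait for the measurements to appear. A generic polling helper could look like the following; nothing here is HeadSpin-specific, and `check_fn` stands in for whatever query you use to see whether the session has shown up in the flow.

```python
import time

def wait_until(check_fn, timeout_s=90.0, interval_s=5.0):
    """Poll check_fn until it returns True or timeout_s elapses.

    check_fn: a zero-argument callable, e.g. a function that queries the
    user flow and reports whether the new session's measurements appear.
    Returns True if the condition was met, False on timeout.
    """
    deadline = time.monotonic() + timeout_s
    while True:
        if check_fn():
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(interval_s)
```

Given the roughly one-minute analysis delay noted above, a 90-second timeout with a 5-second polling interval is a reasonable starting point.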

Exporting User Flows to the Replica DB

See Replica DB API.