“This peace of mind from preventing regressions gives us the space and time to focus on what matters most to us — delivering an amazing experience to Pinners around the world.”

Case Study: PINTEREST

The Story

Pinterest is the biggest dataset of ideas ever assembled: over 100 billion recipes, life hacks, design ideas, and fashion inspirations, serving more than 250 million users around the world.
“With more than 75% of Pinterest sign-ups coming from outside the U.S., we needed to speed up our Android app and maintain that speed even under constant development and new feature additions that might slow things down.”

Challenges

User experience is crucial to Pinterest’s success. The Pinterest performance team ran what they called a “hacky experiment” that improved mobile-web home-landing page performance by 60% and mobile signup conversion rate by 40%. Their experiment showed that the faster their app loads, the higher the conversion rate and the better the user experience.

They faced two challenges:

Increasing Pinterest’s speed for all users, regardless of device type or location.
Increasing speed on Android devices specifically (Pinterest ran slower on Android devices than on Apple devices).
Solution

HeadSpin Nimble, a cloud-based continuous performance testing tool

The Pinterest Performance Team implemented Nimble, a HeadSpin product, to create regression tests that ran on Android builds generated from code changes and alerted Pinterest when a Pinner Wait Time (PWT) metric exceeded designated thresholds.

When there were no alerts, they knew the app was good to go, which boosted confidence in merging pull requests and releasing it to Pinners.

When they did receive alerts, they resolved the regressions in ~21 hours, which is far less time than the multiple days it previously took the team to identify and fix a regression.
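To make the alerting flow concrete, here is a minimal sketch of a baseline-plus-threshold regression check of the kind described above. The action names, baseline values, and function are hypothetical illustrations; only the idea of alerting when a PWT metric regresses past a designated threshold (100 ms, per the case study) comes from the source, and this is not HeadSpin Nimble’s actual API:

    # Minimal sketch of a PWT regression check against per-action baselines.
    # Action names and baseline values are hypothetical; this is not
    # HeadSpin Nimble's actual API.
    ALERT_THRESHOLD_MS = 100  # alert when a PWT regresses by more than this

    BASELINE_PWT_MS = {
        "home_feed_load": 1200.0,
        "pin_closeup_load": 800.0,
        "search_results_load": 1000.0,
    }

    def check_build(measurements_ms):
        """Return an alert for every action whose PWT regressed past the threshold."""
        alerts = []
        for action, baseline in BASELINE_PWT_MS.items():
            measured = measurements_ms.get(action)
            if measured is not None and measured - baseline > ALERT_THRESHOLD_MS:
                alerts.append(f"{action}: {measured:.0f} ms is "
                              f"{measured - baseline:.0f} ms over baseline")
        return alerts

    # No alerts -> the build is good to go; an alert flags the regressing change.
    for alert in check_build({"home_feed_load": 1180.0, "pin_closeup_load": 950.0}):
        print(alert)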
“Seeking testing precision and alert capabilities, we chose to use HeadSpin Nimble, a cloud-based continuous performance testing tool that easily integrated with our process and produced actionable results.”
Results

“We rely on HeadSpin Nimble for insight into how we’re doing against our goals to not only maintain but also improve PWT (Pinner wait time) across our platforms.”

60% faster in-app load times

Detected 30 slowdown regressions that would otherwise have added 3 seconds to the overall load time for each user

Regressions resolved in ~21 hours vs. multiple days historically

Regressions caught earlier in the cycle and never released to users

How they measured results

“For every main action on Pinterest — like loading the home feed, tapping to see a Pin closeup, or searching — we defined a metric called Pinner Wait Time (PWT). Each metric measured the time from when a Pinner initiated an action (e.g. tapping a Pin) until the action was considered complete from the Pinner’s perspective (i.e. the Pin closeup view loaded).” - Pinterest Performance Team
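In instrumentation terms, each PWT metric is simply the elapsed time between those two moments. A minimal sketch of such a timer, with hypothetical names and not Pinterest’s actual instrumentation:

    import time

    # Minimal sketch of a Pinner Wait Time (PWT) measurement: elapsed time
    # from when a Pinner initiates an action until it is complete from the
    # Pinner's perspective. Hypothetical helper, not Pinterest's code.
    class PwtTimer:
        def start(self):
            # The action begins, e.g. the Pinner taps a Pin.
            self._t0 = time.monotonic()

        def stop_ms(self):
            # The action is perceptually complete, e.g. the Pin closeup
            # view has fully rendered. Returns elapsed milliseconds.
            return (time.monotonic() - self._t0) * 1000.0

    timer = PwtTimer()
    timer.start()
    time.sleep(0.05)  # stand-in for the work between tap and rendered closeup
    print(f"pin_closeup PWT: {timer.stop_ms():.0f} ms")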
“With HeadSpin Nimble we detected ~30 slowdown regressions over the course of six months. Since our alert threshold was 100 milliseconds, these regressions, if released to our production app, would have accounted for at least three seconds of additional wait time. Aggregated over the hundreds of millions of actions taken each day, it had the potential to be a significant amount of time wasted. By catching regressions early in the cycle, they were never released to Pinners.”
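The three-second figure follows from the threshold: each of the ~30 caught regressions exceeded the 100 ms alert threshold, so together they represent at least 30 × 100 ms = 3,000 ms, or 3 seconds, of avoided additional wait time.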
Published in 2018