Better Performance Testing Through the Power of Defaults

We recently wrote about the guiding principles that we’ve been using to help us make decisions about what improvements we want to make to WebPageTest.
One example of a feature that was driven by these principles is the Simple Test Configurations that we rolled out last January.

I wanted to walk through that change, why we made it, and what the results are because it’s a good example of how those principles guide our work and what goes into what, on the surface, was a very simple addition.
The historical default
Prior to adding our simple test configurations, the default test settings for WebPageTest were to run a test (see the API sketch after this list):
- In Chrome
- On a desktop
- Over a Cable connection
- From Virginia
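In API terms, that historical default maps to something like the minimal sketch below, which submits a test through WebPageTest's public `runtest.php` endpoint. The `Dulles` agent label for the Virginia location, the `location:browser.connectivity` string format, and the `WPT_API_KEY` environment variable are assumptions for illustration, not guaranteed identifiers.

```typescript
// Minimal sketch (Node 18+): submit the historical default test via the
// public WebPageTest REST API. The "Dulles" label for the Virginia agents
// and the WPT_API_KEY env var are assumptions for illustration.
async function runDefaultTest(url: string): Promise<string | undefined> {
  const params = new URLSearchParams({
    url,
    location: 'Dulles:Chrome.Cable', // location:browser.connectivity
    f: 'json',                       // ask for a JSON response
    k: process.env.WPT_API_KEY ?? '', // your API key
  });
  const res = await fetch(`https://www.webpagetest.org/runtest.php?${params}`);
  const body = await res.json();
  return body?.data?.testId; // ID you can poll for results later
}

runDefaultTest('https://example.com').then((id) => console.log('Test ID:', id));
```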
Defaults matter a lot. Because it was the default, this configuration was our most popular setting, accounting for 29.7% of all tests.
There were some problems with that.
We provide over 30 test locations around the world. We use network throttling at the packet level, allowing us to accurately simulate a variety of network conditions. And we don’t just test in Chrome, but also in Firefox, Edge, Brave, and (on our physical M1s) Safari.
While all of those settings, and more, are available in the advanced settings, they’re tucked away a bit, so folks have to go searching for them.
One of our guiding principles is to “make the right thing easy”.
When it comes to testing performance, we believe the right thing is:
- Test in multiple browsers
- Test both mobile and desktop browsers
- Test from a variety of relevant locations
- Test from less-than-ideal network conditions
Testing in Chrome on a desktop device, using a Cable connection, from the US is about as ideal a test scenario as you can have, meaning it’s going to mask a ton of performance issues. We wanted to encourage better test settings.
Our theory was that if we were careful about these simple configurations, we could highlight some of the features we feel are very valuable, while also encouraging more testing under different conditions, helping users identify performance issues they might otherwise miss.
Adding some simple configuration defaults
We knew we couldn’t just completely move away from the Virginia location as a prominent default. Folks have been used to it for over a decade, and altering that overnight would be disruptive and confusing.
But we knew we could do better.
So we looked at browser and network statistics to identify geography, network and browser combinations that have a substantial amount of traffic.
For example, Firefox traffic in Germany is significantly higher than in most other locations. So to encourage Firefox testing, using Germany as a location made sense.
With the new UI, we settled on five simple configurations (sketched in API terms after the list):
- Chrome, on an emulated Moto G4, over a 4G connection, from Virginia
- Chrome, on desktop, over a Cable connection, from Virginia
- Chrome, on an emulated Moto G4, over a 3G connection, from Mumbai
- Edge, on desktop, over a Cable connection, from Toronto
- Firefox, on desktop, over a Cable connection, from Frankfurt
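For API users, those five presets map to something like the sketch below. The agent labels (`Dulles`, `Mumbai`, `Toronto`, `Frankfurt`), the emulated-device spelling, and the connectivity names are assumed spellings for illustration; the real identifiers live in WebPageTest's locations listing.

```typescript
// Sketch: the five simple configurations as location:browser.connectivity
// strings. All identifiers are assumed spellings for illustration only.
const SIMPLE_CONFIGS = [
  'Dulles:MotoG4.4G',        // Chrome, emulated Moto G4, 4G, Virginia
  'Dulles:Chrome.Cable',     // Chrome, desktop, Cable, Virginia
  'Mumbai:MotoG4.3G',        // Chrome, emulated Moto G4, 3G, Mumbai
  'Toronto:Edge.Cable',      // Edge, desktop, Cable, Toronto
  'Frankfurt:Firefox.Cable', // Firefox, desktop, Cable, Frankfurt
];

// Submit one test per preset and log the resulting test IDs.
async function runSimpleConfigs(url: string): Promise<void> {
  for (const location of SIMPLE_CONFIGS) {
    const params = new URLSearchParams({
      url,
      location,
      f: 'json',
      k: process.env.WPT_API_KEY ?? '',
    });
    const res = await fetch(`https://www.webpagetest.org/runtest.php?${params}`);
    const body = await res.json();
    console.log(location, '->', body?.data?.testId);
  }
}

runSimpleConfigs('https://example.com');
```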
The impact
Three months after the change, we looked back at our data to see the impact. We compared the three months prior to the change with the three months after the change to see what, if any, difference our simple configurations made.
Location impact
First, let's look at the impact on test location itself.
Location | % of Total Tests Before | % of Total Tests After | % Difference |
---|---|---|---|
Virginia | 39.76% | 58.92% | +48.19% |
Mumbai | 2.34% | 2.71% | +15.81% |
Toronto | 1.02% | 1.44% | +41.18% |
Frankfurt | 7.45% | 6.25% | -16.11% |
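(A note on the math: % Difference is the relative change in test share, (after - before) / before. For Virginia, that's (58.92 - 39.76) / 39.76 ≈ +48.19%, i.e. a 19.16 percentage-point increase expressed relative to the starting share.)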
Right away we can see the impact on the four locations used for the simple configurations: three saw significant increases, with only Frankfurt decreasing.
The increase in Virginia in particular is interesting given that it was already the default. But we did use it for two simple configurations (the top two in the list), so we basically doubled down on the default effect there. We wanted to lean into that familiarity to provide a consistent experience, but in doing so we made its dominance even more pronounced. We'll evaluate potentially replacing one of those two Virginia configurations with another location.
Configuration impact
We can get a cleaner picture by looking at the exact location/browser/connectivity combination before and after the change to see just how impactful those configurations were.
The Toronto configuration % is included, but since we didn't support the Edge browser prior to this update, there is nothing to compare to.
Configuration | % of Total Tests Before | % of Total Tests After | % Difference |
---|---|---|---|
Virginia, Moto G4, 4G | 2.78% | 43.97% | +1481.7% |
Virginia, Chrome, Cable | 29.77% | 9.89% | -66.8% |
Mumbai, Moto G4, 3G | 0.15% | 1.16% | +673.33% |
Toronto, Edge, Cable | 0.00% | 0.78% | N/A |
Frankfurt, Firefox, Cable | 0.18% | 3.43% | +1805.6% |
Looking at exact configurations makes the impact even more obvious. Each simple configuration saw a massive jump, with the exception of Virginia/Chrome/Cable. Considering that was the default before, and Virginia/Moto G4/4G is the default now, that drop makes perfect sense.
Lessons Learned
There are a couple of big takeaways that stood out to us.
Defaults matter, a ton. This is not shocking given the bounty of research on the topic, but it bears repeating: we quietly made a dramatic change in how people test, encouraging testing on mobile devices and networks rather than desktop and Cable, simply by switching that default experience.
We have a lot of influence over how people test. By making certain configurations easier, we saw dramatic increases in the rate at which those configurations were used. That’s a lot of influence, which is why having guiding principles in place, like “make the right thing easy,” is so important. Testing in different browsers (not just Chrome), under conditions that reflect real-user usage, is critical to the health of the web.
As a result, we're exploring ways to make sure that the defaults people see are even more tailored and targeted to their situations.
We feel WebPageTest has a very critical role to play in the overall health of the web, and it's certainly encouraging to see data supporting that we can influence a healthier approach to performance testing through our product decisions.
Tim Kadlec is the Director of Engineering for WebPageTest, a web performance consultant, and trainer focused on building a web everyone can use. He is the author of High Performance Images (O'Reilly, 2016) and Implementing Responsive Design: Building sites for an anywhere, everywhere web (New Riders, 2012). He writes about all things web at timkadlec.com.
@tkadlec