WebPageTest's Guiding Principles

I’ve been using WebPageTest for somewhere around 11-12 years now. I don’t want to date anyone else, but the reality is many of the folks working on it since it found a home at Catchpoint were avid WebPageTest users long before joining to work on it full-time.

It’s safe to say it’s important to us that we do right by it, and we have countless ideas for how to improve it and make it even more powerful.

But with so many decisions to be made, it’s important that we provide some sort of framework for how we think about the tool. All tools have opinions, whether or not their creators realize it. By making those opinions explicit, you ensure that everyone involved has a clear picture of what they are. Most importantly, you also give everyone a framework for making decisions about the product, so that the decisions made align with the broader vision for it.

For WebPageTest, we settled on three guiding principles. We use them to frame our overall strategy and we bring them up whenever conversations pop up about the future.

When we consider and prioritize features and functionality, we consider how well they align with the principles we’ve established to help us identify the right way forward. The tighter the alignment, the more confident we are that we’re making the right decision.

WebPageTest’s Guiding Principles

Here’s what we’ve been using to guide our decisions around WebPageTest for the past few years.

Principle 1: Make the right thing easy

There are a lot of things that we say are important in performance, but if those things are hard or complicated (and many of them are) then most folks will skip right over them.

We focus on making it as easy as possible to do the things that we know are important to create, monitor, and maintain better-performing digital properties.

Principle 2: Always answer “so what?”

A lot of data-oriented tools surface metrics and data simply because they can, without thought about whether it's actionable or not. This leads to a mix of products that are too open-ended (massive data lakes) or too restrictive (limited view of metrics with no place to dig in).

We should always ask "so what?" when we're looking to add a new metric or visualization. We want to make sure that, wherever possible, we provide actionable insights for each, and we want to make sure that our users have the ability to explore further to learn additional context.

Principle 3: Close the gap between “something is wrong” and “we fixed it”

One of the easiest things to do is point out when something is wrong. Fixing it is much more difficult. When there is a large gap between seeing that there’s an issue, understanding why it’s an issue and then knowing what to do about it, teams lose trust in their tools and metrics.

We always look for ways to help users quickly fix their issues so they can get on with the rest of their work.

Putting our principles into action

Our guiding principles are written up in an internal document that the entire organization has access to, and we reference them regularly. It’s important to us that everyone, whether or not they work directly on WebPageTest, has access, because the principles aren’t just there to guide product decisions; they guide marketing, sales, and more as well.

When we consider improvements to WebPageTest, we use these principles (alongside data and user feedback when possible) as a litmus test, both for whether the improvement makes sense in the first place and for whether there are refinements we can make so that our ideas align more closely.

For example, when we rolled out our “simple test configurations” last year, we did it because we wanted to make the right thing easy, and we believe the right thing is to test in multiple browsers, from less-than-ideal connections, on both mobile and desktop devices, from a variety of locations relevant to your audience. More on how we thought about those configurations, and the impact of that change, in another post coming in a couple of days.

A screenshot of the WebPageTest simple configurations, showing some preset locations, connections and browser types.

Opportunities and experiments came out last year too, and they’ve been a huge hit. We’ve been really pleased with the feedback, but it also wasn’t entirely a surprise. The combination of those two features aligns directly with all three of our guiding principles, so we were confident it would resonate.

A screenshot showing the Lazy-Load Images outside the critical viewport opportunity, with a one-click experiment to add the loading attribute and test the impact

Opportunities help to answer “so what” by surfacing recommendations based on the metrics and indicators in the page.

The “right thing” to do with a potential optimization is not just to drop everything and do it, but to first test to see what the impact will be and make sure it has the intended effect. No-code experiments make that entire process a button click.

And both opportunities and experiments close the gap between “something is wrong” and “we fixed it” because, together, they take you through the process of identifying, recommending, and validating in just a few moments.

I’d love to give you an example of something we discarded because it didn’t align with these principles, but honestly, those ideas tend not to last very long: as soon as we see that something runs counter to one of these principles, we move on pretty quickly.

A few of us on the team recently met in person to think about the upcoming year and what we want to do, and these principles came up often. As a result, we have a laundry list of features and refinements in mind, including finding ways to start taking advantage of the wealth of information the rest of the Catchpoint product suite provides, so we can surface even more meaningful insights when those products are in use.

We’ll continue to revisit these principles and refine them over time, but the core values will remain consistent.

We believe we have a responsibility with WebPageTest, and being able to use these principles to frame our decisions has been invaluable in making sure that the improvements we make to WebPageTest really are improvements.

Tim Kadlec is the Director of Engineering for WebPageTest, a web performance consultant, and trainer focused on building a web everyone can use. He is the author of High Performance Images (O'Reilly, 2016) and Implementing Responsive Design: Building sites for an anywhere, everywhere web (New Riders, 2012). He writes about all things web at timkadlec.com.

@tkadlec