
Getting Fast and Staying There

Getting fast is difficult, staying fast even more so. How do successful companies ensure that their site speed stays at peak?

If you've ever trained for a big sporting event, a marathon, bike race or similar, you'll know how tough it is to train to get to your peak performance. You'll also know that after the event you feel totally drained, and your performance will not be the same for weeks while your body recovers. Training for a tough physical event is a project: a focused effort on getting to a goal. This approach does not work for ultra-marathoners or during the Tour de France, though: their performance must be maintained for days, weeks or months. These elite athletes treat their performance as a feature, something they devote everything to, rather than a project to be executed.


Site speed is seen as a technical problem: bits and bytes, microseconds, three-letter acronyms. Solving site speed, though, is a business success: faster experiences lead to happier users and better business outcomes.

Site speed is not clear-cut: we know that faster is better, but how much faster, and how much better? This varies between sites, so there cannot be a single answer. Compare this with security and accessibility, which have (conceptually) simple end goals: an AA accessibility rating or a successful penetration test. Feature development is similarly simple: did the feature meet the requirements, and did it improve business metrics?

Site speed is an art: orchestrating technology to deliver a fluid and choreographed user experience.

The only way to truly succeed with a site speed project is to make speed a feature: projects end but features survive as long as they deliver value. Features are shared top-down from product to engineering, not the other way around. Features get investment, projects get budgets.

A common challenge when making performance a feature is determining the value that it will bring. Looking at the relationship between performance and user outcomes helps to build the business case, but it doesn't tell the whole story. Take, for example, the YouTube Feather project: a super-lightweight version of the YouTube web app. Feather reduced the weight of the video page from 1.2MB to under 100kB and cut the number of requests by a factor of ten, resulting in a much faster video experience. The real user data, however, showed that page load time increased on the lighter, faster pages. The reason? Users who were previously unable to use the web app due to multi-minute load times could now watch videos! The share of traffic from low-end devices and countries with poor connectivity increased, pulling the average load times up.

...entire populations of people simply could not use YouTube because it took too long to see anything. Under Feather, despite it taking over two minutes to get to the first frame of video, watching a video actually became a real possibility.

As Anna Debenham says: performance is a basic multiplier. Making your applications faster makes them more accessible to users in the long tail of the performance distribution. A performance feature may bring you significant increases in traffic from new users: those who had previously given up on your application because of bad prior experiences. This is an unknown unknown: you don't know how many users don't use your application!

Site speed is technically a solved problem: ship small files, fast. Unfortunately, this goes against the trend towards JavaScript frameworks, rich media content, rapid feature development and, of course, the many third-party services that promise increased revenue or insight.
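
As one concrete illustration of "ship small files", a build step can fail whenever a bundle grows past a compressed-size budget. Below is a minimal sketch in TypeScript for Node; the dist/app.js path and the 100kB gzip budget are placeholders, not recommendations.

```typescript
// Minimal "ship small files" check: fail the build if the bundle exceeds its budget.
// dist/app.js and the 100 kB figure are illustrative; substitute your own bundle and budget.
import { readFileSync } from "node:fs";
import { gzipSync } from "node:zlib";

const BUDGET_BYTES = 100 * 1024; // compressed budget

const compressed = gzipSync(readFileSync("dist/app.js")).length;
console.log(`dist/app.js is ${(compressed / 1024).toFixed(1)} kB gzipped`);

if (compressed > BUDGET_BYTES) {
  console.error("Bundle is over its size budget: the page just got heavier.");
  process.exit(1);
}
```

A check like this won't make a site fast on its own, but it does stop the slow creep of weight that frameworks, media and third parties tend to introduce.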

Making speed a feature reduces social loafing by bringing focus to a common goal.

Front-end engineers, for the most part, know how to deliver a fast experience. QA knows when a release feels slower. Product knows that faster speeds result in better user outcomes (see my blog post on that here). In fact, anyone who has anything to do with the website should care about speed, and thus has some responsibility for it. This often creates a diffusion of responsibility: when everyone shares responsibility, no-one takes responsibility.

Successful speed goals are simple. My favourite: don't get slower.

Simple goals are the easiest to achieve. Setting a goal as simple as "don't get slower" may seem trivial, but it raises many questions: how fast are we? How do we know? How do we know when it changes? These goals should be shared across the business. In my experience, successful site speed initiatives share the following common traits:

  • Speed is understood and supported from the top (CTO has a speed dashboard)
  • Speed data is socialised and celebrated (live dashboards for all teams, performance budgets)
  • Engineering is empowered to focus on speed (tooling, speed week, speed as a feature)
  • Speed is measured in CI & QA (CLI tooling)
  • Product design accounts for speed (prioritising key visual elements, designing for data-saver mode)
  • Speed is measured in the live environment, for real users on real devices (real user monitoring)

These can only be achieved when the whole business understands the value of speed, invests in it as a feature and celebrates it as a key to success. There's no point monitoring performance if no-one does anything with the data.
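
One place to do something with the data is in CI. To make "don't get slower" and the "measured in CI & QA" trait concrete, here is a minimal sketch of a CI gate in TypeScript for Node. The baseline.json file, the 5% noise tolerance and the measureLandingPageLcp() helper are assumptions for illustration; wire the helper up to whatever lab tooling you already run (Lighthouse, WebPageTest or similar).

```typescript
// A sketch of a "don't get slower" gate for CI. Everything here is illustrative:
// swap in your own metric, measurement tooling and tolerance.
import { readFileSync, writeFileSync } from "node:fs";

const TOLERANCE = 1.05; // allow 5% run-to-run noise before failing

// Hypothetical helper: run your lab tool of choice and return LCP in milliseconds.
async function measureLandingPageLcp(): Promise<number> {
  throw new Error("plug in Lighthouse, WebPageTest or your own measurement here");
}

async function main(): Promise<void> {
  const baseline = JSON.parse(readFileSync("baseline.json", "utf8")) as { lcpMs: number };
  const current = await measureLandingPageLcp();

  if (current > baseline.lcpMs * TOLERANCE) {
    console.error(`We got slower: LCP ${current}ms vs baseline ${baseline.lcpMs}ms`);
    process.exit(1); // fail the build
  }

  if (current < baseline.lcpMs) {
    // Ratchet the baseline down whenever a run is faster, so gains are locked in.
    writeFileSync("baseline.json", JSON.stringify({ lcpMs: current }, null, 2));
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

The ratchet at the end is the useful design choice: every improvement becomes the new baseline, so "don't get slower" quietly turns into "keep every gain".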

One way to effect positive change is through gamification. Folks love gold stars 🌟: creating performance budgets and goals can help drive an initial performance project to a successful outcome. Be careful with performance budgets, though: they can encourage complacency once met, and they often focus on a small subset of static performance measures. I suggest finding the performance metrics that best correlate with user outcomes in your application and creating a goal based on them, such as "75% of our mobile users achieve a Largest Contentful Paint of less than 2.5 seconds on landing pages". This goal also happily matches Google's own target for LCP, bringing SEO benefits with it.
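
A goal phrased this way is easy to check mechanically. The sketch below assumes you already collect LCP values (in milliseconds) from real users, for example via the browser's PerformanceObserver API or Google's web-vitals library; the sample numbers are made up.

```typescript
// Evaluate "75% of users see an LCP of 2.5 seconds or less" over RUM samples.
const LCP_BUDGET_MS = 2500; // Google's "good" threshold for LCP

// Nearest-rank percentile: smallest sample with at least p% of values at or below it.
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const index = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, index)];
}

function meetsGoal(lcpSamples: number[]): boolean {
  return percentile(lcpSamples, 75) <= LCP_BUDGET_MS;
}

// Illustrative data: p75 of these samples is 2400 ms, so the goal is met.
console.log(meetsGoal([1800, 2100, 2300, 2400, 4200])); // true
```

Run this over a rolling window of real-user data per page type and put the result on the dashboards mentioned above; the trend matters more than any single number.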


Investing in site speed might be a leap of faith, but it is one that will always pay off. Start by measuring what matters, then track your improvement over time.