The truth about performance-related revenue statistics

Galvanising figures regarding the importance of rapid web page response are frequently released.

Among many others, Google recently reported a measurable diminution in searches associated with a 400 millisecond delay in page response. Amazon capped that, reporting an impact from a mere 100 millisecond slowdown. The whole area seems to be developing into a grand game of one-upmanship, in fact.

Don’t get me wrong, site performance is important – extremely important. Anyone who tried to purchase Olympic tickets last year will attest to the frustration, brand damage and loss of revenue (to the site) that resulted.

And the real-world figures are compelling enough. Uncompetitive response times (and the unavailability and inconsistency often associated with them) have been shown to lead to double-digit drops in conversion – the equivalent of days or weeks of lost revenue over the course of a year.
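
To put that arithmetic in concrete terms, here is a back-of-the-envelope sketch (the percentages and the flat daily-revenue assumption are illustrative, not taken from the research cited): a sustained drop in conversion maps almost directly onto whole days of lost annual revenue.

```typescript
// Back-of-the-envelope only: assumes revenue is roughly uniform across the
// year, so a sustained X% drop in conversion writes off about X% of the
// year's trading days outright.
function equivalentLostDaysPerYear(conversionDropPercent: number): number {
  return 365 * (conversionDropPercent / 100);
}

console.log(equivalentLostDaysPerYear(10)); // 10% drop ≈ 36.5 days of revenue
console.log(equivalentLostDaysPerYear(3));  //  3% drop ≈ 11 days – still weeks
```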

However, it is important to distinguish between marginal statistics derived from sites with huge traffic volumes such as Amazon, Bing and Google, and the impact on your eCommerce bottom line from abandonment and low conversion. For the record, it takes almost 100 milliseconds for a visual signal to travel from the eyes to the object-recognition area of the brain – even 400ms is less than the blink of an eye. Want to know how fast 100ms is? Try this “quick reactions” game and see how close you can get.

The real-world research alluded to above (from the likes of Aberdeen, Jupiter, etc.) recognises the problem, but typically at page response times of 5 seconds or more (NB end-user response times, not cloud/ISP tests).

It must be admitted that customer expectations (and benchmark performance) are consistently increasing over time. The migration to mobile computing is another important consideration.

So this post is just a plea to inject some reality into performance expectations and goals, given that none of us lives in a world of unlimited IT budgets (or unlimited time to spend them).

Some thoughts to conclude:

Performance DOES matter, but

  • Where are you measuring from? Cloud-based testing is useful for certain things, but understanding end-user performance is critical.
  • Above/below the line? Distinguish between perceived (browser fill) times and total response times – see the measurement sketch after this list.
  • Target key browsers / devices – understand site usage across all key markets.
  • Consistency / expectation. It’s not only about speed. A site that is slightly slower than the competition can still win if its performance is consistent, and it excels in other areas (value proposition, effective design).
  • Relativity is as important as absolute speed – relative to your competition, to bellwether sites (i.e. where your visitors are when not on your site), and in the context of the overall process.
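
To make the measurement points concrete, here is a minimal sketch of capturing end-user timings in the browser itself (rather than from a cloud-based test node), using the Navigation Timing and Paint Timing APIs. The beacon endpoint and the exact fields collected are assumptions for illustration, not a prescription.

```typescript
// Minimal sketch: capture perceived vs. total response times from real
// end users, in the browser, rather than from a cloud-based test node.
window.addEventListener('load', () => {
  // loadEventEnd is only populated after the load handler returns,
  // so defer the reading by one tick.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType(
      'navigation',
    ) as PerformanceNavigationTiming[];
    const totalMs = nav.loadEventEnd - nav.startTime; // total response time

    // "Above the line" / browser-fill proxy: first contentful paint.
    const fcp = performance
      .getEntriesByType('paint')
      .find((e) => e.name === 'first-contentful-paint');
    const perceivedMs = fcp ? fcp.startTime : NaN;

    // In a real deployment these figures would be beaconed back and
    // aggregated per browser, device and market ('/perf-beacon' is a
    // hypothetical endpoint).
    navigator.sendBeacon(
      '/perf-beacon',
      JSON.stringify({ perceivedMs, totalMs, ua: navigator.userAgent }),
    );
  }, 0);
});
```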

Finally – Measure, measure, measure – set baselines, set attainable iterative goals – and intervene to achieve them. Take good advice. And win.

6 thoughts on “The truth about performance-related revenue statistics”

  1. Thanks for this article, Larry!

    I couldn’t have summarized my own view of webperf impacts any better!
    After testing a lot of eCommerce websites, the rule is “there are no rules”.
    For some websites the impact is huge; for others there is little or no impact.
    Measure, run some tests, measure again.
    IMHO, A/B testing against KPIs (conversion rate, bounce rate, pageviews per visit) is the best approach when measuring webperf impact (and that’s what we’re doing for Fasterize customers).

    Cheers,

    Stephane, CEO, Fasterize

  2. Pingback: Performance Matters! A lot! Paulo Ortins

  3. Pingback: Webperf is not a matter of technique | Blog

  4. Pingback: 10ms = +25% | Blog
