Nielsen’s web placebo considered harmful

“If your poem isn’t working, cut off the last line. Repeat as necessary.” – Michael Donaghy

There’s a curious statistic floating about the internet. It’s not always expressed in exactly the same words, but it comes from Jakob Nielsen.

“On the average Web page, users have time to read at most 28% of the words during an average visit; 20% is more likely.”

Almost every part of this claim might benefit from elucidation. Average web page? Average visit? The time users have? Those reassuringly specific numbers?

Nielsen based his findings on data from a German study in which 25 academic staff had their web browsing habits monitored over an extended period. The authors provided Nielsen with their data – just under 60,000 page visits. He then excluded all page visits:

  • lasting more than 10 minutes or less than four seconds
  • to pages with fewer than 30 words

Assuming a reading speed of 250 WPM, he graphed the maximum percentage of words readers could have read, given the time they spent on each page, against the number of words on the page, noting that this percentage falls very rapidly as page length rises. Further noting that the average (presumably mean) page length in the data set was 593 words, he looked at the corresponding point on his Y axis and arrived at his 28% upper bound. 20% is then a guesstimate to allow for user orientation before reading begins.
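The arithmetic behind that upper bound can be sketched in a few lines. This is an illustration of the calculation as described above, not Nielsen's actual code, and the 40-second visit duration is a hypothetical figure chosen to show where a 28% cap comes from, not a number taken from the study:

```python
def max_read_fraction(visit_seconds, word_count, wpm=250):
    """Upper bound on the fraction of words a visitor could have read,
    given the time they spent on the page and an assumed reading speed."""
    words_readable = visit_seconds / 60 * wpm
    return min(1.0, words_readable / word_count)

# Illustrative only: a roughly 40-second visit to a 593-word page
# (the data set's mean length) caps reading at about 28%.
print(round(max_read_fraction(40, 593), 2))  # → 0.28
```

Note that this is a ceiling, not a measurement: it says nothing about how many of those words were actually read, which is why Nielsen's 20% figure is a further guess layered on top.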

We now have some definitions:

  • Average web page: a web page of 593 words.
  • Average visit: the average time some staff at an academic institution spent reading web pages 593 words long.
  • Have time: the time the reader happened to spend on the page.

A very bad thing?

But let’s suppose that people really are reading 20-28% of web pages. Generally this news is delivered mournfully, often prior to some hectoring on why we should be taking a butcher’s knife to our content. I haven’t seen anyone recommend a specific number of iterations, but it’s clearly a procedure with rapidly and literally diminishing returns.

But why would we ever expect anyone to read 100% of a page in the first place? Could 28% actually be a generous percentage? Here, perhaps, there’s a bit of sleight of hand. When we talk about an average page, average takes on its usual connotations of regular, typical, standard, ordinary. Content you might hope some users would read from start to finish. But the study Nielsen got his data from looked at all web browsing, not just content intended to be read in full – presumably including, amongst other things:

  • search results (the median time spent looking at Google search results was reported as being eight seconds, so many of these would have been included by Nielsen)
  • navigation pages and directories
  • calendars, weather forecasts and other services where we may just want a single piece of information.

For search results and navigation, we would generally be happier if users found what they wanted quickly. If the first Google result is the one I want, I may have only read five percent of the page. That isn’t a sign that Google needs to work on creating stickier content, it’s a sign that Google’s search algorithm is working well. Similarly, weather forecasts and calendars may just be glanced at, but if four seconds or more elapse before the user moves on, they get scooped into the data set.

A web placebo?

Perhaps the (mis)use of this statistic is most charitably thought of as aphoristic, as a useful provocation rather than something meant to reflect an underlying truth – as a web placebo. But when such shibboleths harden into points of doctrine, it’s probably time to consider whether they do more harm than good.
