3

This question will likely have no single correct answer, but I'm looking for personal estimates on how long it takes you to complete a usability study from start to finish, including research, writing the plan, recruiting users, conducting the tests, and analyzing/reporting the data.

I've read a number of articles on the web (this one seems to have the most science behind it: Cost of User Testing a Website), but I'm looking for your input based on your own experience with usability testing.

3 Answers

2

Time spent on a usability test is variable. The amount of time spent on research, writing the plan, and recruiting users depends on the project/website. I've worked on projects where the target audience was pretty obvious: research time wasn't really necessary, and recruiting users was easy because I knew just where to look for them. Writing your plan gets easier every time you do a usability test. You get better, and you can reuse templates from previous plans.

Only the time spent testing can be predicted relatively precisely: 2.5 hours. That's based on 5 test users times half an hour per user.
The Nielsen Norman Group (a leader in usability testing) has concluded that you can run tests with as few as five users.
The half-hour rule is just something I learned at school: you need at least half an hour to conduct a test, but a user's concentration decreases rapidly after those 30 minutes.
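The arithmetic behind that estimate can be sketched in a few lines (the helper name and defaults here are illustrative, not from the answer):

```python
# Session time only: 5 users x 30 minutes each = 2.5 hours.
# Research, planning, recruiting, and analysis are extra and vary by project.
def session_hours(users=5, minutes_per_user=30):
    """Return total moderated-testing time in hours."""
    return users * minutes_per_user / 60

print(session_hours())  # 2.5
```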

I also normally spend a full day processing and analyzing the data. A second day is spent discussing the results with colleagues or clients.

1
  • Paul, thanks for your thoughts. I'm looking for information that backs up the findings of the NN/g, which states: "It takes 39 hours to usability test a website the first time you try. This time estimate includes planning the test, defining test tasks, recruiting test users, conducting a test with five users, analyzing the results, and writing the report. With experience, Web user tests can be completed in two work days." As with estimating just about anything, the typical answer of "it depends" certainly applies; I'm just looking for estimates based on experience. Thanks!
    – MCRXB
    Commented Jun 6, 2014 at 13:30
0

Gauge their interest

For guerrilla or in-person interviews, I schedule 30 to 60 minutes per user. That range reflects the anticipated level of interest. In other words, if the product/feature is something the user will be very interested in contributing to, I'll block out 60 minutes. For most things, anything more than 30 will scare subjects away.

Limit the scope

That time is based purely on attention span. For that reason, it's important to test a limited set of features. I find that this works out to one normal workflow/path or the evaluation of one view and its actions. Cramming too much into the session will result in rushed, unreliable information.

If you plan to do any internal testing, double that time factor. Internal resources tend to have a lot to say. You will often end up having something of a therapy session. I consider this a service to the company ;-)

Don't forget recap time

On the back-end, you should allow yourself 2-3x that time to evaluate and create a recap for other team members. Each one may not take that long, but you'll also want to create some kind of matrix to reveal patterns.
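Putting the session time and the 2-3x recap multiplier together, a per-user budget looks roughly like this (the function and defaults are a hypothetical sketch of the rule of thumb above, not a formula from the answer):

```python
# Per-session budget: interview time plus 2-3x that time for
# evaluation and writing a recap for the team.
def total_minutes(session_minutes=30, recap_multiplier=2.5):
    """Return interview + recap time in minutes."""
    return session_minutes * (1 + recap_multiplier)

print(total_minutes(30, 2))  # 90.0  -> a 30-min session with a 2x recap
print(total_minutes(60, 3))  # 240.0 -> a 60-min session with a 3x recap
```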

0

Normally, I take a whole day to process and analyze the data. After that, I spend a second day reviewing it and a third day discussing the results with colleagues or clients.
