(This is Part 2. Please read Part 1 first.)
There is a limit, I think, to what a so-called “empirical” user interface test can tell you. At some point, the results must be interpreted in order to be useful as a design tool — and interpretations can easily go wrong. They can overlook a critical objective or even reach the wrong conclusions, especially when interpreted by people without the appropriate design skills.
Eyetracking (breathlessly called the “future of web design” by one writer) is a great example of a “design tool” that is getting a lot of buzz lately. Jared Spool says it’s probably not worth it, and I tend to agree with him there. My first objection is that an eyetracker can tell you what people are looking at, but not necessarily what they are seeing (or why they are looking at it). Secondly, the results generated by an eyetracking study are, to a good UI designer, rarely surprising. Finally, as with any analysis of subjective experience, the results can be easily misinterpreted.
I am not saying that eyetracking is useless. Seth Godin has a nice little story about how eyetracking gave him some useful insights into how people saw his web site and web sites in general. What’s great about Seth’s comments is that he sees the eyetracking results not as a definitive scientific measurement of his web site’s effectiveness, but as just another piece of information that he will use, as a seasoned marketing strategist with good design instincts, to plan his site’s design.
That’s the key: he’s a seasoned strategist with good design instincts. In his hands, this data is likely to be useful to just the right degree: that is, not too much.
Stating the Obvious
A leading eyetracking company, Eyetools, has a fascinating blog with dozens of case studies drawn from real eyetracking sessions, each with heatmap examples, a short analysis, and an interpretation of what the heatmap means and how the design could be improved. In almost all cases, of course, Eyetools seeks to give the impression that without the eyetracking studies we (and the site’s owners) would have no idea how, or whether, each design was working.
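To be clear about what these heatmaps actually measure: a gaze heatmap is, at bottom, nothing more than fixation points binned into a grid, with “hot” cells marking where eyes dwelled longest. The sketch below is purely my own illustration of that idea — the function name, grid size, and sample numbers are hypothetical, not anything Eyetools has published:

```python
# A minimal sketch of how raw fixation data becomes a heatmap.
# All names and numbers here are hypothetical illustrations.
from collections import Counter

def gaze_heatmap(fixations, page_w, page_h, cell=100):
    """Bin (x, y, duration_ms) fixation samples into a coarse grid.

    Returns a dict mapping (col, row) grid cells to total dwell time,
    so the 'hottest' cells are where users looked the longest.
    """
    heat = Counter()
    for x, y, duration_ms in fixations:
        if 0 <= x < page_w and 0 <= y < page_h:
            heat[(x // cell, y // cell)] += duration_ms
    return dict(heat)

# Hypothetical session: three fixations on a headline, one on body text.
samples = [(120, 40, 300), (160, 55, 450), (140, 60, 250), (130, 400, 90)]
heat = gaze_heatmap(samples, page_w=1024, page_h=768)
hottest = max(heat, key=heat.get)  # the headline's cell, (1, 0)
```

Note what the output is: dwell time per region, and nothing else. The heatmap itself contains no information about *why* a cell is hot — interest, confusion, or sheer visual clutter all look identical — which is exactly the interpretation problem this article is about.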
What strikes me about the analyses is (a) they mostly seem like pretty good conclusions (although there are some overblown examples, IMHO, based on pretty skimpy data), and (b) the good conclusions seem to be the same conclusions that a good UI designer (one who understands the desired effect of the design) would come up with without the aid of any tools, just going by their design instincts.
In another example (right), an atrociously designed page is given a slightly but noticeably better graphic design treatment (more color differentiation, more varied typography, and a more compartmentalized layout), and performs markedly better on a follow-up eyetracking test. Is this supposed to be surprising?
Interpreting Research to Influence Design
In the case of eyetracking, the risk of reaching misleading conclusions that will, in turn, lead to bad design decisions would seem to outweigh the potential positive outcomes, especially in an organization lacking experienced designers to properly evaluate the results.
If, for example, an eyetracker tells you that people don’t spend time looking at your company’s logo, does that mean that you need a new logo, or does it mean that your logo is already deeply familiar to the user? If the eyetracker shows someone spent a lot of time looking at your “how it works” diagram, does that mean that the diagram was extremely interesting or that it was extremely perplexing?
In the wrong hands, this sort of data can easily be interpreted in entirely the wrong way. A little bit of design knowledge is, indeed, a dangerous thing.
Differentiating Between Design and Content
On the IxDA discussion list recently, Jared responded to an already-skeptical comment by Todd Warfel on the topic of eyetracking:
TODD: We have a client right now who’s undergoing a redesign. They can’t figure out for the life of them why their promos for signing up new customers aren’t working.
JARED: Maybe it’s because it’s something nobody wants? If so, no amount of eye tracking will help.
In this example from the Eyetools blog (right), it is noted that blog readers skim headlines but don’t read body text, and that this might be because of the writer’s inability to capture the reader’s imagination. I hate to keep picking on these guys, but really: is this conclusion (that people skim headlines but don’t read body text) something we need an eyetracking test to reveal, or is common sense enough to tell us this?
But It Can’t Hurt. Can It?
Again, it is possible that, like chicken soup, eyetracking can’t hurt a redesign process (although, as I’ve said, grossly misinterpreted results are clearly a risk). But unless you are working with graphic designers with no talent whatsoever (and I will admit that most web sites seem to suffer from this predicament), it’s hard to believe that the recommendations of an eyetracking study would be a meaningful influence on a design process, much less form the very basis for the redesign. The money spent on this sort of research would, in my opinion, be better spent on hiring a better UI designer. Hell, you should probably hire a better UI designer anyway, because only a good UI designer is really qualified to interpret the results of an eyetracking study in the first place.
But what if you do hire a great designer with a winning track record, and your bosses don’t believe in the designer’s recommendations? Even if your superstar UI designer can easily distinguish between a design that works and one that does not, many other people cannot make that distinction as easily. Some people’s design instincts are not just dull but often dead-wrong. Oftentimes these people are in senior management positions.
So how does a designer or a design manager convince their boss that a good design decision is in fact a good design decision if the boss has no design instincts? What if the site won’t get redesigned at all unless the boss can be convinced that the current design stinks?
Hm, maybe that eyetracking study can come in handy after all…