Expert reviewing a heuristic evaluation

I recently changed jobs and now work as a User Experience Consultant at Interaction Design Studio. Since starting I’ve been doing a lot of work conducting heuristic evaluations (HEs).

Screenshot: Expert Review

Just to recap, a HE is an inspection method: a review, conducted by a usability expert, of how well a website complies with widely accepted (and widely adopted) design principles. These design principles, called heuristics, represent standard practice or, rather, best practice. For example, when a visitor submits form data the system should indicate that ‘something’ is happening – processing, validating, checking, submitting – anything that keeps the visitor informed. One ‘official’ and widely used set of heuristics is Jakob Nielsen‘s ‘Ten Usability Heuristics‘.
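To make that form-submission example concrete, here is a minimal sketch of what ‘keeping the visitor informed’ might look like in front-end code. The element ids and the /submit endpoint are hypothetical, purely for illustration; the point is simply that the visitor sees a status message while the submission is in flight and another when it finishes.

```typescript
// Minimal sketch of the 'visibility of system status' heuristic:
// while form data is being submitted, the visitor is told what is happening.
// The element ids (#contact-form, #form-status) and the /submit endpoint are hypothetical.

const form = document.querySelector<HTMLFormElement>("#contact-form")!;
const status = document.querySelector<HTMLParagraphElement>("#form-status")!;

form.addEventListener("submit", async (event) => {
  event.preventDefault();

  // Tell the visitor that something is happening.
  status.textContent = "Submitting your details…";

  try {
    const response = await fetch("/submit", {
      method: "POST",
      body: new FormData(form),
    });
    // ...and tell them how it ended.
    status.textContent = response.ok
      ? "Thanks – your details have been received."
      : "Sorry, something went wrong. Please try again.";
  } catch {
    status.textContent = "Sorry, we couldn't reach the server. Please try again.";
  }
});
```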

For a practitioner in the field of user experience, conducting HEs is an essential skill, and I dare say one that should be mastered. Having the ability to pick apart a website and analyse its strengths and weaknesses has many benefits, not least its low cost and the speed with which it can be conducted. Often the findings provide insights which allow website owners to fix the quick and easy issues – the low-hanging fruit. In most instances a HE serves not only to highlight potential flaws and usability failings, but also to suggest or recommend ways to fix or correct them.

I’ve been reviewing and reading HEs conducted by other practitioners, and I find it quite interesting to read their assessments. Often they spot issues which I may have missed, or articulate the problems differently. Reviewing HEs is also a good way of checking work, making sure there are no errors, and of course it acts as a second pair of eyes, strengthening the assessment process.

But are we writing these reviews with the end-user in mind? Are we explaining the technical terms we use? Quite often the reviews I write are written for business managers, website owners and marketers, not for user experience or usability professionals. So should we place more emphasis, in our writing, on our clients? Perhaps we should be writing both a technical and a normalised version? Or should we document our findings and provide explanations for the technical terms which seem impossible to omit?

My view is that, prior to writing a HE, you should be clear about who its recipient is. All HEs or expert reviews should be written in normalised language and, where technical terms are unavoidable, they should be explained.

What do you think?


5 thoughts on “Expert reviewing a heuristic evaluation”

  1. Yes, I agree. We write different types of expert reviews for different clients. We have one which is aimed at people in general marketing roles and points out high and low priority UX issues, as well as positives, without using any jargon. For people with a more technical mindset/job role, the deliverable needs to be different.

    • Thanks for sharing your experiences. It would be interesting to know whether you ever get feedback about the review itself (rather than its findings) from the recipients?

  2. One of the things that was continuously drummed into me during my management consultancy days was that you write for your reader. (In fact, I think there was actually a course called ‘Write for your reader’.) When you’re in a client-serving business, it’s simply good client service to tailor your writing according to the readership of your reports. It’s actually just good writing.

    That’s easier said than done, of course. Rewriting the dross served up by people who are excellent at what they do but useless at communicating it is what keeps copywriters in work.

  2. The last time I had to write a “usability assessment summary” (heuristic review plus comments from a rough usability testing session), my manager asked me to re-write the first draft and “dumb it down” a little, as it was “too technical” and “too detailed”. I got it right after a few iterations. The document was supposed to be distributed to both the technical people responsible for the website and non-technical managers, and writing two different versions wasn’t an option. I managed to satisfy both groups by moving all the technical material to an appendix.

    Or at least I like to think I satisfied them, as I didn’t get any real feedback apart from my manager’s initial comments. They did comment on the findings, but no-one mentioned the structure or the language I used.
