
 Heuristic Review

Description of method:

A small number of trained evaluators (typically 3 to 5) separately inspect a technical document by applying a set of "heuristics" – broad guidelines that are generally relevant to the documentation. They then combine their results and rank the importance of each problem to prioritize fixes. 

The experts read through the documentation looking for problems such as language errors, the amount of recall required of the user at each step in a process, and how the system provides feedback to the user. In particular, they analyze issues such as clarity, consistency, sequencing, missing or unnecessary information, and part numbers and values. 

The heuristics, or shortcuts, they use are a set of recognized, common rules or principles that have come to be regarded as "best practices".  Once the problems are discovered, the experts make recommendations for resolving them.

Heuristic Review Checklist for technical documentation:

1.  Speak the user’s language with words, phrases and concepts familiar to the user.  The technical depth of your writing should be compatible with your user’s background.

2.  Information should be stated directly.  Technical information should be clear, concise and to-the-point.  Writing should be task relevant and communicate the information directly – no fluff. 

3.  Use command form syntax.  Key components of the syntax are the action verb, i.e. the command verb, and the object of the action verb. 

4.  Be concise – use the minimal number of words.  If more than 25 words are required for one sentence, something is wrong.  A standardized vocabulary of 100 command verbs should be completely satisfactory; even for complex sets of maintenance tasks, 20 verbs or fewer will cover 80% of the steps.

5.  Be consistent – strive for consistency in use of numbers, units of measure, punctuation, equations, grammar, symbols, capitalization, technical terms, and abbreviations.  Users shouldn’t have to wonder whether two different terms mean the same thing. 

6.  Eliminate calculations and estimations.

7.  Avoid big words – use short, simple words.  Big important-sounding words may frustrate the reader rather than communicate effectively. 

For example:

    Big (long) words             Simple words
    utilize                      use
    terminate                    end
    approximately                about
8.  Describe the complete task.   Avoid requiring the reader to consult other documents, or other sections within the same document, in order to complete the task. 

9.  Write to accommodate both experienced and novice maintenance technicians.  Provide high level information first to allow the expert technician to accomplish the task quickly, followed by adequate support information for the novice technician.

10. Break up unnecessarily complex tasks – long, unbroken blocks of text are stumbling blocks to readers.  Breaking up your writing into short sections makes it easier to read.  In the same way, short sentences are easier to grasp than long ones.  A good guide is to write sentences that can be spoken aloud without losing your breath.  (Be sure not to take a deep breath before doing this test!)

11. Prefer the specific to the general – be specific whenever possible.  Technical readers are interested in detailed information – facts, figures, conclusions, recommendations.  For example:

    General                      Specific
    A tall spray dryer           A 40-foot tall spray dryer
    High performance             95% efficiency
    Structural degradation       A leaky roof
12. Provide feedback information regarding results and outcomes.  For a complex task evaluation, provide additional information to allow for detection, diagnosis and recovery of potential errors.  (Nielsen, 1994)

13. Use active voice – action is expressed directly.  This will help make your writing more direct and vigorous, and your sentences more concise.  For example:

    Passive: Control of the bearing-oil supply is provided by the shutoff valves.
    Active:  Shutoff valves control the bearing-oil supply.

    Passive: Leaking of the seals is prevented by the use of O-rings.
    Active:  O-rings keep the seals from leaking.

    Passive: Fuel-cost savings were realized through the installation of thermal insulation.
    Active:  The installation of thermal insulation cut fuel costs.

14.  Provide information to let the maintenance technician know where they are within the task.

15.  Avoid CAPITALS or italics as they slow reading and reduce comprehension.  The use of upper case font (all capitals) reduces reading speed by 14% (Tinker, 1963).

16.  Use visuals to reinforce your text.  In fact, pictures often communicate better than words; we remember 10% of what we read, but 30% of what we see. 

    Type of Visual               This shows
    Photograph                   What something looks like
    Exploded diagram             How it is put together
    Schematic diagram            How it works or is organized
    Table                        A body of related data
    Mass and energy balances     What goes in and what comes out
Location and identification information should be presented in a graphic format.  What, how, sequence and tolerance information should be text.

Text and the visual should be placed together, i.e. on the same or facing pages.  If this is not possible, place graphics at the end of the procedure; however, this will likely lessen usability.  Explanatory captions that are consistent with the text can mitigate some of this problem; still, the more effort required to retrieve information, the less it will be used.

 (Adapted from Bly & Blake, 2000; Nielsen, 1994; Inaba, 1989)
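Several of the checklist items above lend themselves to an automated first pass. As one illustration, the 25-word sentence limit in item 4 can be screened mechanically; this is a minimal sketch, with a deliberately naive sentence-splitting regex and a hypothetical sample passage:

```python
import re

MAX_WORDS = 25  # guideline from checklist item 4

def long_sentences(text, limit=MAX_WORDS):
    """Return (word_count, sentence) pairs for sentences over the limit.

    Naive splitting on ., !, ? -- adequate for a rough first pass, but it
    will mis-split around abbreviations like "e.g." or dotted part numbers.
    """
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    flagged = []
    for s in sentences:
        words = s.split()
        if len(words) > limit:
            flagged.append((len(words), s))
    return flagged

# Hypothetical sample text, not from any real procedure.
doc = ("Remove the four retaining bolts. "
       "Before you attempt to remove the cover, verify that the hydraulic "
       "pressure has been released, that the system has been tagged out, "
       "that the work area is clear, and that a second technician is "
       "available to support the weight of the cover during removal.")

for count, sentence in long_sentences(doc):
    print(count, sentence[:40] + "...")
```

A real screen would also need to handle list fragments and command-form steps, so the output should be treated as candidates for human review, not verdicts.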

Focused Statement Checklist for technical documentation review: 

Another expert review technique to identify errors in technical documentation is to ask focused questions while reviewing the documentation.

Statements about text:

1.  Language and jargon - Identify unfamiliar words, or words that are used incorrectly.

2.  Sentence and paragraph structure - Identify sentences/paragraphs that are unnecessarily complex.

3.  Comprehension - Provide examples of text that is misunderstood on the first reading.

4.  Organization - Identify where there are too many/too few headings or an overly complex organizational structure.

5.  Sequencing - Make sure that the steps follow a logical order that is easily understood.

6.  Access - Identify any information you couldn’t find easily in the table of contents, index, or other aids.

7.  Correctness - Identify any part numbers, values, tolerances that are incorrect.

8.  Completeness - Identify any steps that may be missing or need further clarification.

9.  Consistency - Identify any language or numeric values which are inconsistent.

10. Feedback - Identify whether appropriate checks let the maintenance technician know if the task was performed correctly.

11. Separation - Identify tasks that need to be divided into more steps.

Statements about illustrations:

1)  Quality -   Identify any illustrations that were hard to understand because of the poor quality of reproduction (blurry, labeling too small, etc.).

2)  Comprehension - Identify any illustrations that are too complex or confusing.

3)  Position - Make sure that the illustration represents the maintenance technician’s view and all component parts are represented in the correct scale and position.

4)  Correctness - Identify any illustration that is misleading or incorrect, or that contradicts the text.

5)  Completeness - Identify any text/steps that would be enhanced by an illustration.

 Statements about format:

1)  Positioning - Indicate where a table or figure referred to in the text was hard to find, or where the position of an item was confusing, seemed irrelevant, or interfered with your ability to read or scan the text.

2)  Structure - Indicate any layout techniques that interfered with reading comprehension.

3)  Readability - Indicate where text is too small or tightly spaced or where the visual style made text hard to read.

 (Adapted from Hart, G. 1997)

Development Lifecycle Stage:  This type of evaluation is most beneficial early in document development.

Type of Evaluators:  Evaluators should be experienced technicians, technical publication writers, engineers, and technical support personnel. Depending upon the complexity of the task, an illustrator might be included in the review process.

Evaluator Skills required to Use the Method:  Evaluators should have knowledge of maintenance procedures, technical writing guidelines, and/or usability training. The critique would normally be based on expertise, psychological principles, and a set of previously-defined guidelines for technical writing.

Number of Evaluators Required:  Heuristic evaluation gains in power when there are several usability experts working independently.  It has been found that 3 to 5 reviewers are optimal.

Advantage(s) of Method: 

  • The key advantage of this method is that it can identify problems very early in the development of the documentation. 
  • Problems or concerns predicted by a heuristic analysis should be candidates for usability testing with the technician.  Using a heuristic evaluation to identify problem areas in the documentation provides a focus for a later user performance evaluation.
  • Heuristic evaluations can produce high-quality results in a limited time, usually two to three weeks including a report of the findings and recommendations, because this method doesn't involve detailed scripting or time-consuming participant recruiting.
  • Heuristic evaluation finds many specific, local problems; however, its advantage is much smaller with the most severe problems (Jeffries et al., 1991).

Disadvantage(s) of Method: 

  • Experts using heuristic evaluation found 80% of the minor annoyances that users might experience, but only 29% of the problems that were likely to cause task failure (the most severe problems) (Desurvire, 1994).
  • Expert evaluation of the documentation produces results that are not actual "primary" user data.  Real users often have problems we don't expect and don't have problems where experts might expect them; therefore, it doesn't necessarily indicate which problems users will encounter most frequently.
  • Changing many of the problems discovered through heuristic evaluation does not necessarily improve the usability of the documentation.
  • Experts who combine maintenance, technical writing, engineering, and usability skills may be difficult to find; therefore, it will be necessary to assemble a mix of these skills to thoroughly evaluate the document.

Level (or amount) of User and Evaluator Interaction:  N/A

Data Recording Method(s):  The expert may use a table with the heuristics or focus statements listed to serve as a checklist. 

Total Testing Time Required:  The time required for each task reviewed would be dependent upon the complexity of the task; however, most would require time units measured in hours.

Typical Output from Test: Mainly objective data is collected, but it may include subjective comments that are useful when considering further testing.

How to Run the Test Step-by-Step:

1)     Gather a group of 3 to 5 experts.

2)     Use people who give you good feedback – critical, but objective.

3)     Find experts who have knowledge on the subject to be evaluated.

4)     Provide them with the maintenance technician profile and environment.

5)     Determine if they need job aids or additional references so they can evaluate the total package.

6)     Experts look over the task individually.  They should review the document at least twice considering all four following aspects of the task: technical (part numbers, tolerances, materials, tools); language (typos, grammar, clarity); procedural (sequencing, missing or unnecessary steps); graphics (wrong figure, incorrect drawing, inconsistency with text).

7)     You can provide the evaluators with a form on which to record feedback.

8)     Experts should provide individual written feedback.  A written report is easiest to digest and catalog but delays turnaround time.  In addition, you may want to convene a group meeting with all the experts to collect unstructured comments, get more details and context, and discover problems that individual experts missed.
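Merging the individual written feedback from step 8 into one prioritized problem list can be sketched in code. The expert names, record layout, and findings below are hypothetical, and the 0-4 severity scale is an assumption borrowed from Nielsen-style severity ratings rather than anything prescribed by this method:

```python
from collections import defaultdict

# Hypothetical records: each expert logs (location, heuristic, severity),
# where severity runs 0 = not a problem ... 4 = catastrophic (Nielsen-style).
expert_findings = {
    "expert_A": [("step 3", "Consistency", 2), ("fig 2", "Correctness", 4)],
    "expert_B": [("step 3", "Consistency", 3), ("step 7", "Completeness", 1)],
    "expert_C": [("fig 2", "Correctness", 4)],
}

def merge_and_rank(findings):
    """Merge duplicate reports of the same problem, then rank by mean
    severity, breaking ties by how many evaluators reported the problem."""
    merged = defaultdict(list)
    for reviewer, problems in findings.items():
        for location, heuristic, severity in problems:
            merged[(location, heuristic)].append(severity)
    ranked = sorted(
        merged.items(),
        key=lambda kv: (sum(kv[1]) / len(kv[1]), len(kv[1])),
        reverse=True,
    )
    return [(loc, heur, round(sum(sev) / len(sev), 1), len(sev))
            for (loc, heur), sev in ranked]

for loc, heur, mean_sev, n_reports in merge_and_rank(expert_findings):
    print(f"{loc:8s} {heur:12s} severity {mean_sev} ({n_reports} reports)")
```

Problems reported independently by several evaluators rise to the top, which matches the method's emphasis on combining results before prioritizing fixes.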

Related Tests:  Also referred to as Usability Audit, Usability Inspection, Expert Review or Guidelines Review.

Required Testing Materials:

1)     A copy of the proposed documentation. 

2)     A set of instructions for each reviewer.

3)     Supporting materials such as the engineering blueprints.

4)     A heuristics or focused statements list as a reviewer aid.

5)     Space for each reviewer to individually evaluate the procedure documentation.

Cost to Conduct Test: The cost of a heuristic review of technical documentation is relatively low.  It can be estimated by taking the hourly salary of each expert times the number of hours required for review, including a group meeting if required.  Also include your own time for preparing the review and recording the results.
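The cost estimate described above is simple arithmetic; here is a minimal sketch in which every rate and hour figure is purely illustrative:

```python
def review_cost(hourly_rates, review_hours, meeting_hours=0.0,
                prep_hours=0.0, your_rate=0.0):
    """Estimate heuristic-review cost as described above:
    each expert's rate x (individual review + group meeting),
    plus your own preparation and write-up time."""
    expert_cost = sum(rate * (review_hours + meeting_hours)
                      for rate in hourly_rates)
    return expert_cost + your_rate * prep_hours

# Illustrative figures only: four experts at $60-$90/hr, 6 hours of review
# each, a 2-hour group meeting, and 8 hours of preparation/reporting at $70/hr.
cost = review_cost([60, 75, 80, 90], review_hours=6, meeting_hours=2,
                   prep_hours=8, your_rate=70)
print(f"${cost:,.2f}")
```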

Type of Documentation that Test Can Be Done On: Testing is most useful on complex procedures, but should be followed up with user performance evaluation after corrections are made.  Some questionable areas found in the heuristic review may be targeted for exploration during a user evaluation.

Goals of Testing: The goal of a heuristic review is to find errors in the text and illustrations at an early stage in the process of writing the procedural task.

Subjective or Objective Test: Heuristic Evaluation is subjective from the frame of the reviewer’s reference; however, using a common Heuristic or Focused Statements list aids objectivity of the evaluation.

Ease of Learning to Conduct the Test: Conducting a Heuristic Review is relatively easy if there are experts in the necessary fields available to evaluate the procedure.  It is advisable to have a Usability Professional as one of the experts to evaluate the documentation from a “user-centered” perspective.  Since the results do not require formal analyses, research experience is not required. 

Turnaround Time: The time to produce deliverable results from this method is short if there is agreement between experts as to the proposed changes. Compilation and implementation of the findings are necessary to progress to the next stage of development, so it is necessary to come to consensus on changes quickly.

Focus of Evaluation: This type of evaluation has a limited or narrow focus in that only the documentation is available for review.

Related Statistical Analysis: Results of the Heuristic Review are qualitative due to the low number of evaluators. 


Human Factors Laboratory, National Institute for Aviation Research at Wichita State University. Research funded by the Federal Aviation Administration.  All rights reserved.
Revised: 11/05/04