Enhancing the explanatory power of usability heuristics

@inproceedings{Nielsen1994EnhancingTE,
  title={Enhancing the explanatory power of usability heuristics},
  author={Jakob Nielsen},
  booktitle={International Conference on Human Factors in Computing Systems},
  year={1994}
}
  • J. Nielsen
  • Published 24 April 1994 in
    International Conference on Human Factors in Computing Systems
  • Computer Science
Several published sets of usability heuristics were compared with a database of existing usability problems drawn from a variety of projects in order to determine which heuristics best explain actual usability problems. Based on a factor analysis of the explanations, as well as an analysis of the heuristics providing the broadest explanatory coverage of the problems, a new set of nine heuristics was derived: visibility of system status, match between system and the real world, user control and… 
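The coverage analysis described in the abstract can be illustrated with a greedy set-cover sketch: repeatedly pick the heuristic that explains the most still-unexplained problems. The heuristic names below echo the paper, but the problem-to-heuristic mapping is invented purely for illustration; the paper's actual derivation also used factor analysis.

```python
# Hedged sketch: greedy selection of heuristics by explanatory coverage.
# The toy problem database below is hypothetical, not from the paper.

def greedy_coverage(explains, problems):
    """Pick heuristics in order of how many still-unexplained
    usability problems each one explains."""
    chosen, uncovered = [], set(problems)
    while uncovered:
        best = max(explains, key=lambda h: len(explains[h] & uncovered))
        gain = explains[best] & uncovered
        if not gain:
            break  # remaining problems are explained by no heuristic
        chosen.append(best)
        uncovered -= gain
    return chosen, uncovered

# Toy database: which problem ids each heuristic explains (invented).
explains = {
    "visibility of system status": {1, 2, 3},
    "match between system and the real world": {3, 4},
    "user control and freedom": {5},
}
order, left = greedy_coverage(explains, {1, 2, 3, 4, 5})
```

Under this toy data the heuristic explaining three problems is chosen first, and the remaining two heuristics are added to cover the rest.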

Tables from this paper

A Revised Set of Usability Heuristics for the Evaluation of Interactive Systems

The underlying goal is to include the most important ergonomic criteria and usability heuristics into a clear hierarchical organization, which helps evaluators to better explain and developers to better understand the usability problems.

A Heuristic Evaluation Experiment to Validate the New Set of Usability Heuristics

This work considers that the heuristics used until now, basically Nielsen's, do not cover all usability features of interactive systems, and it presents two real experiences to justify the use of the new set.

Heuristics for evaluating the usability of CAA applications

Using a corpus of usability problems within CAA, this paper reports on the development of domain specific heuristics and severity ratings for evaluating the usability of CAA applications.

Experimental Evaluation of Usability Heuristics

An empirical analysis of how Nielsen's usability heuristics are perceived by novice evaluators is presented, finding that all of the examined constructs influence the intention to use the heuristics in future usability evaluations, except perceived clarity.

Extending the Usability of Heuristics for Design and Evaluation: Lead, Follow, and Get Out of the Way

User-centered design practitioners have often relied on discount usability engineering methods using heuristics. Top 10 lists of design and evaluation heuristics have proliferated during the 1990s…

Evidence Based Design of Heuristics for Computer Assisted Assessment

An evidence based design approach to the development of domain specific heuristics is proposed and how this method was applied within the context of computer assisted assessment is shown.

Usability Pattern Identification through Heuristic

This article applies a usability inspection method to generate data that can be used as input for the systematic creation of usability patterns and illustrates the approach by means of a case study in the field of “linguistic annotation tools”.

Usability Evaluation

The International Organization for Standardization (ISO) defines usability of a product as “the extent to which the product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use” (ISO 9241-11).
...

References

SHOWING 1-10 OF 15 REFERENCES

An amalgamated model of software usability

  • Richard Holcomb, A. L. Tharp
  • Computer Science
    Proceedings of the Thirteenth Annual International Computer Software & Applications Conference
  • 1989
A basic model of software usability for human-computer interaction is developed to allow software designers to make quantitative decisions about which usability attributes should be included in a design, and to provide a usability metric by which software designs can be consistently rated and compared.

Heuristic evaluation of user interfaces

Four experiments showed that individual evaluators were mostly quite bad at doing heuristic evaluations and that they only found between 20 and 51% of the usability problems in the interfaces they evaluated.
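The 20–51% single-evaluator figure is the usual motivation for running heuristic evaluation with several evaluators and pooling their findings. A minimal sketch of the standard pooled-coverage estimate, assuming evaluators find problems independently (an idealization for illustration, not a result stated in this snippet):

```python
# Sketch: expected fraction of usability problems found by n evaluators,
# each finding fraction p on their own. Assumes independence between
# evaluators, which is an idealization.
def aggregate_coverage(p: float, n: int) -> float:
    return 1 - (1 - p) ** n
```

With p = 0.30 (inside the reported 20–51% range) and five evaluators, the expected pooled coverage is roughly 83% under this assumption.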

What users say about software usability

This model of software usability for human‐computer interaction has two primary goals: to allow software designers to make quantitative decisions about which usability attributes should be included in a design, and to provide a usability metric by which software designs can be consistently rated and compared.

User interface evaluation in the real world: a comparison of four techniques

A user interface for a software product was evaluated prior to its release by four groups, each applying a different technique: heuristic evaluation, software guidelines, cognitive walkthroughs, and usability testing.

Heuristic evaluation

This chapter discusses heuristic evaluation: inspection of a prototype or finished system to identify all changes necessary to optimize human performance and preference.

Teaching user interface design based on usability engineering

The use of mandatory laboratory assignments is discussed in more detail and results from studies of students' motivation for taking the usability course and their evaluation of different aspects of the course are presented.

Getting around the task-artifact cycle: how to make claims and design by scenario

The approach combines current HCI development practices with methods and concepts that support a shift toward broad, explicit design rationale, reifying where one is in a design process and why, and guiding reasoning about where to go next.

Designing the STAR User Interface

In April 1981, Xerox announced the 8010 Star Information System, a new personal computer designed for offices: a multifunction system combining document creation, data processing, and electronic filing, mailing, and printing.

Improving a human-computer dialogue

A survey of seventy-seven highly motivated industrial designers and programmers indicates that the identification of specific, potential problems in a human-computer dialogue design is difficult.