Middlesex University

School of Computing Science


Research in Digital Libraries (RIDL)


Usability Evaluation Techniques
for the Design of Interactive Digital Libraries

Project overview

This study is investigating the usability issues of digital libraries and the potential of known usability evaluation techniques to identify those issues. The relationship between the evaluation techniques and the specific usability issues provides the theoretical foundation for specifying a suite of evaluation techniques that designers can use to improve the usability of digital libraries.

Following the successful pilot study of the BT Digital Library, this study continues to use that environment to develop an understanding of users' and developers' needs. The strategies of expert intermediaries, who conduct a search through multiple iterations, only served to highlight the difficulties faced by less skilled users, who lacked strategies for taking up the search refinement tools on offer. Significant modifications are being made to the usability evaluation methods, particularly to Claims Analysis, to incorporate an understanding of the information seeking task. The developers, meanwhile, find it difficult to know how best to support users and to predict the effect of changes they make. The context within which the developers work is providing critical insight into their needs in relation to evaluation tools. Validating these methods with our collaborative partners at the New Zealand Digital Library, and subsequently the California Digital Library, should help to ensure improved usability of both the interface and the evaluation methods.

Project report

Expert evaluation methods - This research programme has investigated the use of established 'expert evaluation' methods such as heuristics, cognitive walkthrough and claims analysis to support the design of the digital library interface. The usability of any interactive system is greatly enhanced by an iterative cycle of design and evaluation. Analytic methods do not replace the need for live user testing, but can help supplement the design process:

  • relatively quick and easy to apply
  • applied early in the design process when it is much easier to make significant changes to the design
  • support user involvement in collaborative design projects to engage in a systematic review of early proposals
  • ensure basic human-computer interaction guidelines are adhered to
  • identify 'intrinsic factors' affecting usability - such as support for the users' goals, guessability, and error recovery

However, while a simple heuristic evaluation of the interface can capture some usability issues relating to general principles such as speaking the users' language, providing feedback and maintaining consistency, the complexity of the task and of the domain also has an important effect. Cognitive walkthrough and claims analysis both make use of the user's context and goals as a measure of progress towards, and successful completion of, an activity.
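By way of illustration only, the minimal Python sketch below shows how a walkthrough-style review records the reviewer's judgement for each step of a goal-directed task. The questions follow the standard cognitive walkthrough question set; the task steps and function names are invented for this example and are not taken from the project.

```python
# Minimal sketch of recording a cognitive walkthrough (illustrative only).
# For each step of a task, the reviewer answers the standard walkthrough
# questions with the user's goal in mind.

WALKTHROUGH_QUESTIONS = [
    "Will the user try to achieve the right effect?",
    "Will the user notice that the correct action is available?",
    "Will the user associate the correct action with the effect they want?",
    "If the correct action is performed, will the user see that progress is being made?",
]

# Hypothetical task steps for a digital library search session.
task_steps = [
    "Enter keywords in the simple search box",
    "Open the keyword browser to inspect descriptor terms",
    "Add a descriptor term to narrow the result set",
]

def walkthrough(steps, questions=WALKTHROUGH_QUESTIONS):
    """Return a blank answer sheet: one set of questions per task step."""
    return {step: {q: None for q in questions} for step in steps}

if __name__ == "__main__":
    sheet = walkthrough(task_steps)
    for step, answers in sheet.items():
        print(step)
        for question in answers:
            print("   -", question)
```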


Usability issues - Human-computer interaction within a digital library is a particularly demanding context, with novice users facing difficulties on at least three fronts:

  • lack of familiarity with the functionality of a particular system
  • lack of skills and general strategies to refine their information seeking requirements
  • lack of familiarity with the particular database or resource being used.

In addition, the information seeking task itself is difficult, especially in the early stages of researching a new topic when the user has difficulty expressing what the problem is and may change direction or focus as they learn more about the topic. Thus a fourth problem is lack of domain knowledge.


Case study - Studying the use of the British Telecom (BT) Digital Library by experts and novices confirmed these problems, which are variously identified in the information seeking literature. Initial interviews with users indicated a need for novel information in relation to new, short-cycle projects. These users were experts within their own domain but novices with respect to the requirements of a specific project. As intermittent users they had little opportunity to build up their knowledge of the library and, due to changes and upgrades to the interface, remained permanent novices. They used few of the features, relying on simple keyword and phrase-based searching and perhaps registering for one of the browsable information monitoring topics.


The experts, in an in-depth knowledge elicitation study, revealed sophisticated strategies for exploring the content of the resources and specifying the search. They made use of the descriptor terms provided by the abstract and index services (Inspec and ABI), analysed by a feature they had designed called the keyword browser. This enabled them to carry out multiple cycles of query reformulation: first exploring the vocabulary, then expanding the search, and finally narrowing the results. Their success suggests a need for more support for this complex information seeking activity by linking skills and strategies to the functionality provided.
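This explore-expand-narrow strategy can be pictured as a simple loop. The following is a minimal sketch in Python, assuming a toy in-memory catalogue and a hypothetical search() function; it is not code from the BT Digital Library, only an illustration of the cycle described above.

```python
# Illustrative sketch of the experts' strategy: repeated cycles of query
# reformulation that explore the vocabulary, expand the search, then narrow
# the results. The catalogue, record fields and thresholds are hypothetical.

CATALOGUE = [
    {"title": "Speech coding over packet networks", "descriptors": {"speech coding", "packet networks"}},
    {"title": "Voice over IP quality of service",   "descriptors": {"voice over ip", "quality of service"}},
    {"title": "Queueing models for routers",        "descriptors": {"queueing theory", "packet networks"}},
]

def search(terms):
    """Return records whose descriptors mention any of the query terms."""
    terms = {t.lower() for t in terms}
    return [r for r in CATALOGUE
            if any(t in d for d in r["descriptors"] for t in terms)]

def expert_style_search(seed_terms, cycles=2):
    terms = set(seed_terms)
    results = search(terms)
    for _ in range(cycles):
        # Explore the vocabulary: harvest descriptor terms from the current
        # results (cf. the Inspec/ABI descriptors shown by the keyword browser).
        vocabulary = set().union(*(r["descriptors"] for r in results)) if results else set()
        # Expand the query with the newly discovered terms.
        terms |= vocabulary
        expanded = search(terms)
        # Narrow: keep only records matching more than one query term.
        results = [r for r in expanded
                   if sum(t in d for d in r["descriptors"] for t in terms) > 1]
    return results

if __name__ == "__main__":
    for record in expert_style_search({"packet"}):
        print(record["title"])
```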

Claims analysis has been investigated in depth throughout this project because its use of scenarios captures the context of the activity, including a number of models of the information seeking process. The usability of claims analysis itself has been explored with the developers of the BT Digital Library. In a series of investigations of the developers' design rationale and evaluative strategies, a very simple framework was used to review an on-going design. Using a simple scenario, and walking through a sequence of activities, the review team considered how well the user is supported in planning, acting and understanding the feedback. This created an opportunity to reflect on the design, and for the developers to discuss the positive elements and the possible negative consequences that needed further refinement.
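As an illustration only (the record structure and field names below are our own, not a prescribed claims analysis notation), a claim raised during such a review can be captured as a design feature together with its hypothesised positive and negative consequences for the user at a given stage of the walkthrough:

```python
# Illustrative sketch of a claims record for a scenario walkthrough.
# The field names (feature, stage, upsides, downsides) are hypothetical,
# chosen to mirror the planning / action / feedback questions described above.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Claim:
    feature: str                 # the design feature under review
    stage: str                   # "planning", "action" or "feedback"
    upsides: List[str] = field(default_factory=list)    # positive consequences for the user
    downsides: List[str] = field(default_factory=list)  # possible negative consequences

scenario = "A project manager new to speech coding looks for recent survey papers."

claims = [
    Claim(
        feature="Keyword browser listing Inspec/ABI descriptor terms",
        stage="planning",
        upsides=["helps the user discover the vocabulary of an unfamiliar domain"],
        downsides=["a long, flat list of terms may overwhelm a novice"],
    ),
    Claim(
        feature="Results page showing matched descriptors next to each hit",
        stage="feedback",
        upsides=["shows why a record was retrieved, supporting query refinement"],
        downsides=["extra detail may clutter the display for simple searches"],
    ),
]

for c in claims:
    print(f"[{c.stage}] {c.feature}: +{len(c.upsides)} / -{len(c.downsides)} consequences")
```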

Drawing on research into design processes that use use cases, scenarios and personas, the investigation has continued to identify simple strategies and templates for capturing the diversity of user needs and contexts identified through information seeking research. Research on design rationale and reflective design is being cross-cut with research on cognitive models, situated action and models of the information seeking process, in order to support the development of claims and the identification of positive and negative consequences for the user.

Prototype support tools and documentation are being validated with our collaborators at the New Zealand Digital Library project, and through a student project with MODA. A tutorial has been accepted for presentation at JCDL 2003.



Technical reports

  • TR1 Usability issues - working documents (not released)
  • TR2 Usability techniques - working documents (not released)
  • TR3 Exploring the unknown: a study of expert use of a digital library. Suzette Keith, Ann Blandford, Richard Butterworth, Bob Fields and Yin Leng Theng (2002)
  • TR4 An investigation into the application of Claims Analysis to evaluate usability of a digital library interface. Suzette Keith, Ann Blandford, Bob Fields and Yin Leng Theng (2002)
  • TR5 Tailoring Claims Analysis to the design and deployment of digital libraries: a case study. Suzette Keith, Ann Blandford and Bob Fields (2003)
  • TR6 Designing for Expert Information Finding Strategies. Bob Fields, Suzette Keith, Ann Blandford (2003)
  • TR7 Tutorial: Usability Evaluation of Digital Libraries (abstract and refs). JCDL 2003, May 27th 2003, Houston, Texas. Bob Fields, Suzette Keith and Ann Blandford (2003)



Publications

  • Keith S, Blandford A, Fields B and Theng YL (2002): An investigation into the application of Claims Analysis to evaluate usability of a digital library interface. In Blandford A & Buchanan G (Eds.) Proceedings of the Workshop on Usability of Digital Libraries at JCDL'02. Available from www.uclic.ucl.ac.uk/annb/DLUsability/JCDL02.html

People

The principal investigator is Dr Bob Fields.
The researcher is Suzette Keith.

The co-investigators are Dr Richard Butterworth from Middlesex University, Dr Ann Blandford from University College London and Dr Yin Leng Theng from Nanyang Technological University, Singapore.

Our other collaborators include British Telecom plc, Prof Patricia Wright (Cardiff University), the California Digital Library (Dr John Ober) and the New Zealand Digital Library (Prof Ian Witten).

Acknowledgements
This three-year research project commenced in January 2001 and is funded by EPSRC under the second Distributed Information Management (DIM) programme, Grant No. GR/N37858.


Useful References

Information seeking is the term used to describe the human side of information retrieval. It is a complex activity for which a number of models have been proposed. For those thinking of researching in this area, useful search phrases are information seeking, information retrieval and digital library. (librar* may capture both singular and plural; phrases may need to be enclosed in "", although it all depends on the search engine's preference settings… which is part of the problem!)

  1. Bates, M J (1989) The design of browsing and berrypicking techniques for the on-line interface. On-line Review 13 (5) 407-424
  2. Belkin, N J (1980) Anomalous states of knowledge as a basis for information retrieval. Canadian Journal of Information Science. 5. 133-134
  3. Borgman, C. (2000) From Gutenberg to the global information infrastructure. MIT Press.
  4. Covi L, Kling R (1996) Organizational dimensions of effective digital library use: closed rational and open natural systems model. J. American Society Information Science 47 (9) 672-689
  5. Ellis, D. & Haugan, M (1997) Modelling the information seeking patterns of engineers and research scientists in an industrial environment. J Documentation 53 (4) 384-403
  6. Ingwersen, P. (1996) Cognitive perspectives of information retrieval interaction: elements of a cognitive IR theory. J Documentation 52 (1) 3-50
  7. Kuhlthau, C.(1988) Longitudinal case studies of the information search process of users in libraries. Library and information science research 10 (3) 257-304
  8. Marchionini, G (1995) Information seeking in electronic environments. Cambridge University Press
  9. Nardi B, O'Day V L (1999) Information ecologies: Using technology with heart. MIT Press
  10. O'Day, V. L., & Jeffries, R. (1993). Orienteering in an Information Landscape: How Information Seekers Get From Here to There. In Proc. InterCHI '93, pp. 438-445.
  11. Sutcliffe, A. & Ennis, M. (1998) Towards a cognitive theory of information retrieval. Interacting with computers

Evaluation methods: Essential references for claims analysis are the works by Carroll, and by Carroll and Rosson. Cooper is interesting in the way he looks at scenarios within the design process. Nielsen and Mack provide an essential introduction to usability inspection methods.

  1. Carroll J (2000) Making use: scenario based design of human computer interaction. MIT Press
  2. Carroll J M, Rosson MB (1992) Getting around the task-artifact cycle: how to make claims and design by scenario. ACM transactions on information systems. Vol 10 No 2 April 1992 181-212
  3. Carroll J M (1999) Five reasons for scenario based design. Proceedings of the 32nd Hawaii International conference on system sciences
  4. Cooper A (1999) The inmates are running the asylum. SAMS Indiana
  5. Nielsen J, Mack R (1994) Usability Inspection Methods. John Wiley and Sons Inc
  6. Norman D A (1986) Cognitive engineering. In Norman D A and Draper S W (Eds) User centred system design. Erlbaum, Hillsdale, NJ, 31-62
  7. Rosson M B, Carroll J M (2002) Usability engineering. Scenario based development of human-computer interaction. Academic Press
  8. Sutcliffe A G, Carroll J M (1999) Designing claims for reuse in interactive systems design. Int J Human-computer studies 50 213-241


Email: s.keith@mdx.ac.uk
Interaction Design Centre
School of Computing Science
Middlesex University
Trent Park
London N14 4YZ