Type Book
Date 2013
Pages 351
Tags human-computer interaction, nonfiction

Human-Computer Interaction: An Empirical Research Perspective

Chapter 1: Historical Context

HCI is the subfield of human factors focused on interaction with computers. It emerged in the 1980s as computers became increasingly common both in the workplace and at home.

MacKenzie briefly discusses Vannevar Bush's famous "As We May Think" to show that the need for usable interfaces for navigating the information age was anticipated long ago.

A knee-controlled pointing device is described--interesting. I've heard of some more recent similar controls intended for users with special needs.

The MVC paradigm was invented during the creation of the Xerox Star.

Also discussed:

  • Ivan Sutherland's Sketchpad
  • the Mouse
  • SIGCHI
  • Card, Moran, and Newell's The Psychology of Human-Computer Interaction
  • the Apple Macintosh
  • some (interesting!) examples of HCI research around menu navigation

Chapter 2: The Human Factor

Newell's Time Scale of Human Action

Newell proposes a logarithmic delineation of human action, spanning actions on the order of hundreds of microseconds up to actions on the order of months.

Scale (sec)   Time Units   System          World (theory)
10^7          Months       --              Social Band
10^6          Weeks        --              Social Band
10^5          Days         --              Social Band
10^4          Hours        Task            Rational Band
10^3          10 min       Task            Rational Band
10^2          Minutes      Task            Rational Band
10^1          10 sec       Unit task       Cognitive Band
10^0          1 sec        Operations      Cognitive Band
10^-1         100 ms       Deliberate act  Cognitive Band
10^-2         10 ms        Neural circuit  Biological Band
10^-3         1 ms         Neuron          Biological Band
10^-4         100 μs       Organelle       Biological Band
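Since the band boundaries in the table above fall on powers of ten, a band lookup can be sketched directly from the log of the duration. This is my own illustration, not from the book; the cutoffs simply mirror the table:

```python
import math

# Sketch (not from the book): map a duration in seconds to Newell's
# band using the log10 boundaries from the table above.
def newell_band(seconds):
    exponent = math.floor(math.log10(seconds))
    if exponent >= 5:
        return "Social Band"
    elif exponent >= 2:
        return "Rational Band"
    elif exponent >= -1:
        return "Cognitive Band"
    else:
        return "Biological Band"
```

Durations that are not exact powers of ten (a day is about 10^4.9 seconds, for instance) land in whichever decade they floor into, so this is only as precise as the table itself.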

Research at the lower end of the scale is typically quantitative, while research at the upper end of the scale is typically qualitative.

Sensors

Vision (Sight)

Most people obtain about 80 percent of their information through sight.

The foveal image, the clear region suitable for reading or watching television, is about 1.1 degrees--roughly the width of the thumb at arm's length.

During vision, the eyes engage in two basic actions: fixations and saccades. During a fixation, the eye is stationary. Each fixation typically lasts at least 200 ms. During a saccade, the eye quickly changes position, usually taking 30-120 ms.

A sequence of fixations and saccades is called a scanpath.

Hearing (Audition)

A sound has (at least) four physical properties: intensity (loudness), frequency (pitch), timbre, and envelope.

Intensity is quantified as sound pressure level and measured in decibels; loudness is the subjective experience of intensity. Humans can hear sounds as quiet as 0-10 dB. Conversation is 50-70 dB, and sounds as loud as 120-140 dB cause pain.
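The decibel scale is logarithmic. As a sketch of the standard formula (not something the book walks through), sound pressure level is computed relative to the 20 μPa reference pressure:

```python
import math

# Standard SPL formula (not specific to the book): decibels relative
# to the 20 micropascal reference pressure, the nominal threshold of
# human hearing.
P_REF = 20e-6  # pascals

def spl_db(pressure_pa):
    return 20 * math.log10(pressure_pa / P_REF)
```

A tenfold increase in pressure adds 20 dB, which is why the range from a whisper to the pain threshold spans such an enormous ratio of physical pressures.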

Pitch is the subjective experience of frequency. Humans can perceive pitches ranging from about 20 Hz to 20 kHz, with the upper limit decreasing with age.

Timbre results from the harmonic structure of sound. For example, the timbre of a 400 Hz note depends on the relative amplitudes of the harmonics at 800 Hz, 1200 Hz, etc.

Envelope is the change in loudness of a note over time, from silent to audible to silent. The onset envelope or attack of a note is the note becoming audible. It is of substantial importance in the perception of sound.
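Timbre and envelope can both be seen in a simple additive-synthesis sketch. This is my own illustration, not the book's: the relative amplitudes of the harmonics give the timbre, and a linear attack ramp stands in for the onset envelope.

```python
import math

# Sketch (my illustration, not from the book): one sample of a 400 Hz
# note. Timbre comes from the relative amplitudes of the harmonics;
# a simple linear attack ramp models the onset envelope.
def note_sample(t, fundamental=400.0, harmonics=(1.0, 0.5, 0.25),
                attack=0.05):
    envelope = min(t / attack, 1.0) if attack > 0 else 1.0
    value = sum(amp * math.sin(2 * math.pi * fundamental * (k + 1) * t)
                for k, amp in enumerate(harmonics))
    return envelope * value
```

Changing the `harmonics` tuple changes the timbre without changing the pitch; changing `attack` changes how abruptly the note becomes audible.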

Touch (Tactition)

All interactions involving touch offer tactile feedback--not merely explicitly produced feedback, but also the simple feeling of the objects in the hand.

An example is given of a mouse that can produce haptic feedback, such as when passing over a window border.

Smell (Olfaction) and Taste (Gustation)

These are typically not used in HCI, though some examples of research with smell are given.

Responders

Limbs

Between 8 and 15 percent of people are left-handed. However, handedness is a continuum.

Research

The book does not give data on how many people fall where on this continuum, which seems like a substantial oversight.

Voice

Besides verbal input (i.e. speech recognition), there is also the option of non-verbal voice interaction (NVVI), in which properties such as pitch, volume, or timbre are measured and used as input. Examples are given of NVVI for volume control and of a 'vocal joystick'.

Eyes

Eye tracking to simulate a mouse is discussed, called look-select by analogy with point-select.

Eye-typing is mentioned, though the input style displayed is simply a visual keyboard that responds to fixations. It would have been more interesting to see something more novel, like Dasher.

The Brain

Perception

Psychophysics examines the relationship between human perception and physical phenomena. A common experimental goal is to determine the just noticeable difference (JND) in a stimulus: the smallest change in some parameter of the stimulus that is detectable.
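A classical result connected to the JND is Weber's law: the detectable change is roughly a constant fraction of the current stimulus intensity. The sketch below is standard psychophysics rather than something from the book, and the Weber fraction used is purely illustrative:

```python
# Sketch of Weber's law (standard psychophysics, not from the book):
# the just noticeable difference is roughly a constant fraction of
# the current stimulus intensity.
def jnd(intensity, weber_fraction=0.1):
    # weber_fraction = 0.1 is illustrative; real values vary by modality
    return weber_fraction * intensity

# Each successive just-detectable stimulus is a constant ratio above
# the previous one, so detectable steps form a geometric series.
def next_detectable(intensity, weber_fraction=0.1):
    return intensity + jnd(intensity, weber_fraction)
```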

A number of examples of visual and auditory illusions are given.

Additionally, the phantom limb phenomenon is mentioned as a kind of haptic illusion.

A table is given of the time required for the various stages of responding to a stimulus:

Operation                      Typical time (ms)
Sensory reception              1-38
Neural transmission to brain   2-100
Cognitive processing           70-300
Neural transmission to muscle  10-20
Muscle latency and activation  30-70
Total                          113-528
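The total row is just the sum of the per-stage minima and maxima, which is easy to sanity-check. A quick sketch using the figures from the table above:

```python
# Sanity check of the table above: the total response-time range is
# the sum of the per-stage minima and maxima (all times in ms).
stages = {
    "sensory reception": (1, 38),
    "neural transmission to brain": (2, 100),
    "cognitive processing": (70, 300),
    "neural transmission to muscle": (10, 20),
    "muscle latency and activation": (30, 70),
}
total_min = sum(lo for lo, hi in stages.values())
total_max = sum(hi for lo, hi in stages.values())
```

The sums come out to 113 and 528 ms, matching the table's total row.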

Cognition

Cognition is conscious intellectual activity, such as thinking, reasoning, and deciding.

Since cognition occurs entirely within the brain, it is difficult to measure.

Memory

Discusses The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information.

Human performance

Reaction time

Reaction times vary according to the kind of stimulus:

Stimulus   Reaction time
Auditory   150 ms
Visual     200 ms
Smell      300 ms
Pain       700 ms

Human error

A design that can lead to catastrophic outcomes purely on the basis of an operator's interaction error is a faulty design. (pp. 66--67)

Name                Role
I. Scott MacKenzie  Author
Morgan Kaufmann     Publisher