Type | Book |
---|---|
Date | 2013 |
Pages | 351 |
Tags | nonfiction, textbook, human-computer interaction |
HCI is the subfield of human factors focused on interaction with computers. It emerged in the 1980s as computers became increasingly common both in the workplace and at home.
MacKenzie briefly discusses Vannevar Bush's famous "As We May Think" as a way of showing that the need for usable interfaces for navigating the information age was anticipated quite a long time ago.
A knee-controlled pointing device is described, which is interesting. I've heard of some more recent, similar controls intended for users with special needs.
The MVC paradigm was invented during the creation of the Xerox Star.
Also discussed:
Newell proposes a logarithmic delineation of human action, spanning actions on the order of hundreds of microseconds up to actions on the order of months.
Scale (sec) | Time Units | System | World (theory) |
---|---|---|---|
10^7 | Months | | Social Band |
10^6 | Weeks | | Social Band |
10^5 | Days | | Social Band |
10^4 | Hours | Task | Rational Band |
10^3 | 10 min | Task | Rational Band |
10^2 | Minutes | Task | Rational Band |
10^1 | 10 sec | Unit task | Cognitive Band |
10^0 | 1 sec | Operations | Cognitive Band |
10^-1 | 100 ms | Deliberate act | Cognitive Band |
10^-2 | 10 ms | Neural circuit | Biological Band |
10^-3 | 1 ms | Neuron | Biological Band |
10^-4 | 100 μs | Organelle | Biological Band |
Research at the lower end of the scale is typically quantitative, while research at the upper end of the scale is typically qualitative.
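To make the scale concrete, here is a small sketch of mine (not from the book) that maps a duration in seconds onto one of Newell's bands, treating each band as the span of decades shown in the table:

```python
import math

# Each band spans the decades listed in Newell's table (exponents of 10, in seconds).
BANDS = [
    (-4, -2, "Biological Band"),  # 100 microseconds up to ~10 ms
    (-1, 1, "Cognitive Band"),    # 100 ms up to ~10 s
    (2, 4, "Rational Band"),      # minutes up to hours
    (5, 7, "Social Band"),        # days up to months
]

def newell_band(duration_s: float) -> str:
    """Return the band whose range of decades contains the given duration."""
    exponent = math.log10(duration_s)
    for low, high, name in BANDS:
        if low <= exponent < high + 1:  # each table row covers one decade
            return name
    return "outside the modeled range"

for t in (0.0005, 0.2, 30, 600, 86400 * 30):
    print(f"{t:>10g} s -> {newell_band(t)}")
```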
Most people obtain about 80 percent of their information through sight.
The foveal image, the sharp central region suitable for reading or watching television, covers about 1.1 degrees of visual angle, roughly the width of a thumb at arm's length.
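As a rough sanity check on that rule of thumb, here is the standard visual-angle formula with assumed values for thumb width and arm length (my numbers, not the book's); the result lands in the same ballpark of one to two degrees:

```python
import math

def visual_angle_deg(size_cm: float, distance_cm: float) -> float:
    """Visual angle subtended by an object of a given size at a given distance."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

# Assumed, illustrative values: thumb width ~2 cm, arm's length ~70 cm.
print(f"{visual_angle_deg(2.0, 70.0):.1f} degrees")  # about 1.6 degrees
```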
During vision, the eyes engage in two basic actions: fixations and saccades. During a fixation, the eye is stationary. Each fixation typically lasts at least 200 ms. During a saccade, the eye quickly changes position, usually taking 30-120 ms.
A sequence of fixations and saccades is called a scanpath.
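The book doesn't give an algorithm for recovering these from raw gaze data, but a common approach is a velocity threshold (I-VT): successive samples moving faster than some angular speed belong to a saccade, otherwise to a fixation. A minimal sketch, with the threshold value being my assumption:

```python
import math
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # horizontal gaze position, degrees of visual angle
    y: float  # vertical gaze position, degrees of visual angle

def classify_samples(samples: list[GazeSample], velocity_threshold: float = 30.0) -> list[str]:
    """Label each inter-sample interval as 'fixation' or 'saccade' by angular speed.

    A threshold around 30 deg/s is a common choice (assumed here, not from the book).
    """
    labels = []
    for a, b in zip(samples, samples[1:]):
        speed = math.hypot(b.x - a.x, b.y - a.y) / (b.t - a.t)  # degrees per second
        labels.append("saccade" if speed > velocity_threshold else "fixation")
    return labels
```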
A sound has (at least) four physical properties: intensity (loudness), frequency (pitch), timbre, and envelope.
Loudness is the subjective experience of intensity, quantified by sound pressure level and measured in decibels. Humans can hear sounds as quiet as 0-10 dB. Conversation is 50-70 dB, and sounds as loud as 120-140 dB cause pain.
Pitch is the subjective experience of frequency. Humans can perceive pitches ranging from about 20 Hz to 20 kHz, with the upper limit decreasing with age.
Timbre results from the harmonic structure of sound. For example, the timbre of a 400 Hz note depends on the relative amplitudes of the harmonics at 800 Hz, 1200 Hz, etc.
Envelope is the change in loudness of a note over time, from silent to audible to silent. The onset envelope, or attack, is the initial portion during which the note becomes audible; it is of substantial importance in the perception of sound.
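These properties are easy to see in synthesis. The sketch below is my illustration (not from the book): it builds a 400 Hz note whose timbre comes from the relative amplitudes of its harmonics and whose envelope is a simple linear attack followed by a decay.

```python
import numpy as np

SAMPLE_RATE = 44100  # samples per second

def synthesize_note(f0=400.0, harmonic_amps=(1.0, 0.5, 0.25), duration=1.0, attack=0.05):
    """Sum the harmonics of f0 (timbre), then shape the result with an attack/decay envelope."""
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    # Timbre: relative amplitudes at f0, 2*f0, 3*f0, ... (here 400, 800, 1200 Hz).
    wave = sum(a * np.sin(2 * np.pi * f0 * (k + 1) * t) for k, a in enumerate(harmonic_amps))
    # Envelope: linear rise over the attack, multiplied by a linear decay to silence.
    envelope = np.minimum(t / attack, 1.0) * np.linspace(1.0, 0.0, t.size)
    return wave * envelope

note = synthesize_note()  # write to a WAV file or play with any audio library to hear it
```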
All interactions involving touch offer tactile feedback: not merely explicitly produced feedback, but also the simple feeling of the objects in the hand.
An example is given of a mouse that can produce haptic feedback, such as when passing over a window border.
Smell and taste are typically not used in HCI, though some examples of research with smell are given.
Between 8 and 15 percent of people are left-handed. However, handedness is a continuum.
The book does not give data on how many people fall where on this continuum, which seems like a substantial oversight.
Besides verbal input (i.e. speech recognition), there is also the option of non-verbal voice interaction (NVVI), in which properties such as pitch, volume, or timbre are measured and used as input. Examples are given of NVVI for volume control and of a 'vocal joystick'.
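As a sketch of how the properties NVVI relies on might be extracted from a short audio frame: RMS amplitude for volume and an autocorrelation peak for pitch. This is my own illustration; the book doesn't specify an algorithm, and the frequency range is an assumption.

```python
import numpy as np

def frame_features(frame: np.ndarray, sample_rate: int = 16000,
                   fmin: float = 80.0, fmax: float = 400.0):
    """Estimate (volume, pitch in Hz) for one frame of voice audio (floats in [-1, 1]).

    Volume is the RMS amplitude; pitch is the autocorrelation peak within an
    assumed plausible voice range of fmin to fmax.
    """
    volume = float(np.sqrt(np.mean(frame ** 2)))
    x = frame - frame.mean()
    autocorr = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..N-1
    lo, hi = int(sample_rate / fmax), int(sample_rate / fmin)
    lag = lo + int(np.argmax(autocorr[lo:hi]))
    return volume, sample_rate / lag
```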
Eye tracking to simulate a mouse is discussed, called look-select by analogy with point-select.
Eye-typing is mentioned, though the input method shown is simply an on-screen keyboard that responds to fixations. It would have been more interesting to see something more novel, like Dasher.
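Selection in eye-typing of this kind is usually driven by dwell time: the key under the gaze is 'pressed' once the gaze has rested on it long enough. A minimal sketch, with the dwell threshold being an assumed typical value rather than one from the book:

```python
class DwellSelector:
    """Select whatever key the gaze has rested on for at least dwell_s seconds."""

    def __init__(self, dwell_s: float = 0.8):  # ~800 ms dwell is an assumed typical value
        self.dwell_s = dwell_s
        self.current_key = None
        self.enter_time = 0.0

    def update(self, key_under_gaze, now: float):
        """Feed the key currently under the gaze; returns a key when it gets selected."""
        if key_under_gaze != self.current_key:
            self.current_key = key_under_gaze
            self.enter_time = now
            return None
        if key_under_gaze is not None and now - self.enter_time >= self.dwell_s:
            self.enter_time = now  # require a fresh dwell before the same key repeats
            return key_under_gaze
        return None
```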
Psychophysics examines the relationship between human perception and physical phenomena. A common experimental goal is to determine the just noticeable difference (JND) in a stimulus: the smallest change in some parameter of the stimulus that is detectable.
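One common way to estimate a JND is an adaptive staircase: shrink the difference after each detection and grow it after each miss, so the run hovers around the threshold. The sketch below is my own simulation of that idea (the book's psychophysics discussion doesn't prescribe this particular procedure):

```python
import math
import random

def simulated_response(difference: float, true_jnd: float = 2.0, slope: float = 0.5) -> bool:
    """Simulated observer: detection probability is a logistic curve crossing 50% at the true JND."""
    p_detect = 1.0 / (1.0 + math.exp(-(difference - true_jnd) / slope))
    return random.random() < p_detect

def staircase_jnd(start: float = 8.0, step: float = 0.25, trials: int = 400) -> float:
    """1-up-1-down staircase: the difference decreases after a detection and increases
    after a miss, so it settles near the 50%-detection point (the JND here)."""
    difference = start
    history = []
    for _ in range(trials):
        difference += -step if simulated_response(difference) else step
        difference = max(step, difference)
        history.append(difference)
    return sum(history[trials // 2:]) / (trials - trials // 2)  # average the settled half

print(f"estimated JND: {staircase_jnd():.2f} (true value 2.0 in this simulation)")
```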
A number of examples of visual and auditory illusions are given.
Additionally, the phantom limb phenomenon is mentioned as a kind of haptic illusion.
A table is given of the time required for the various stages of responding to a stimulus:
Operation | Typical time (ms) |
---|---|
Sensory reception | 1-38 |
Neural transmission to brain | 2-100 |
Cognitive processing | 70-300 |
Neural transmission to muscle | 10-20 |
Muscle latency and activation | 30-70 |
Total | 113-528 |
Cognition is conscious intellectual activity, such as thinking, reasoning, and deciding.
Since cognition occurs entirely within the brain, it is difficult to measure.
Discusses Miller's "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information".
Reaction times vary according to the kind of stimulus:
Stimulus | Reaction time |
---|---|
Auditory | 150 ms |
Visual | 200 ms |
Smell | 300 ms |
Pain | 700 ms |
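A crude way to see a visual reaction time in roughly that range is to time a keypress after a randomly delayed prompt. This terminal sketch is my own illustration; it will read somewhat high because it also includes keyboard and terminal latency:

```python
import random
import time

def measure_reaction_time_ms() -> float:
    """Show 'GO!' after a random delay and time how long the Enter press takes."""
    input("Press Enter to start, then hit Enter again as soon as 'GO!' appears...")
    time.sleep(random.uniform(2.0, 5.0))  # random delay so the response can't be anticipated
    start = time.perf_counter()
    input("GO! ")
    return (time.perf_counter() - start) * 1000.0

print(f"Reaction time: {measure_reaction_time_ms():.0f} ms")
```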
Latency is important for quick and accurate control. VR is discussed.
Name | Role |
---|---|
I. Scott MacKenzie | Author |
Morgan Kaufmann | Publisher |