Hick's law
Hick's Law, named after British psychologist William Edmund Hick, or the Hick–Hyman Law (also crediting Ray Hyman), describes the time it takes for a person to make a decision as a function of the number of possible choices available. The Hick–Hyman Law assesses cognitive information capacity in choice reaction experiments. The rate at which information is processed in such experiments, in bits per unit time, is known as the rate of gain of information. Given n equally probable choices, the average reaction time T required to choose among them is approximately

T = b · log2(n + 1)

where b is a constant that can be determined empirically by fitting a line to measured data.
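
As an illustrative sketch (not from the original sources, and with made-up measurements), b can be estimated by a least-squares fit of measured reaction times against log2(n + 1), for example in Python:

import numpy as np

# Hypothetical measurements: number of equally probable choices n and the
# mean reaction time T (in seconds) observed for each.
n_choices = np.array([2, 4, 6, 8, 10])
reaction_times = np.array([0.35, 0.48, 0.55, 0.61, 0.65])

# Hick's Law predicts T = b * log2(n + 1); fit b by least squares through
# the origin (no intercept term).
x = np.log2(n_choices + 1)
b = np.dot(x, reaction_times) / np.dot(x, x)

print(f"fitted b = {b:.3f} seconds per bit")
for n, t, p in zip(n_choices, reaction_times, b * x):
    print(f"n = {n:2d}: measured {t:.2f} s, predicted {p:.2f} s")

In practice an intercept term is often included to absorb time that does not depend on the number of choices; the no-intercept fit above simply mirrors the equation as stated here.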
The logarithm expresses the depth of the "choice tree" hierarchy: the base-2 form reflects a strategy that roughly halves the set of alternatives at each step, much as in a binary search. According to Card, Moran, and Newell (1983), the +1 is present "because there is uncertainty about whether to respond or not, as well as about which response to make." The law can be generalized to the case of choices with unequal probabilities pi of occurring:

T = b · H

where H is the information-theoretic entropy of the decision, defined as

H = Σi pi · log2(1/pi + 1)

where pi is the probability of the ith alternative and the sum runs over all alternatives.
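
As a minimal sketch (the probabilities and the value of b below are hypothetical), the generalized form can be evaluated by computing H as defined above and scaling it by b:

import math

def hick_hyman_time(probabilities, b):
    """Predicted mean reaction time T = b * H,
    with H = sum of pi * log2(1/pi + 1) over the alternatives."""
    h = sum(p * math.log2(1.0 / p + 1.0) for p in probabilities if p > 0)
    return b * h

b = 0.15  # hypothetical constant, in seconds per bit

# Four equally probable choices reduce to the simple form b * log2(n + 1).
print(hick_hyman_time([0.25, 0.25, 0.25, 0.25], b))
print(b * math.log2(4 + 1))  # same value

# Unequal probabilities lower the entropy, so the predicted time is shorter.
print(hick_hyman_time([0.7, 0.1, 0.1, 0.1], b))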

Hick's Law is similar in form to Fitts's law. Intuitively, one can reason that Hick's Law has a logarithmic form because people subdivide the total collection of choices into categories, eliminating about half of the remaining choices at each step, rather than considering each and every choice one by one, which would require linear time.
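
A small numerical illustration of that intuition (the choice counts are arbitrary): repeatedly halving the set of alternatives isolates a single item in roughly log2(n) steps, while examining the alternatives one at a time can take up to n steps.

import math

def halving_steps(n):
    """Number of times a set of n alternatives must be halved until one remains."""
    steps = 0
    while n > 1:
        n = math.ceil(n / 2)
        steps += 1
    return steps

for n in (2, 8, 64, 1024):
    print(f"n = {n:4d}: about {halving_steps(n)} halving steps, versus up to {n} one-by-one checks")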

Hick's Law is sometimes cited to justify menu design decisions (for an example, see http://www.catb.org/~esr/writings/taouu/html/ch04s03.html). However, applying the model to menus must be done with care. For example, to find a given word (e.g. the name of a command) in a randomly ordered word list (e.g. a menu), scanning of each word in the list is required, consuming linear time, so Hick's law does not apply. However, if the list is alphabetical and the user knows the name of the command, he or she may be able to use a subdividing strategy that works in logarithmic time.
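
To make the contrast concrete, the sketch below (using a made-up menu) compares a linear scan of a randomly ordered menu with a subdividing search of the same items in alphabetical order:

from bisect import bisect_left

unordered_menu = ["Paste", "Undo", "Copy", "Find", "Cut", "Redo", "Replace", "Select All"]
target = "Redo"

# Randomly ordered menu: the user has to scan item by item (linear time),
# so Hick's Law does not describe the search.
linear_checks = next(i + 1 for i, item in enumerate(unordered_menu) if item == target)

# Alphabetical menu: a user who knows the command name can subdivide the list,
# which is what binary search does (logarithmic time).
alphabetical_menu = sorted(unordered_menu)
index = bisect_left(alphabetical_menu, target)
found = index < len(alphabetical_menu) and alphabetical_menu[index] == target

print(f"linear scan examined {linear_checks} items")
print(f"subdividing search found {target!r}: {found}, "
      f"using at most about log2({len(alphabetical_menu)}) comparisons")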

For Hick's Law and Fitts's law considerations in the context of menu and submenu design, see Landauer and Nachbar (1985).

Background

In 1868, Franciscus Donders reported on the relationship between having multiple stimuli and choice reaction time. Later, in 1885, J. Merkel discovered that the response time is longer when a stimulus belongs to a larger rather than a smaller set of stimuli. At this point, psychologists began to see similarities between this phenomenon and information theory. Hick first began experimenting with this relationship in 1951. His first experiment involved 10 lamps with corresponding Morse code keys. The lamps would light at random every five seconds. The choice reaction time was recorded, with the number of choices ranging from 2 to 10 lamps. E. Roth (1964) demonstrated a significant correlation between IQ and information-processing speed, which is the reciprocal of the slope of the function
Reaction Time = Movement Time + log2(n) / Processing Speed


where log2(n) / Processing Speed is the time taken to come to a decision and n is the number of choices.
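
As a worked illustration (all numbers here are hypothetical), the decision component log2(n) / Processing Speed can be separated from the movement component:

import math

def roth_reaction_time(n, movement_time, processing_speed):
    """Reaction Time = Movement Time + log2(n) / Processing Speed,
    with processing speed expressed in bits per second."""
    return movement_time + math.log2(n) / processing_speed

movement_time = 0.20    # seconds (hypothetical)
processing_speed = 5.0  # bits per second (hypothetical); the reciprocal of the slope

for n in (2, 4, 8):
    t = roth_reaction_time(n, movement_time, processing_speed)
    print(f"{n} choices: predicted reaction time {t:.2f} s")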

Hick performed a second experiment using the same task, while keeping the number of alternatives steady at 10. The participant performed the task the first two times with the instruction to perform the task as accurately as possible. For the last task, the participant was asked to perform the task as quickly as possible.

While Hick maintained that the relationship between reaction time and the number of choices was logarithmic, Hyman wanted to better understand the relationship between reaction time and the mean number of choices. In Hyman's experiment, eight different lights were arranged in a 6x6 matrix. Each light was given a name, and the participant was timed on how long it took to say the name of the light after it was lit. In further experiments using this setup, the number of each different type of light was varied. Hyman was responsible for determining a linear relation between reaction time and the information transmitted.

Stimulus–response compatibility

Stimulus–response compatibility is also known to affect choice reaction time in the Hick–Hyman Law. This means that the response should be similar to the stimulus itself. For example, turning a steering wheel to turn the wheels of the car shows good stimulus–response compatibility: the action the driver performs closely resembles the response received from the car.

Original work

  • Hick, William E.; On the rate of gain of information. Quarterly Journal of Experimental Psychology, 4:11-26, 1952.
  • Hyman, Ray; Stimulus information as a determinant of reaction time. Journal of Experimental Psychology, 45:188-196, 1953.

Overviews

  • Card, Stuart K.; Moran, Thomas P.; Newell, Allen; The Psychology of Human-Computer Interaction. Lawrence Erlbaum: Hillsdale, London, 1983.
  • Cockburn, Andy; Gutwin, Carl; Greenberg, Saul; A predictive model of menu performance. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, April 28-May 3, 2007, San Jose, California, USA.

  • Seow, Steven C.; Information Theoretic Models of HCI: A Comparison of the Hick-Hyman Law and Fitts' Law. Human-Computer Interaction, 20(3):315-352, 2005. doi:10.1207/s15327051hci2003_3
  • Welford, Alan T.; Fundamentals of Skill. Methuen, 1968. Pages 61–65.
