To See Better, Take Things In Your Own Hands
Psychologists at Washington University in St. Louis, led by Richard A. Abrams, Ph.D., professor of psychology in Arts & Sciences, have shown that to see objects better, you should take the matter into your own hands.
Abrams’ study demonstrates that humans inspect objects more thoroughly when their hands are near the objects than when they are farther away. This reflexive, non-conscious difference in information processing exists, the researchers posit, because humans need to analyze objects near their hands, either to figure out how to handle them or to defend against them.
Recognizing that the location of your hands influences what you see is a new insight into the wiring of the brain, one that could lead to rethinking current rehabilitative therapy techniques and prosthetic design.
For a stroke victim trying to regain use of a paralyzed hand, just placing the good hand next to the desired object could help the injured hand grasp it.
Likewise, prosthetics could be redesigned to feed information from the hand back to the brain, rather than having the brain merely control the spatial position of the prosthetic, as with today’s artificial limb technology.
The findings also may lend scientific support for recently enacted California legislation barring the use of hand-held cell phones while driving.
“Being able to have both hands on the wheel might enhance a driver’s perception of the wheel and the nearby instruments,” Abrams suggests. “If the car is perceived to be a type of extension of the wheel, then having both hands on the wheel might enhance the driver’s perception of the car’s location and of objects near to the car. So it is quite possible that there could be an unexpected benefit of having both hands on the wheel.”
Participants in the study were asked to search for the letters S or H among groups of letters displayed on a computer monitor. When they found the target letter, the subjects responded by pressing one of two buttons, located either on the sides of the monitor or on their laps. The subjects’ search rate was slower when their hands were on the sides of the monitor than when they were on their laps, meaning that they were slower to turn their attention from one item to the next.
Abrams interprets the results to mean that there is an inherent mechanism in the human brain that forces us to move our attention, or “mind’s eye,” more slowly from one object to the next when the objects are near our hands. We are required to evaluate these objects more carefully because they are the most likely candidates for manipulation or possible harm.
“This is the first experiment to investigate the effect of hand position on response time for a visual search task,” said Abrams. “In all previous visual search experiments, subjects viewed stimuli on a display and responded by pressing buttons on a table, where their hands were far from the stimuli. In our experiment, the subjects responded using buttons attached to the display so that their hands were next to the stimuli.”
Response times from the hands-on monitor experiment were compared with those from a typical experiment where the subjects responded by pushing buttons that were far from the display, he added.
Abrams compares this new mode of information processing to the robotic arm on a space vehicle. The camera on the end of the arm sends an image to the operator about its surroundings, allowing the operator to guide the arm into position.
“The engineers who designed the arm knew that positioning it would be easier if they had the camera right in hand,” he said. “What we didn’t know until now was that humans have a mechanism for doing this, too.”
Richard A. Abrams, Christopher C. Davoli, Feng Du, William H. Knapp III, and Daniel Paull, “Altered vision near the hands,” Cognition, Volume 107, Issue 3, June 2008, Pages 1035–1047.
The present study explored the manner in which hand position may affect visual processing. We studied three classic visual attention tasks (visual search, inhibition of return, and attentional blink) during which the participants held their hands either near the stimulus display, or far from the display. Remarkably, the hands altered visual processing: people shifted their attention between items more slowly when their hands were near the display. The same results were observed for both visible and invisible hands. This enhancement in vision for objects near the hands reveals a mechanism that could facilitate the detailed evaluation of objects for potential manipulation, or the assessment of potentially dangerous objects for a defensive response.