Steven Hsiao
Increasingly sophisticated artificial limbs are being developed that give users a startlingly lifelike range of motion and fine motor control. Johns Hopkins neuroscientist Steven Hsiao, however, is not satisfied with a prosthetic limb that simply lets its user grasp or move something. He wants to give the user the ability to feel what the artificial limb is touching, such as the texture and shape of a quarter, or to experience the comforting perception of holding hands. Accomplishing these goals requires understanding how the brain processes the multitude of sensations that come in daily through our fingers and hands.
Using a $600,000 grant administered through the federal stimulus package, Hsiao is leading a team that is working to decode those sensations, which could lead to the development of truly “bionic” hands and arms that use sensitive electronics to activate neurons in the touch centers of the cerebral cortex.
“The truth is, it is still a huge mystery how we humans use our hands to move about in the world and interact with our environment,” Hsiao, of the university’s Zanvyl Krieger Mind/Brain Institute, stated in a press release. “How we reach into our pockets and grab our car keys or some change without looking requires that the brain analyze the inputs from our hands and extract information about the size, shape and texture of objects. How the brain accomplishes this amazing feat is what we want to find out and understand.”
Hsiao hypothesizes that our brains do this by transforming the inputs from receptors in our fingers and hands into “neural code” that the brain then matches against a stored, central “databank” of memories of those objects. When a match occurs, the brain is able to perceive and recognize what the hand is feeling, experiencing and doing.
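This "neural code plus databank" idea can be loosely caricatured in code. The sketch below is purely illustrative and not the team's actual model: it assumes a touch sensation reduces to a small feature vector, and models recognition as a nearest-neighbor match against stored codes. All object names and feature values here are hypothetical.

```python
import math

# Hypothetical "databank" of remembered tactile codes: each object is a
# small feature vector (the features themselves are invented for illustration,
# e.g. edge ridging, thinness, roundness).
DATABANK = {
    "quarter": (0.9, 0.1, 0.8),
    "car key": (0.2, 0.9, 0.3),
    "button":  (0.8, 0.2, 0.9),
}

def distance(a, b):
    """Euclidean distance between two tactile feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(touch_code):
    """Return the remembered object whose stored code best matches the input,
    standing in for the brain matching a neural code against its databank."""
    return min(DATABANK, key=lambda name: distance(DATABANK[name], touch_code))

# A noisy sensation from fingertip receptors that should still match "quarter".
sensed = (0.85, 0.15, 0.75)
print(recognize(sensed))  # → quarter
```

In this toy picture, "perceiving" an object is just finding the closest stored code; the real scientific question Hsiao's team pursues is what those codes actually look like in cortical neurons.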
In recent studies, Hsiao’s team found that neurons in the area of the brain that responds to touch are able to “code for” (understand) the orientation of bars pressed against the skin, the speed and direction of motion, and the curved edges of objects. In the new study, Hsiao’s team will investigate the detailed neural codes for more complex shapes, and will delve into how the perception of motion in the visual system is integrated with the perception of tactile motion.
The team will do this first by investigating how complex shapes are processed in the somatosensory cortex, and second by studying the responses of individual neurons in an area that has traditionally been associated with visual motion but that also appears to contain neurons that respond to tactile motion.
“The practical goal of all of this is to find ways to restore normal sensory function to patients whose hands have been damaged, or to amputees with prosthetic or robotic arms and hands,” Hsiao stated. “It would be fantastic if we could use electric stimulation to activate the same brain pathways and neural codes that are normally used in the brain. I believe that these neural coding studies will provide a basic understanding of how signals should be fed back into the brain to produce the rich percepts that we normally receive from our hands.”