Feature | Volume 354, Number 9172 | 3 July 1999
The science of haptics gets in touch with prosthetics
Lancet 1999; 354: 52-56
Last month, US researchers led by John Chapin (MCP Hahnemann School of Medicine, Philadelphia, PA, USA) reported that for the first time they had used neuronal activity recorded directly from the brains of rats to control a robot in real time (Nat Neurosci 1999; 2: 664-70). Co-author Miguel Nicolelis (Duke University Medical Center, Durham, NC, USA) has extended the work to primates. These experiments have generated intense interest because they bring thought-controlled computers, mechanical devices, and prosthetic limbs a step closer to reality.
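The principle behind such decoding can be sketched in a few lines of code. The toy example below fits a linear readout from simulated spike counts to a one-dimensional arm position; the published study used population functions and artificial neural networks, and every signal and weight here is invented purely for illustration.

```python
import numpy as np

# Toy sketch of decoding limb position from neural activity. The spike
# counts, tuning weights, and trajectory are all simulated; the actual
# study used population functions and artificial neural networks.

rng = np.random.default_rng(0)
n_neurons, n_bins = 32, 500

# Simulated 1-D arm trajectory (a random walk) and position-tuned neurons.
position = np.cumsum(rng.normal(0.0, 0.1, n_bins))
tuning = rng.normal(0.0, 1.0, n_neurons)
spikes = np.outer(position, tuning) + rng.normal(0.0, 0.5, (n_bins, n_neurons))

# Fit a linear readout on the first half of the data, test on the second.
half = n_bins // 2
weights, *_ = np.linalg.lstsq(spikes[:half], position[:half], rcond=None)
decoded = spikes[half:] @ weights

print("held-out correlation:", round(np.corrcoef(decoded, position[half:])[0, 1], 3))
```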
Nevertheless, it will be a long time before thought-controlled artificial limbs for people leave the realms of science fiction. And it will take considerable research before hand/arm amputees--of which there are 12 000 in the UK alone--can be provided with what they really need, namely a prosthesis that not only moves but also feels.
State-of-the-art prosthetic arms are advanced pieces of engineering. Last August, for example, a Scotsman who had had his arm amputated in 1986 was fitted with the world's first "bionic" arm by David Gow's team at the Princess Margaret Rose Hospital in Edinburgh, UK. But although the man can now tie his own shoelaces for the first time in 12 years, his ability to pick up objects is guided by sight, not by touch. Until we understand how the fingerpad detects surface characteristics and how information from the fingers is transformed by the brain into what we know as touch, we cannot construct an artificial hand with which to feel the world.
This is where the developing science of haptics should help. The term haptics--derived from the Greek haptikos, meaning "able to touch"--was coined by psychologists studying active touch in the early part of this century. In the past decade, the term has come to include all aspects of manual sensing and manipulation by both people and machines, and today an internet search for haptics yields almost exclusively sites that deal with virtual reality.
"There is a lot of excitement right now in the field of haptics", says Mandayam Srinivasan, director of the Touch Lab at the Massachusetts Institute of Technology (Cambridge, MA, USA), because of a new generation of robotic machines that enable the user to touch and feel virtual objects generated by a computer. Nevertheless, current haptic devices are "mostly like feeling the world with a stick", he says. To really understand touch and to incorporate this knowledge into prosthetics and surgical simulators, three strands of research are essential, he continues: the study of human touch itself; the development of better machines that can provide a sense of touch; and the development of computer software for displaying realistic virtual objects.
So what do we know about human touch? Not much, admits Srinivasan. On a smooth surface, a normal fingertip can feel a texture only 100 nm high. Researchers, he says, have a long way to go before they fully understand even this aspect of touch, yet the haptic image that the brain receives is the product not only of information from the fingerpad but also of input "from sensors in the joints, tendons, and muscles".
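The combination of those channels can be caricatured in code. The toy sketch below fuses a precise cutaneous estimate of a contact location with a coarser proprioceptive one by inverse-variance weighting; the noise figures are invented, and the brain's actual combination rule is certainly far richer.

```python
import numpy as np

# Toy illustration: fuse a precise cutaneous estimate of a contact location
# with a coarser proprioceptive one by inverse-variance weighting. All
# noise levels are invented; the brain's real combination is far richer.

rng = np.random.default_rng(1)
true_contact = 0.042            # contact position in metres, invented

sigma_cut, sigma_prop = 0.001, 0.010
cutaneous = true_contact + rng.normal(0.0, sigma_cut)        # fingerpad
proprioceptive = true_contact + rng.normal(0.0, sigma_prop)  # joints/muscles

w_cut, w_prop = 1 / sigma_cut**2, 1 / sigma_prop**2
fused = (w_cut * cutaneous + w_prop * proprioceptive) / (w_cut + w_prop)

print(f"cutaneous {cutaneous:.4f}, proprioceptive {proprioceptive:.4f}, "
      f"fused {fused:.4f}")
```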
All these processes must be understood before a prosthetic arm can be designed that truly replaces a skin-and-bone arm. Thus, although the bionic arm fitted last year is a masterpiece of engineering--controlled by its user via microswitches, microchips, motors, gears, and pressure points, and able to bend at the elbow, rotate and twist at the wrist, and grip using artificial contracting fingers--the device has no sensation of touch. "Trial and error and visual feedback is used to discern gripping force in the bionic arm", explains Gow. "For example, how much an object deforms is a good indicator of grip."
"We are working with colleagues to introduce automatic gripping sensors into the prosthesis to discern slippage of an object through the fingers. The electronics would automatically tighten the grip to compensate", Gow continues. But what he would really like to see is a better understanding of the electrical voltages and currents generated by nerve signals travelling in the neural system--neuroelectronics--"so that we can tap into nerve signals at a microscopic level" and eventually control grip through feedforward and feedback signals.
To achieve Gow's goal, there must be an efficient interface between the electronic systems of the prosthesis and its user's biological systems. "These devices need to be directly controlled by the brain and so must be linked with the human nervous system", explains Sue Bayliss, professor of advanced materials research at De Montfort University (Leicester, UK). She believes that porous silicon may one day be important at the interface between the biological system and the electronic system. This material is optoelectronic--it detects light and converts it into electrical impulses. Thus, if porous silicon were used to cover prosthetic devices, it could perhaps mimic touch. But, warns Bayliss, "this is a long way off".
What about the third strand of haptic research--display of virtual objects that can be touched and manipulated? Again, says Srinivasan, compared with the ideal of simulating natural touch, today's virtual-reality touch devices are limited. "When you touch something like your clothing, for example, there is a net force imposed on the skin. This force is distributed over your skin as a pressure field which we need to mimic for the virtual fabric to feel like the real one."
Srinivasan's team is developing tactile displays that could be built into surgical simulators and into robotic systems for use in telesurgery. Srinivasan outlines what is needed for the surgeon to "virtually" feel and palpate an organ. "Within the finger tip [of the simulator], you may have to have 300 or 400 tiny stimulators, each less than a millimetre across, each controlled by the computer. If this can be done, then you can simulate any texture, shape, or softness. That would be called a tactile display." Eventually, he predicts, by combining such displays, "we will be able to better mimic direct touch", a development that should find uses in surgery, prosthetic design, and other spheres of human activity.
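To make that concrete, here is a rough sketch of how a computer might drive such a grid: a 20 x 20 array of stimulators (400 elements, close to Srinivasan's figure) receives per-element force commands derived from a virtual surface's local height and stiffness. The texture, pitch, and stiffness values are invented for illustration, and a real display would need far more sophisticated contact mechanics.

```python
import numpy as np

# Rough sketch of driving a tactile display: a 20 x 20 grid of stimulators
# (400 elements, close to Srinivasan's figure) is commanded so that each
# element's force tracks a virtual surface's height and stiffness. The
# texture, pitch, and stiffness values are invented for illustration.

GRID = 20            # stimulators per side
PITCH_MM = 0.8       # spacing, under a millimetre as Srinivasan specifies
STIFFNESS = 0.01     # N/mm, a made-up "softness" for the virtual material

x = np.arange(GRID) * PITCH_MM
xx, yy = np.meshgrid(x, x)
# Virtual texture: bumps about 4 mm apart and 50 micrometres tall.
height_mm = 0.05 * np.sin(2 * np.pi * xx / 4.0) * np.cos(2 * np.pi * yy / 4.0)

finger_depth_mm = 0.5  # how far the fingertip presses into the surface

# Per-stimulator penetration (clipped at zero where there is no contact),
# converted to the force command the computer would send to each element.
penetration = np.clip(finger_depth_mm + height_mm, 0.0, None)
force_cmd = STIFFNESS * penetration

print(f"peak element force {force_cmd.max()*1000:.1f} mN, "
      f"total {force_cmd.sum():.2f} N over {force_cmd.size} elements")
```

In this caricature, softness is just the stiffness constant, texture is the height map, and shape would enter as a slowly varying component of the same map--a crude stand-in for the texture, shape, and softness Srinivasan says such a display could simulate.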
Karen Birchard