November 9, 2000
WHAT'S NEXT

Seeking Computers That Can Feel

By ROBERT HERCZ
DOCTORS have always relied on their sense of touch, for everything from palpating a lump to tightening a suture. But in recent years, technology has been dividing doctors from their patients. Minimally invasive (or "keyhole") surgery, in which cameras and surgical instruments are threaded through tiny openings, may drastically reduce patient trauma and recovery time, but it also eliminates the direct contact with organs, bones and muscles that doctors have enjoyed in open surgery. 

What technology has taken away, however, technology has begun to give back. Haptics, from the Greek verb meaning "to touch," is the science of incorporating the sense of feel into computer interfaces. For decades, touch has been recognized as the next step, after sight and sound, toward a natural, all-involving computer environment, but it is only recently that computers have become fast enough and cheap enough to make haptic interfaces viable. 

As a result, commercial haptic devices have started to appear, not only in niche markets like computer-aided design and 3-D modeling, but on the consumer desktop: The iFeel mouse by Logitech has a small motor that kicks or vibrates as users roll over menus, icons and window boundaries. Some experts predict that haptics will soon be as familiar a part of the computer desktop as color graphics and stereo sound are today. 

But nowhere is the promise more exciting than in medicine. Simple haptic interfaces are already part of some medical equipment, and researchers are working toward the ultimate goal: making even the robotic surgery of the future, where doctor and patient might be continents apart, feel (to the surgeon, anyway) like the traditional open surgery of the past. 

At M.I.T.'s Laboratory for Human and Machine Haptics, Dr. Mandayam Srinivasan, the lab's founder, is developing a simulator to show medical students how a correctly inserted needle feels when injecting a spinal anesthetic. Finding the correct spot below layers of skin and ligament is a delicate procedure, and a misplaced needle can paralyze a patient. 

"Nowadays it is done with expert supervision, but still the trainee does it for the first time on a patient," Dr. Srinivasan said. "It seems to me that having a simulator, even if it is not perfectly realistic, would still be beneficial." 

The simulator consists of a syringe attached to a desktop force-feedback device (a freely movable stick equipped with motors so it can push back), a computer running the simulation software, and a mannequin back. With a simulator, students can experience the full spectrum of cases, not just the usual ones. 
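
To give a sense of what the simulation software has to compute, here is a minimal sketch, in Python, of a layered-tissue resistance model. The layer names, depths and stiffness values are illustrative assumptions, not figures from the M.I.T. simulator; the point is only that the force fed back through the stylus can be made to drop sharply when the virtual needle reaches the right space.

```python
# Minimal sketch of a layered-tissue force model for a needle-insertion
# simulator. Layer names, depths and stiffness values are illustrative
# assumptions, not data from the M.I.T. device.

# Each layer: (name, depth in millimeters where the layer ends, stiffness in N/mm)
LAYERS = [
    ("skin",                4.0, 0.9),
    ("subcutaneous fat",   18.0, 0.3),
    ("supraspinous lig.",  24.0, 1.4),
    ("ligamentum flavum",  30.0, 2.0),
    ("epidural space",     36.0, 0.05),  # near-zero stiffness: loss of resistance
]

def resistance_force(depth_mm: float) -> float:
    """Force (in newtons) the haptic stylus pushes back at a given needle depth.
    The force tracks the stiffness of the layer the tip is currently in, so
    entering the epidural space produces the tell-tale loss of resistance."""
    layer_start = 0.0
    for _name, layer_end, stiffness in LAYERS:
        if depth_mm <= layer_end:
            return stiffness * (depth_mm - layer_start)
        layer_start = layer_end
    return 0.0  # deeper than the last modeled layer

if __name__ == "__main__":
    for depth in (2.0, 10.0, 25.0, 31.0):
        print(f"needle at {depth:4.1f} mm -> push back {resistance_force(depth):4.2f} N")
```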

"You can simulate scenarios that occur in 1 in 10,000 patients," Dr. Srinivasan said. "I have heard anesthesiologists say that there are things that they didn't encounter even after they had done a thousand epidural procedures." 

Greg Merrill, the founder of HT Medical Systems of Gaithersburg, Md., said that 75 percent of complications arising from procedures like intravenous catheterization occur in the first 30 cases of a doctor's career. It's a statistic that speaks eloquently for his firm's medical training simulators, like CathSim, which teaches how to insert catheters and needles. The device consists of a small box with a protruding force-feedback syringe. The device's haptic feedback includes the "pops" of the needle piercing the skin and vein. Different software modules simulate adult, child and geriatric profiles, including, for example, the tough, scarred veins of an intravenous drug user. 

Future generations of simulators, Mr. Merrill said, will use data from individual patients. "You can have the patient come in, get scanned, and the physician can do patient-specific pre-operative rehearsal," Mr. Merrill said. "That's of tremendous benefit in learning the best approach to a procedure, figuring out which medical devices work best for that patient."

For example, stents, tiny tubes inserted into diseased arteries to keep them open after balloon angioplasty, have to be carefully matched to each patient's particular arterial geometry. A rigid stent cannot easily be threaded into a tightly twisted artery. Today, surgeons use trial and error; if one stent can't be deployed, they will pull it out and try another. 

"Wouldn't it be nice to determine which is the appropriate product prior to trying it out on the patient?" Mr. Merrill asked. "The advantage of haptic feedback is you can feel the resistance that it's going to take to deploy that stent within that patient-specific anatomy." 

Haptics may be a hot commodity in the world of medical simulation, but it has barely entered the real world of the operating theater. 

"It's probably the one thing that's held back minimally invasive surgery the most," Mr. Merrill said. "The doctor is moving further and further from the site of the interaction, and the problem is they lose the sense of touch. The number of procedures making the transition from open to minimally invasive is not as big as it probably should be for that reason." 

One problem is that the haptic interfaces available today are all sticklike force-feedback devices. They are useful, but feeling the world with a stick doesn't begin to convey the experience of holding something in your hands. What is missing is something like artificial skin — shaped like a glove, perhaps, or at the very least, a thimble — that will deliver a virtual approximation of the rich tactile experience we take for granted. 

At Stanford University's Dextrous Manipulation Lab, Dr. Mark Cutkosky is working on such a device, called CyberGlove. The glove controls a two-fingered robotic arm — whatever the user does, the robot does — which in turn feeds back what it "feels" to the glove's fingers. 

"There's a funny thing that happens when you provide feedback to the user," Dr. Cutkosky said. "Suddenly, it no longer feels like, I'm here with my glove and I'm controlling that robot hand over there. Suddenly you feel like, that's my hand over there, it's an extension of me." 

Although the CyberGlove is about as advanced as haptics gets today, it is still a relatively primitive instrument that feeds back force (Dr. Cutkosky is experimenting with adding vibration) to just a single haptic unit on each finger. That's a far cry from the detail that fingertips are capable of resolving. 

In some ways, the sense of touch is more complex than vision or hearing. To satisfy the eye that an image is moving, for example, it is enough to display 15 still pictures a second. The haptic equivalent — fooling the fingertip into believing it is feeling a surface — takes a thousand impulses a second. In addition, while eyes respond exclusively to light, fingertips respond to force, vibration and temperature. Between today's devices and a robot surgeon that feels like a natural extension of a human lie years of research and development. 
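
As a rough illustration of that gap, the Python sketch below runs the kind of thousand-updates-a-second force loop a haptic interface must sustain, next to the 15 frames a second that already satisfy the eye. The stylus object and the spring-style contact model are hypothetical stand-ins, not a real device's API.

```python
# Rough illustration of the update-rate gap between vision and touch.
# FakeStylus is a hypothetical stand-in for a force-feedback device and its
# driver; real haptic hardware exposes its own API and runs a hard
# real-time loop rather than time.sleep().

import time

HAPTIC_HZ = 1000   # force updates per second the fingertip needs
VISUAL_HZ = 15     # still pictures per second that already suggest motion

class FakeStylus:
    def read_depth_m(self) -> float:
        return 0.002        # pretend the tip is pressed 2 mm into a virtual surface
    def apply_force(self, newtons: float) -> None:
        pass                # a real device would drive its motors here

def contact_force(depth_m: float, stiffness_n_per_m: float = 800.0) -> float:
    """Simplest possible haptic rendering: a spring pushing back in
    proportion to how far the tip has penetrated the virtual surface."""
    return stiffness_n_per_m * max(depth_m, 0.0)

def haptic_loop(device: FakeStylus, duration_s: float = 0.1) -> int:
    """Run the force loop at roughly 1 kHz and count the updates."""
    period = 1.0 / HAPTIC_HZ
    updates = 0
    deadline = time.perf_counter() + duration_s
    while time.perf_counter() < deadline:
        device.apply_force(contact_force(device.read_depth_m()))
        updates += 1
        time.sleep(period)  # real systems rely on a hard real-time scheduler instead
    return updates

if __name__ == "__main__":
    n = haptic_loop(FakeStylus())
    print(f"~{n} haptic updates in 0.1 s, versus {VISUAL_HZ} visual frames per second")
```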

"I am always amazed at how people take touch for granted," Dr. Srinivasan said. "It's an amazing system. No wonder we haven't been able to build robots that are comparable to even a 2-year-old human baby." 

 