Although the concept of virtual reality
has been around for years, haptic communication, or touch, is often left
out of the cyberspace experience.
Researchers at the Massachusetts Institute of Technology (MIT) have found that
haptic communication can make it easier to carry out simple, cooperative
tasks in virtual worlds.
The researchers used a stick-like control device that allowed their human
subjects to feel virtual objects in order to test their behavior in a
shared haptic environment.
The subjects, seated at a computer with a control device, cooperated with
a researcher who was hidden in another room in order to perform a task
in a virtual environment. The task, stringing a virtual ring along a virtual
bent wire without touching the wire, was impossible for either person
to achieve alone. The subjects had no knowledge of the person they were
cooperating with.
The test took place in two modes -- visual only, in which the cooperating
pair saw the wire and ring on their respective computer screens, and visual
and haptic, in which the pair also received touch feedback. The robotic
control stick allowed the subjects to feel the ring as if it were at
the end of the stick. Using the stick, they could push and pull the ring,
and feel forces generated by the remote partner pushing and pulling the
same ring.
"If you hold your pen and you touch the table and close your eyes, the
resistive force that you feel [that's what] you can feel through this
tool," said Mandayam Srinivasan, a principal research scientist of mechanical
and electrical engineering at the Massachusetts Institute of Technology,
and director of the school's Laboratory for Human-Machine Haptics. Virtual
objects can have any shape, texture, friction and softness, he added.
"I can basically simulate any of the [haptic] properties that you normally
[experience] around you," with the touch tool.
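The article does not describe the underlying computation, but resistive forces like the pen-on-table example are commonly generated with a penalty, or spring-damper, model: the stylus is pushed back in proportion to how far its tip has sunk into a virtual surface. The Python sketch below illustrates that general idea; it is not the MIT group's implementation, and the stiffness and damping constants are arbitrary.

```python
import numpy as np

# Minimal penalty-based haptic rendering sketch (not the MIT implementation).
# A virtual horizontal "table" occupies the region y <= 0. When the stylus
# tip dips below the surface, a spring-damper force pushes it back out,
# which the user feels as resistance, like a pen pressed against a table.

STIFFNESS = 800.0   # N/m, arbitrary illustrative value
DAMPING = 2.0       # N*s/m, arbitrary illustrative value

def table_force(tip_pos, tip_vel):
    """Return the 2-D force (in newtons) to apply at the stylus tip."""
    penetration = -tip_pos[1]          # how far the tip is below y = 0
    if penetration <= 0.0:
        return np.zeros(2)             # not touching: no force
    normal = np.array([0.0, 1.0])      # surface normal points up
    f_spring = STIFFNESS * penetration * normal
    f_damp = -DAMPING * tip_vel[1] * normal
    return f_spring + f_damp

# Example: stylus tip 1 mm into the table, moving downward at 1 cm/s.
print(table_force(np.array([0.0, -0.001]), np.array([0.0, -0.01])))
```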
In general, the experiment showed that the haptic feedback both improved
the subjects' performance of the task and added a sense of togetherness
between the remote participants, said Srinivasan.
"One of the things we were trying to see was whether having touch increases
the sense of presence. That is, does it feel like you are doing the task
with another subject, or does it feel like you were just doing it with
a computer. Most people felt that when they had [the haptic feedback]
that they did feel more like they were working with another person," said
Srinivasan.
The ultimate goal of the research is to enable people who rely
on touch, such as designers, to work together over the Internet, Srinivasan
said. "Suppose we have two automobile designers -- one in Detroit and
one in Japan. And suppose through the Net they can not only visually have
a 3-D view of various parts, but suppose they're able to touch and manipulate
them and see how they fit," he said.
The researchers' experiment included two groups of subjects. One group
initially used visual and haptic feedback, then ten days later repeated
the same trials using visual feedback only. The other group did the test
in the reverse order, first performing the trials with only visual cues,
then with visual and haptic feedback. The group of subjects who cooperated
visually first, then used vision plus haptic feedback, performed better
than the group that did the trials the other way around.
In querying the subjects after the experiment, the researchers also asked
them to guess the gender of the remote partner for each of the two sessions.
There was a surprising and sharp distinction in the results, said Srinivasan.
Even though the remote partner stayed the same throughout, the subjects
strongly associated the haptic feedback with the male gender and the visual
only feedback with the female gender, said Srinivasan. During the visual
and haptic portion of the test, 90 percent of the subjects perceived that
they were collaborating with a male, while during the visual only version,
70 percent perceived that they were collaborating with a female.
The study was not big enough for this result to be statistically significant,
Srinivasan said. "We didn't have a lot of subjects -- it was just a finding.
To have a statistical significance we need to repeat it for hundreds of
people and do an epidemiological study to find out why. But it was an
interesting finding," he said.
In the end, the paper shows it is possible to draw conclusions about using
touch in a collaborative environment, said Srinivasan. "The point is to
show in at least a limited context where we have forced [people] to collaborate
in a particular task, what the implications are in terms of sense of presence,"
he said. "Now we have the capability to... find out how to fine-tune a
collaborative workspace where you have not only vision and sound, but
touch as well."
The study is exploring new territory, said Jie Yang, a research computer
scientist at Carnegie-Mellon University. "To my knowledge, studying the
role of touch in a shared virtual space is novel. Touch could be a quick
feedback to a user for some applications [and] could improve collaborations
for some tasks and shared working space," he said.
The MIT researchers are now working on a similar project that will have
people cooperating using touch over the Internet between MIT and University
College London in England. The original study was done by connecting two
monitors and haptic devices in separate rooms to a single computer to
avoid any delays in the haptic feedback, said Srinivasan.
One of the challenges in the new study, dubbed Touch Across the Atlantic,
is how to deal with the inevitable time delays, Srinivasan said. "It depends
on the protocols we use... but [the haptic delays will] probably be several
hundred milliseconds," which is a perceptible delay, he said.
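For a sense of scale, haptic devices commonly refresh their force output around 1,000 times per second, so a delay of several hundred milliseconds means the force a user feels reflects a partner state that is hundreds of haptic frames old. A back-of-the-envelope sketch, with illustrative numbers that are not from the study:

```python
# Back-of-the-envelope sketch: why a transatlantic delay is hard for haptics.
# The rate and delay below are illustrative assumptions, not study figures.
haptic_rate_hz = 1000        # haptic devices commonly update around 1 kHz
one_way_delay_s = 0.3        # "several hundred milliseconds" over the Net

stale_frames = round(haptic_rate_hz * one_way_delay_s)
print(f"At {haptic_rate_hz} Hz, a {one_way_delay_s * 1000:.0f} ms delay means "
      f"the force felt reflects partner input that is ~{stale_frames} haptic "
      "frames old.")
```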
Haptic information in general could eventually be used to teach tricky
physical tasks like surgery, Srinivasan said. It also has the potential
to allow humans to absorb complex data sets more quickly, he said. Using
the sense of touch as well as vision and sound, "[you] can pump more information
through your brain, because you're using more channels," he said.
Given a three-dimensional data set, for example, vision alone can do quite
well. "But now we have data sets that are more than three-dimensional.
So in these things it may be possible to convey some of the extra [data]
through forces on your hand -- maybe you can understand [complicated interactions
like] airflow over an aircraft wing... or how the various components make
up a particular gene."
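Srinivasan did not spell out a particular encoding; one simple way to picture the idea is to draw three dimensions of each data point on screen and map a fourth quantity, such as local pressure in an airflow simulation, to the magnitude of a force pushed back on the hand. The sketch below is hypothetical; the data, scaling, and force direction are made-up choices.

```python
import numpy as np

# Hypothetical sketch of haptic "data display": three coordinates of each
# sample are drawn on screen, and a fourth quantity (say, local pressure)
# is rendered as a force on the hand. The data and scaling are made up.

rng = np.random.default_rng(0)
samples = rng.random((100, 4))           # columns: x, y, z, pressure

FORCE_SCALE = 3.0                        # newtons per unit pressure (arbitrary)

def haptic_force_at(stylus_pos, samples):
    """Force felt at the stylus: magnitude taken from the nearest sample's
    fourth value, directed along +z (an arbitrary illustrative choice)."""
    dists = np.linalg.norm(samples[:, :3] - stylus_pos, axis=1)
    nearest = samples[np.argmin(dists)]
    return np.array([0.0, 0.0, FORCE_SCALE * nearest[3]])

# Example: the force felt when the stylus sits at the center of the volume.
print(haptic_force_at(np.array([0.5, 0.5, 0.5]), samples))
```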
Srinivasan's research colleagues were Cagatay Basdogan and Chih-hao Ho
of MIT and Mel Slater of University College London in England. They
published the research in the December 2000 issue of ACM Transactions
on Computer-Human Interaction. The research was funded by the Office
of Naval Research.
Timeline: Now
Funding: Government
TRN Categories: Human-Computer Interaction
Story Type: News
Related Elements: Technical paper, "An Experimental Study
on the Role of Touch in Shared Virtual Environments," ACM Transactions
on Computer-Human Interaction, December 2000.