[AI] Robots with skin enter our touchy-feely world

Sanjay ilovecold at gmail.com
Thu Aug 19 06:40:37 EDT 2010


          If humanoid robots are ever to move among us, they will first need
          to get in touch with the world - and learn to interpret our fuzzy
          human language

by Paul Marks

BEAUTY may be only skin deep, but for humanoid robots a fleshy
covering is about more than mere aesthetics: it could be essential
to making them socially acceptable. A touch-sensitive coating could
prevent such machines from accidentally injuring anybody within
their reach.

In May, a team at the Italian Institute of Technology (IIT) in
Genoa will dispatch to labs across Europe the first pieces of
touch-sensing skin designed for their nascent humanoid robot, the
iCub. The skin IIT and its partners have developed contains
flexible pressure sensors that aim to put robots in touch with the
world.

"Skin has been one of the big missing technologies for humanoid
robots," says roboticist Giorgio Metta at IIT. One goal of making
robots in a humanoid form is to let them interact closely with
people. But that will only be possible if a robot is fully aware of
what its powerful motorised limbs are in contact with.

Roboticists are trying a great variety of ways to make a sensing skin.
Early examples, such as the CB2 robot, built at Osaka University in
Japan, placed a few hundred sensors in silicone skin. But now "many,
many sensing methods are emerging", says Richard Walker of Shadow
Robot, London. Until a lot of robots are using them, it is going to be
hard to say which are best suited for particular applications.

What's more, there are many criteria the skin has to meet, says Metta:
it must be resilient, able to cover a large surface area and be able
to detect even light touches anywhere on that surface. "Many of these
factors conflict with each other," he says.

The iCub is a humanoid robot the size of a three-and-a-half-year-old
child. Funded by the European Commission, it was designed to
investigate cognition and how awareness of our limbs, muscles, tendons
and tactile environment fuels the development of intelligence. The
iCub's technical specifications are open-source and some 15 labs
across Europe have already "cloned" their own, so IIT's skin design
could find plenty of robots to enwrap.

The skin is made up of triangular, flexible printed circuit boards
which act as sensors, and it covers much of the iCub's body. Each
bendy triangle is 3 centimetres to a side and contains 12
capacitive copper contacts (pictured). A layer of silicone rubber acts
as a spacer between those boards and an outer layer of Lycra that
carries a metal contact above each copper contact. The Lycra layer and
flexible circuits constitute the two sides of the skin's
pressure-sensing capacitors. This arrangement allows 12 "tactile
pixels" - or taxels - to be sensed per triangle. This taxel resolution
is enough to recognise patterns such as a hand grasping the robot's
arm. The skin can detect a touch as light as 1 gram across each taxel,
says Metta. It is also peppered with semiconductor-based temperature
sensors. This version of the skin will be released in May.
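To make the taxel idea concrete, here is a minimal Python sketch of
how software might turn one triangle's 12 capacitance readings into
touch events. The 12-taxel layout and the roughly 1-gram sensitivity
come from the article; the data format and the calibration constant
grams_per_unit are assumptions invented for illustration, not IIT's
actual interface.

    TAXELS_PER_TRIANGLE = 12
    TOUCH_THRESHOLD_GRAMS = 1.0   # lightest touch the skin detects

    def pressures_from_capacitance(baseline, reading, grams_per_unit):
        """Convert raw capacitance deltas into equivalent grams.

        Pressing the Lycra layer towards the circuit board narrows
        the capacitor gap, so capacitance rises with applied load.
        grams_per_unit is a hypothetical calibration constant.
        """
        assert len(baseline) == len(reading) == TAXELS_PER_TRIANGLE
        return [(r - b) * grams_per_unit
                for b, r in zip(baseline, reading)]

    def touched_taxels(pressures):
        """Indices of taxels loaded beyond the 1 g threshold."""
        return [i for i, p in enumerate(pressures)
                if p >= TOUCH_THRESHOLD_GRAMS]

A pattern such as a hand grasping the arm would then show up as a
characteristic cluster of touched taxels across adjacent triangles.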

Later, IIT plans to add a layer of a piezoelectric polymer called PVDF
to the skin. While the capacitance sensors measure absolute pressure,
the voltage produced by PVDF as a result of its deformation when
touched can be used to measure the rate of change of pressure. So if
the robot runs its fingertip along a surface, the vibrations generated
by friction give it clues about what that surface is made of. Such
sensitivity might help it establish the level of grip needed to pick
up, say, a slippery porcelain plate.
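As a rough illustration of that idea, the sketch below picks out the
dominant vibration frequency in a PVDF voltage trace; because PVDF
responds to the rate of change of pressure, sliding over different
surface textures would produce different spectra. The sampling
interface and units are assumptions, not details from the article.

    import numpy as np

    def dominant_vibration_hz(pvdf_voltage, sample_rate_hz):
        """Estimate the strongest vibration frequency in a PVDF trace.

        PVDF outputs a voltage proportional to the rate of change
        of pressure, so a fingertip sliding across a textured
        surface yields an oscillating signal whose spectrum hints
        at the surface material.
        """
        trace = np.asarray(pvdf_voltage, dtype=float)
        trace -= trace.mean()                  # remove the DC offset
        spectrum = np.abs(np.fft.rfft(trace))
        freqs = np.fft.rfftfreq(len(trace), d=1.0 / sample_rate_hz)
        return freqs[int(np.argmax(spectrum[1:])) + 1]  # skip 0 Hz bin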

Philip Taysom, CEO of British company Peratech of Richmond, North
Yorkshire, is not a fan of sensing skins based on capacitors, which he
says can lose sensitivity with repeated use. Peratech's answer is a
stretchy, elastic material it calls quantum tunnelling composite
(QTC). This comprises a polymer such as silicone rubber that is
heavily loaded with spiky nickel nanoparticles. A voltage is applied
across the skin, and when it is pressed, the distance between the
nanoparticles within the polymer diminishes, which results in
electrons flowing, or "tunnelling", from one nanoparticle spike to the
next in the area being touched. Crucially, the material's electrical
resistance drops dramatically and in proportion to the force applied,
so the touch can be interpreted.
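A minimal sketch of how such a reading might be interpreted,
assuming the idealised inverse relationship the article describes
(resistance falling in proportion to applied force); the calibration
constant is hypothetical and would in practice come from pressing
known weights onto the material.

    def estimated_force_newtons(resistance_ohms, k_ohm_newtons):
        """Infer the force on a QTC patch from its resistance.

        Assumes the idealised relationship R = k / F, so that
        resistance drops as applied force grows. k_ohm_newtons is
        a hypothetical calibration constant.
        """
        return k_ohm_newtons / resistance_ohms

    # e.g. with k = 50 ohm-newtons, a reading of 100 ohms -> 0.5 N
    print(estimated_force_newtons(100.0, 50.0))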

At the Massachusetts Institute of Technology's Media Lab, Adam
Whiton is developing a QTC-based sensing skin for a commercial
robot-maker which he declines to name. Instead of a tight, conforming
skin, Whiton uses a looser covering, more akin to clothing. "We cover
ourselves with textiles when we interact with people, so clothing may
be a better metaphor as a humanoid's pressure-sensitive surface
covering," he says.

Natural gestures, like tapping a humanoid on the back to get its
attention, or leading it by the arm, can be easily interpreted because
QTC boasts high sensitivity, he says. But novel skin capabilities
could be on the way, too. For example, QTC can also act as an
electronic nose. Careful choice of the material's base polymer, says
Taysom, means telltale resistance changes can be induced by reactions
between volatile chemicals in the air - so it can become an e-nose as
well as a touch sensor, able to detect, for example, a gas leak in
your home. "This shows we can probably build into robots a lot of
things that our skin can't do. It's another reason not to stick
rigidly to the human skin metaphor," says Whiton.

That's not to say our skin isn't a great influence. Shadow Robot will
soon start testing a novel human-like touch-sensing fingertip from
Syntouch, a start-up based in California. Its fingertip comprises
a rubbery fluid-filled sac that squishes just like a real fingertip,
and is equipped with internal sensors that measure vibration,
temperature and pressure.

Whichever of the emerging technologies prevails, sensing robot skins
should help us get along with our future humanoid assistants, says
Whiton. "Right now, robots are about as friendly as photocopiers. The
interactions skins encourage will make them much friendlier."

Parlez-vous robot?
If you've ever tried to direct a lost tourist to their intended
destination, you'll know how difficult it can be to guide someone
who doesn't speak your language. Directing robots presents a
related challenge.

Typically, robots respond well to precise instruction sets but they
are flummoxed if their instructions are given in the fuzzy, everyday
language so beloved by humans. Now a team at the University of
Washington in Seattle have developed translation software which could
enable robots to understand a set of natural-language directions. The
technology could make it easier to control robots in situations like
search and rescue, where it can be preferable to send a robot rather
than a human.

Cynthia Matuszek and her colleagues used the principles of
machine translation - commonly used online to translate text from
one language into another - to develop a navigation program for
robots. Machine translation tools are designed to learn from previous
efforts, improving their accuracy through experience.

The team first sent a small mobile robot to explore and map portions
of two buildings on campus. The researchers then generated random
paths through the maps and asked human volunteers to annotate the
routes with natural-language commands, such as "turn right" or
"take the second left", that would have led to successful
completion of each path.

Matuszek used these maps to train the navigation program, which
learned to associate the various human commands with specific types of
route-finding behaviour.
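The team's system is not described in code, but the flavour of the
approach can be sketched as a simple statistical mapping learned
from the annotated routes: count which route-finding action each
phrase was paired with, then pick the most frequent action at
translation time. Everything here - the pair format, the action
labels and the function names - is invented for illustration.

    from collections import Counter, defaultdict

    def train(annotated_pairs):
        """Count how often each phrase co-occurs with each action.

        annotated_pairs: (phrase, action) tuples harvested from the
        volunteer-annotated routes, e.g. ("turn right", "RIGHT").
        """
        counts = defaultdict(Counter)
        for phrase, action in annotated_pairs:
            counts[phrase.lower()][action] += 1
        return counts

    def translate(counts, directions):
        """Map each spoken step to its most frequent action,
        skipping phrases never seen during training."""
        return [counts[step.lower()].most_common(1)[0][0]
                for step in directions if step.lower() in counts]

    model = train([("turn right", "RIGHT"), ("turn right", "RIGHT"),
                   ("take the second left", "LEFT_2")])
    print(translate(model, ["turn right", "take the second left"]))
    # -> ['RIGHT', 'LEFT_2']

A machine-translation system in the proper sense would also learn
from word order and context, improving with every annotated route,
but the principle of learning associations from examples is the same.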

The navigation program was then run on a virtual robot, which was
given natural-language directions for a variety of previously unknown
routes through the maps. The virtual robot was able to successfully
complete 10 of the 14 direction sets on its first attempt. The results
were presented at the International Conference on Human-Robot
Interaction in Osaka, Japan, in March.

"I'm glad to see work that is getting back to the original dreams of
the field, like having a robot that you can talk with naturally," says
Ray Mooney, a machine-translation researcher at the University of
Texas at Austin.

He says that previous attempts to give robots instructions have
favoured explicit rules for sentence structures, semantics and syntax.
He says this "traditional" approach is labour intensive and requires
strictly defined commands, making it cumbersome in emergency
situations where many robots are deployed.

