Plastic skin gives sense of touch to robot hands
Engineers in the US have created a prosthetic ‘skin’ with embedded circuitry that can send messages directly to the brain.
The plastic skin can detect how hard it is being pressed and generate an electric signal to deliver this sensory input to a living brain cell.
It is part of a Stanford University project to develop flexible electronic fabric embedded with sensors that cover prosthetic limbs and replicate some of skin's sensory functions.
A 17-person team led by Zhenan Bao, a professor of chemical engineering at Stanford, has taken an exciting step toward that goal - replicating the sensory mechanism that enables us to distinguish the pressure difference between a limp handshake and a firm grip.
“This is the first time a flexible, skin-like material has been able to detect pressure and also transmit a signal to a component of the nervous system,” said Bao.
The technology uses a two-ply plastic construct.
The top layer creates a sensing mechanism and the bottom layer acts as the circuit to transport electrical signals and translate them into biochemical stimuli compatible with nerve cells.
In the latest study, the top layer features a sensor that can detect pressure over the same range as human skin.
The team scattered billions of carbon nanotubes through plastic that was embedded with a ‘waffle’ pattern.
Putting pressure on the plastic squeezes the nanotubes closer together and enables them to conduct electricity.
This allowed the plastic sensor to mimic human skin, which transmits pressure information to the brain as short pulses of electricity, similar to Morse code.
Increasing pressure on the waffled nanotubes squeezes them even closer together, allowing more electricity to flow through the sensor; the circuitry converts that rising current into a faster train of short pulses.
Reduce the pressure and the pulse rate slows, indicating a light touch. Remove all pressure and the pulses cease entirely.
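The pressure-to-pulse encoding described above can be sketched as a simple model. This is an illustrative toy only; the pressure range, maximum rate, and linear mapping are assumptions for the sketch, not figures from the Stanford study.

```python
def pulse_rate_hz(pressure_kpa, max_pressure_kpa=100.0, max_rate_hz=200.0):
    """Map applied pressure to a pulse frequency.

    More pressure squeezes the nanotubes closer together, raising
    conductance, which the circuitry turns into a faster pulse train.
    All constants here are illustrative, not from the study.
    """
    # Clamp pressure to the sensor's assumed working range.
    p = max(0.0, min(pressure_kpa, max_pressure_kpa))
    # No pressure at all means no pulses, mirroring the article.
    if p == 0.0:
        return 0.0
    # A simple monotonic (here linear) pressure-to-frequency mapping.
    return max_rate_hz * p / max_pressure_kpa

print(pulse_rate_hz(0))    # no touch: pulses cease
print(pulse_rate_hz(10))   # light touch: slow pulses
print(pulse_rate_hz(90))   # firm grip: fast pulses
```

The key property, as in human skin, is that pulse frequency rises monotonically with pressure, so a firm grip is distinguishable from a limp handshake by rate alone.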
The team then hooked this pressure-sensing mechanism to the second ply of their artificial skin, a flexible electronic circuit that could carry pulses of electricity to nerve cells.
They then had to prove that the electronic signal could be recognised by a biological neuron.
They adapted a previous technique from an emerging field that combines genetics and optics, called optogenetics.
Researchers bioengineer cells to make them sensitive to specific frequencies of light, then use light pulses to switch cells, or the processes being carried on inside them, on and off.
The team engineered a line of neurons to simulate a portion of the human nervous system.
They then translated the electronic pressure signals from the artificial skin into light pulses, which activated the neurons, proving that the artificial skin could generate a sensory output compatible with nerve cells.
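The translation step can be illustrated with a minimal sketch: each electrical pulse from the artificial skin becomes a timed flash of light for the engineered neurons. The function name and the flash duration are hypothetical, chosen only to make the idea concrete.

```python
def pulses_to_light(pulse_times_s, flash_width_s=0.005):
    """Translate an electrical pulse train into light on/off intervals.

    Each incoming pulse timestamp (in seconds) becomes one light flash
    of fixed duration; a denser pulse train (firmer press) therefore
    produces denser flashes. The 5 ms width is an illustrative value.
    """
    return [(t, t + flash_width_s) for t in pulse_times_s]

# A firm press yields a dense pulse train, hence rapid flashes.
flashes = pulses_to_light([0.00, 0.01, 0.02, 0.03])
print(flashes)
```

Because the optogenetically engineered neurons fire in response to the flashes, a sensory signal that began as mechanical pressure ends as biological activity.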
Optogenetics was only used as an experimental proof of concept, Bao said, and other methods of stimulating nerves are likely to be used in real prosthetic devices.
The team has already worked with chemical engineers to show that direct stimulation of neurons with electrical pulses is possible.
They admit that there is a long way to go: there are six types of biological sensing mechanisms in the human hand, and the experiment described in Stanford's new Science report demonstrates success with just one of them.
The team says its current two-ply approach means new sensations can be added as new sensing mechanisms are developed.