Recently, a neural implant allowed a paralyzed person to write text with thought alone, but now a new implant-controlled robotic arm is making waves by sending tactile feedback to its user through a second implant, according to a recent study published in the journal Science.
A robotic arm with a sense of touch drastically improves efficiency
Typically, when we want to pick up an object with our hands, we use vision to locate it. Afterward, our other senses take over. A little-known sense called proprioception keeps us aware of where our body parts are in space. Then our sense of touch tells us how firmly we have grabbed the object, relegating vision to a secondary role.
However, early robotic arms demanded that users rely on visual perception for the entire process of interacting with the world. With vision alone, judging the firmness of a grip is a matter of estimation. It's better than no control at all, but the vision-only method costs users far more attention than the ability to "feel" what they're doing through touch and proprioception. Fortunately, scientists have vastly advanced our understanding of which regions of the brain process information relayed by sensory nerve cells in our hands. The new research involved two electrode arrays implanted in the region of the brain that processes information coming from the skin. When any of the 32 electrodes activate, the user experiences the sensation of something touching his palm and fingers.
The recent study involved a participant who was paralyzed from the neck down and had worked with a robotic arm for two years via brain implants placed in the motor-control region of his brain. The man could already use the robotic arm capably, despite lacking sensations many of us take for granted. In the recent experiments, however, the research team alternated between trials with the tactile feedback system switched on and trials with it switched off. The tests generally involved grasping variously shaped objects, carrying them to another location, and then releasing them.
More tests could see touch-capable robotic arms serve patients in need
Numerous tests showed a clear pattern: having a sense of touch dramatically improved performance. In the time it took the man to complete the series of tasks nine times with the touch system deactivated, he could complete the same series more than a dozen times with the system on. The most crucial advantage came in grasping objects. For example, the time between touching an object with the robotic arm and lifting it from the table fell by two-thirds with the feedback feature turned on. With it off, he spent more time adjusting the position of the robotic hand to achieve an effective grip before continuing.
While the results are extremely promising, the recent study is still an early-stage case involving only one participant, which means more tests are needed to establish how well the system works. But the results feel at least intuitively true to experience: we generally don't need to reserve attention for touching and grasping things as we move through the world, yet without these senses, our engagement with the world would be much, much slower. With additional testing and development, technology like this may eventually be rolled out into mainstream medical use for patients in need around the world.