The bionic hand closes slowly. Its slender metal digits whirr as they jitter into a loose fist, as though they are wrapping around an invisible baton. “OK, closed,” says the test subject.
The test subject is Amanda Kitts. In 2006, a Ford F350 hit her Mercedes sedan head-on. The collision tore the truck’s tire from its chassis and shoved the axle into Kitts’ car, where it nearly severed her arm. “It wasn’t completely off, but it was mincemeat,” she says. “There was no saving it. So the surgeons pretty much cut it straight off, like you would a piece of wood.”
More than a decade later, she’s sitting in a white-walled room inside Cleveland Clinic medical center controlling a bionic hand via a thermoplastic prosthetic socket, which wraps around her upper body. It envelops what remains of her left arm, which ends a few inches below her shoulder. There, nerves that once commanded her hand and forearm—surgically rerouted after her amputation—now innervate her biceps and triceps muscles. The socket relays electrical signals from those muscles to the computerized, motor-driven hand, which she operates with her mind. I want to close my hand, she thinks, and the hand complies.
“You can provide someone a feeling of a hand movement, and a sense of authorship over that movement.”
But what’s impressive here isn’t that Kitts can manipulate the hand. Motorized prostheses have improved dramatically in recent years, providing increasingly sophisticated options for restoring dexterous movement.
What’s remarkable is that she knows what her bionic appendage is up to, in spite of her blindfold and the noise-cancelling headphones that cover her ears. Kitts can feel the hand’s movement, sense its position in space, and it’s unlike anything she’s ever experienced. “Being able to close the hand and feeling that it’s closed and knowing that it’s closed. That’s what’s amazing,” she says. As the robotic fist loosens, she mirrors its conformation with her right hand. “Open,” she says, when its fingers—which feel like her fingers—reach their full extension.
Neurophysiologists call awareness of the movement and position of one’s body parts kinesthesia. (The more general term is proprioception, though it refers more to position than movement.) When an able-bodied person moves her hand, sensorimotor signals inform her brain where and how it’s moving. Kinesthesia is what lets her seize a falling bottle of shampoo in the shower, or shoulder her backpack with her right hand while staring at the phone in her left. The sensation is entirely distinct from touch, yet kinesthesia is equally if not more important for complex motor tasks.
But today, even the most sophisticated prosthetic hands provide no kinesthetic feedback. “That means the only way to know where your prosthesis is, is to watch it,” says Paul Marasco, a neuroscientist and—deep breath—sensorineural physiologist at the Cleveland Clinic. An upper limb amputee does not simply open a door. Rather, he sees the door handle. Watches his prosthetic hand reach for the handle. Watches his prosthesis grasp the handle. Turn the handle. Pull the handle. And so on.
But in the latest issue of Science Translational Medicine, researchers led by Marasco describe a neural interface that mimics kinesthetic feedback via a prosthesis, restoring the sense in test subjects with upper limb amputations—test subjects like Kitts. The technique enables patients like her to not only improve their control of a robotic hand, but perceive its intricate movements—wrapping fingers around an invisible cylinder, for example. “We’ve tapped into people’s perceptual integration system,” Marasco says.
The method hinges on an extraordinary phenomenon that neurophysiologists call vibration-induced kinesthetic illusions: Vibrating a tendon at a frequency between 70 and 115 Hz makes you feel like its associated joint is moving. The illusions can involve multiple joints, and are potent enough to fool people into sensing that their arms are bending into weird, or even impossible, shapes. They can also implicate other body parts.
Take the Pinocchio illusion: Applying vibration to the tendon of your biceps muscle while you pinch the tip of your nose will convince you that your elbow joint is extending and drawing your hand away from your face. But the brain likes a clean story. So when you sense the joint extending, you sense your nose extending with it. “These illusions, they’re incredibly powerful,” Marasco says. “They can override your sense of what’s real.”
Marasco and his colleagues wanted to know whether they could make a bionic limb feel as real as a biological one.
To find out, they first applied vibrations to the biceps, triceps, and pectoralis muscles of six test subjects—sites where nerves that previously led to the subjects’ lower arms had been reattached—and asked them to mirror the perceived movement of their missing hands with their remaining ones.
What happened next astonished the researchers. They’d expected their test subjects to feel movements at individual fingers or joints. “Instead, we got these highly synergistic grip conformations,” Marasco says. Test subjects’ missing hands assumed a variety of positions—called “percepts”—that involved all their fingers moving in a concerted way. Marasco’s team recorded 22 distinct percepts in total, the most highly conserved between test subjects being the loose fist—aka “cylinder” grip—previously described. Other common percepts included the “tripod” grip, in which the tips of the thumb, middle, and forefinger unite, and the thumb-meets-index-finger “fine pinch” grip.
“Nobody had demonstrated that these illusions could have an impact on patients with amputations,” says bioengineer Christian Cipriani, head of the Artificial Hands Area at the BioRobotics Institute of the Scuola Superiore Sant’Anna, Pisa, who was unaffiliated with the study. “The other cool thing is that nobody actually knew that this phenomenon could be induced in nerves that had been reinnervated elsewhere in the arm,” he adds—let alone so clearly, in such distinct hand positions.
Marasco doesn’t know why his subjects perceived their missing hands to be assuming such complex configurations. But he has a hypothesis: Rather than build each percept from scratch, the brain chunks the hand’s infinite configurations into more manageable building blocks. Then it fine-tunes them to match the sensation of the actual movement. To sense the movement and position associated with grabbing a glass, your brain might cue up a cylinder-grip percept; while lifting a small pile of crumbs from the table, it might recruit the tripod-grip percept. “What we think we’re seeing here, essentially, is a library of building blocks,” Marasco says.
But library access wasn’t good enough. Vibrating a patient’s arm to move their missing hand is a neat trick, but therapeutically useless. To be truly helpful, the researchers needed to combine the perception of hand movement with the intention of hand movement.
To do it, they developed a bidirectional neural-machine interface. Like existing prosthetics, it could relay electrical signals from patients’ reinnervated muscles to a bionic hand. To this one-directional system, the researchers then added a kinesthetic feedback signal; when the hand moved in response to a test subject’s thoughts, it also triggered vibrations at their reinnervation sites, producing the kinesthetic illusion.
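The closed loop described above can be caricatured in a few lines of code. This is a purely conceptual sketch—the class and function names here (`HandMotor`, `VibrationTactor`, `control_step`) are invented for illustration, and the study’s actual hardware and signal processing are far more involved—but it shows the two directions of the interface: muscle signals drive the hand, and the hand’s movement triggers vibration at the reinnervation site.

```python
# Conceptual sketch of a bidirectional neural-machine interface loop.
# All names are hypothetical; this is not the study's implementation.

class HandMotor:
    """Stand-in for the motor controller of a prosthetic hand."""
    def __init__(self):
        self.closure = 0.0  # 0.0 = fully open, 1.0 = fully closed

    def drive(self, command):
        # Move to the commanded grip closure, clamped to [0, 1].
        self.closure = max(0.0, min(1.0, command))
        return self.closure


class VibrationTactor:
    """Stand-in for a vibration motor over a muscle reinnervation site."""
    def __init__(self, frequency_hz=90):
        # Kinesthetic illusions arise at roughly 70-115 Hz.
        self.frequency_hz = frequency_hz
        self.active = False

    def set_active(self, moving):
        # Vibrate only while the hand is moving, so the illusory
        # sense of movement tracks the hand's actual movement.
        self.active = moving


def control_step(emg_level, hand, tactor):
    """One pass of the loop: muscle signal in, movement plus feedback out."""
    previous = hand.closure
    position = hand.drive(emg_level)          # forward path: EMG -> motor
    tactor.set_active(position != previous)   # feedback path: movement -> vibration
    return position
```

In this toy version, a muscle signal of 0.5 closes the hand halfway and switches the tactor on; a repeated identical signal produces no movement, so the vibration—and with it the illusion—stops. The real system closes this loop continuously and in real time.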
When the real-time illusory feedback matched the subjects’ intention, it improved their control within minutes. For instance, when the researchers linked the cylinder-grip signal to the movement of a virtual prosthesis, the feedback enabled patients to close their bionic hands a quarter, half, or three-quarters of the way—without looking at them. Notably, the test subjects with amputations performed the task as well as an able-bodied cohort.
Next, to demonstrate its clinical application, the researchers incorporated the kinesthetic feedback into a prosthetic limb—one they fitted to Kitts.
Strapped on, the limb began to dissolve the boundaries between operation and embodiment—that is, the distinction between controlling an arm, and controlling your arm. “The biggest thing about this study is that it shows you can provide someone a feeling of a hand movement, and a sense of authorship over that movement,” says Todd Kuiken, director of the Neural Engineering Center for Artificial Limbs at the Rehabilitation Institute of Chicago, who was unaffiliated with the study. “Even if it’s a crude open and close, something’s a whole lot better than nothing. And this research shows you can give them something.”
The challenge, he says, will be developing systems that can collect and deliver more nuanced signals. “It’s a real estate problem, because everything’s in the same place—you’re trying to provide feedback to the same skin you’re trying to record signals from,” Kuiken says. “It’s not impossible, but it’s not easy.” Fortunately, he says researchers are already experimenting with embedded electrodes for collecting signals. “It might free up some space for the stuff Marasco is studying now.”
Operating the closed loop system, Kitts says, was transformative. “You know, I did pattern recognition for years—training myself to operate a prosthetic hand—and that was great,” she says. “It was fantastic when I was finally able to move my arm by thinking about it. But this? This is going to take things to an entirely different level.”
Revision 29 Created On March 14, 2018 At 11:13 AM