The brain works in two directions: it takes in information while sending signals to the rest of the body, telling it to act. Even a movement as simple as grabbing a cup requires your brain both to control the muscles of your hand and to listen to the nerves of your fingers.
Because Copeland’s brain had not been injured in his accident, he could still – in theory – handle this two-way dialogue of input and output. But most of the electrical messages from the nerves in his body no longer reached his brain. When the Pittsburgh team recruited him for their study, they wanted to design a workaround. They believed that a paralyzed person’s brain could both control a robotic arm and be stimulated by electrical signals from it, ultimately interpreting that stimulation as the sensation of being touched on one’s own hand. The challenge was to make it all feel natural. The robotic wrist should twist when Copeland intends to twist it; the hand must close when he intends to grab; and when the robotic little finger touches a hard object, Copeland should feel it in his own little finger.
Of the four arrays of microelectrodes implanted in Copeland’s brain, two grids read movement intentions from his motor cortex to control the robotic arm, and two stimulate his sensory cortex. From the start, the research team knew they could use the BCI to create a tactile sensation for Copeland simply by delivering electric current to these electrodes – no contact or robotics required.
To build the system, the researchers took advantage of the fact that Copeland retains some sensation in his right thumb, index, and middle fingers. They rubbed a cotton swab over those fingers while he lay in a magnetic resonance imaging scanner and worked out which specific folds of his brain correspond to them. The researchers then decoded his intentions to move by recording brain activity from individual electrodes as he imagined specific movements. And when they delivered current to specific electrodes in his sensory cortex, he felt it. To him, the sensations seemed to come from the base of his fingers, near the top of his right palm. They might feel like natural pressure or warmth, or a strange tingling, but they were never painful. “I’d actually just look at my hand, like, ‘Dude, it really feels like somebody might be poking me there,’” Copeland said.
Once the researchers had established that Copeland could feel these sensations, and knew which areas of his brain to stimulate to create sensations in different parts of his hand, the next step was simply to get Copeland used to controlling the robotic arm. He and the research team set up a training room at the lab, hanging up Pac-Man posters and cat memes. Three days a week, a researcher hooked the electrode connectors on his scalp to a string of cables and computers, then timed him as he grabbed blocks and spheres and moved them from side to side. Over a few years he became very good; he even demonstrated the system for then-President Barack Obama.
But then, says Collinger, “he kind of plateaued at his high level of performance.” A non-paralyzed person needs roughly five seconds to complete the object-moving task. Copeland could sometimes do it in six, but his median time hovered around 20.
To break through that plateau, it was time to try giving Copeland real-time tactile feedback from the robotic arm.
Human fingers sense pressure, and the resulting electrical signals travel along thread-like axons from the hand to the brain. The team mirrored this sequence by placing sensors on the robot’s fingertips. But objects don’t always touch the fingertips, so a more reliable signal had to come from somewhere else: the torque sensors at the base of the mechanical fingers.
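The basic idea of this feedback loop can be sketched as a simple mapping: when the torque at a finger’s base exceeds some contact threshold, scale it into a stimulation amplitude for that finger’s electrodes. This is a minimal illustration only; the function name, threshold, and current values below are hypothetical, not the Pittsburgh team’s actual parameters.

```python
def torque_to_stimulation(torque_nm: float,
                          threshold: float = 0.05,
                          max_torque: float = 1.0,
                          max_amp_ua: float = 60.0) -> float:
    """Map a finger-base torque reading (newton-meters) to a
    stimulation amplitude (microamps). All numbers are illustrative."""
    if torque_nm < threshold:
        # Below the contact threshold: no stimulation at all.
        return 0.0
    # Scale linearly between the threshold and a saturation torque,
    # clamping so the amplitude never exceeds the safe maximum.
    frac = min((torque_nm - threshold) / (max_torque - threshold), 1.0)
    return round(frac * max_amp_ua, 1)

# Light touch produces a faint sensation; a firm grasp, a strong one.
print(torque_to_stimulation(0.02))  # no contact
print(torque_to_stimulation(1.5))   # saturated grasp
```

The threshold keeps sensor noise from producing phantom touches, and the clamp keeps the current within a fixed ceiling regardless of how hard the hand presses.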