NATHAN COPELAND WAS 18 YEARS OLD WHEN A CAR ACCIDENT LEFT HIM PARALYZED. He can no longer move or feel much of his body, though he retains some movement in his shoulders and some sensation in his wrists and a few fingers. While he was in the hospital, he registered for an experimental study registry. Around six years ago, he received a call asking: "Would you like to join our study?"
A group at the University of Pittsburgh was looking for volunteers to test the hypothesis that someone could master the operation of a robotic arm by merely thinking about it. Research into brain-computer interfaces like this has been used to investigate everything from restoring movement to paralyzed people, to developing new prosthetic limbs, to translating thoughts into writing. Companies like Kernel and Elon Musk's Neuralink are betting that small electrodes implanted in the brain can read electrical activity and transfer that data to a computer. (No, you won't soon be able to download and play back memories.)
Copeland was happy to. He emphasizes that the inclusion criteria for studies like these are rather strict: you need to be in the appropriate condition, have the right kind of injury, and even live close to the right medical center. "I thought from the start: I can do it, I'm capable—so how can I not aid in advancing the science?"
Lentil-sized electrode arrays were surgically implanted in his motor cortex and somatosensory cortex. They would read the patterns of electrical activity in his brain that signaled his intent to move his wrist and fingers. A device known as a brain-computer interface (BCI) would translate those impulses into commands for a robotic limb mounted on a vertical stand next to him in the lab. Copeland started traveling three times a week from his home in Dunbar, Pennsylvania, to Pittsburgh for lab sessions. After three of them, he could direct the robot's movements and grasp just by thinking.
But that was only the start. According to research just published in Science, Copeland has since learned to feel whatever the robotic hand touches, as though through his own fingers. He had mastered controlling the hand over the prior few years: he would think about moving it and watch what it did in response. But as soon as the scientists gave him tactile feedback, he completely kicked ass, roughly tripling his speed on tasks. This is the first robotic-prosthesis BCI to combine motion commands and touch feedback in real time, and it marks a significant step toward demonstrating how BCIs might overcome the limitations of paralysis.
Touch is important for restoring mobility, says study author Jennifer Collinger, a biomedical engineer at the University of Pittsburgh, because to take full advantage of future BCI prosthetics or BCI-stimulated limbs, a user will need real-time tactile feedback from whatever their hand (or the robotic hand) is manipulating. Today, people who use prosthetic limbs can get past the loss of touch by watching whether the robotic fingers are gripping an object, but that workaround fails when the object is slippery, moving, or out of view. "You don't necessarily rely on eyesight for a lot of the activities that you do in everyday life," Collinger says. "Your sense of touch is what you use to interact with items."
The brain communicates both ways: It receives information while also directing the rest of the body to take action by sending signals. Even something as simple as picking up a cup requires your brain to control your hand muscles and pay attention to the nerves in your fingers.
In theory, Copeland's brain could still manage this dialog of inputs and outputs, because it had not been damaged in his injury. The problem was that most of the electrical signals sent toward his brain by his body's nerves were no longer getting through. When the Pittsburgh team enlisted him, they intended to design a workaround: electrical signals from a robotic arm could stimulate a paralyzed person's brain, and the brain would interpret that stimulation as the sensation of having their hand touched. Making it all feel natural was difficult. When Copeland intended the robotic wrist to rotate and the hand to close, and the robotic pinkie made contact with a hard object, he should feel it in his pinkie.
Of the four micro-electrode arrays implanted in Copeland's brain, two read movement intentions from his motor cortex to control the robotic arm, and the other two stimulate his sensory system. The research team understood from the beginning that they could use the BCI to give Copeland tactile experiences simply by applying electrical current to those electrodes; no physical touch, and no robot, was necessary.
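To make that bidirectional design concrete, here is a minimal sketch, in Python, of the kind of loop such a system implies: motor-cortex activity is decoded into robot commands, and the robot's contact forces are turned back into stimulation. Every function, name, and number here is invented for illustration; this is not the Pittsburgh team's software.

```python
# Toy sketch of a bidirectional BCI loop. All names and values are
# illustrative stand-ins, not the actual study's decoding or
# stimulation parameters.

def decode_intent(motor_activity):
    """Map recorded motor-cortex firing rates to a movement command (stub)."""
    return [0.1 * a for a in motor_activity]

def stimulate(sensory_array, contact_force):
    """Convert contact force on the robot hand into a stimulation amplitude."""
    amplitude = min(contact_force * 5.0, 100.0)  # clamp to a safety ceiling
    sensory_array.append(amplitude)  # stand-in for delivering current

motor_activity = [0.2, 0.8, 0.5]  # stand-in for firing rates from two arrays
sensory_array = []                # stand-in for the two stimulating arrays

command = decode_intent(motor_activity)  # motor cortex -> robot
contact_force = 4.0                      # robot touch-sensor reading
stimulate(sensory_array, contact_force)  # robot -> somatosensory cortex
```

The point of the sketch is only the two-way traffic: one path carries intent out, the other carries touch back in, and both run continuously.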
To build the device, the researchers took advantage of the fact that Copeland still has some sensation in his right thumb, index, and middle fingers. While he lay in a magnetic brain scanner, they touched those fingers with a Q-tip and mapped which precise brain regions corresponded to each one. Then, to learn his intent to move, they recorded brain activity from individual electrodes while he watched particular actions. And when they ran current through particular electrodes in his sensory system, he felt it. The sensations seem to originate from the bases of his fingers, near the top of his right hand. They are never painful; they can feel like natural pressure, warmth, or odder tingles. "I just stared at my hand and thought, 'Man, that feels like someone could be poking right there,'" Copeland says.
Once the researchers had confirmed that Copeland could feel these sensations, and knew which brain regions to stimulate to produce them, they closed the loop: sensors in the robotic hand triggered stimulation whenever the hand touched something. The feedback sped him up in part because it removed his hesitancy, says study coauthor Robert Gaunt. "If you can't feel your hands, you'll grab something and spend a lot of time wrapping around the object trying to fix it," he continues. With touch, it becomes: I've got it in my hands. Yes. I'm certain I grabbed it. OK. Now I can pick it up and move it.
This is a "major win," says Bolu Ajiboye, a neural engineer at Case Western Reserve University who was not involved with the project. Generating believable sensory signals, he says, implies that researchers can at least come close to fully mimicking authentic, natural movement.
And it's crucial, he adds, that there is no discernible lag. The delay as electrical impulses travel from the hand to the brain is around 30 milliseconds; the robot sends signals to the BCI every 20 milliseconds. According to Ajiboye, this underpins one of the most significant functions of touch in this system: it lets the user perceive the robot hand's movements in real time. And that sensation registers far faster than sight. Visual processing takes between 100 and 300 milliseconds, making vision the slowest form of feedback. Imagine grasping a slippery cup: "You'd drop the cup," Ajiboye says, if the only way you knew it was slipping out of your grasp was that you could see it.
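The arithmetic behind that comparison can be laid out explicitly. This is a back-of-envelope sketch using only the figures quoted above; the "worst case" assumes a touch event lands just after one update, so it is reported on the next 20-millisecond cycle.

```python
# Toy latency comparison using the figures quoted in the text.
NATURAL_TOUCH_MS = 30    # nerve conduction, hand to brain (approximate)
BCI_UPDATE_MS = 20       # robot reports to the BCI every 20 ms
VISION_MS = (100, 300)   # visual processing, fastest to slowest

# If a touch happens just after an update, the user feels it on the
# next cycle, so the worst-case feedback delay is one full update.
worst_case_feedback_ms = BCI_UPDATE_MS

comparable_to_touch = worst_case_feedback_ms <= NATURAL_TOUCH_MS
beats_fastest_vision = worst_case_feedback_ms < VISION_MS[0]
```

Even in the worst case, the stimulated feedback arrives within the window of natural touch and several times faster than the quickest visual feedback, which is why the slipping cup is felt before it could ever be seen.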
The system is still a proof of concept and has several limitations; it is not yet ready for home use. Copeland must come into the lab to operate the robotic arm; he cannot wear it or take it home. The team has, however, given him a scaled-down BCI that he can use to control his computer. "I was using Sega Genesis emulators," Copeland explains. "I ended up designing a cat, which I turned into an NFT."
It also requires a wired connection. "For me, the threshold would have to be a wireless system," says Rob Wudlick, a project manager in the University of Minnesota's Department of Rehabilitation Medicine who was paralyzed ten years ago. He is cautiously hopeful that BCIs, including those that drive robotic arms, can help people become less dependent on caregivers. Controlling a robot to give yourself water is a significant achievement, Wudlick says, but the main objective is to regain his freedom.
Copeland's experience doesn't always feel natural, and Gaunt's team is now examining why, and how best to manage grip force for fragile objects and more challenging tasks. For now, the focus remains on humanlike robot arms. But the researchers hope to adapt the approach to send signals to and from exoskeletons, or to stimulate people's own limbs into action, both of which have been tested in other labs. "You could imagine using an exoskeleton glove to augment that power if you have a weak grasp—help you open your hand, help you close your hand so you can make a firm enough grasp to hold on to an object without dropping it," Gaunt says.
The brain, then, can express its intentions through, and redirect its commands to, whatever hardware or body it is connected to. Though tethered to the flesh, it is not restricted by it.