There is no doubt that brain-computer interface technology has the potential to improve function and quality of life for people who are unable to use their own arms. This was recently demonstrated by a woman with quadriplegia who shaped the nearly human hand of a robot arm with just her thoughts, using it to pick up a ball, a rock, large and small boxes, and thick and thin tubes.


The findings, from the University of Pittsburgh School of Medicine, have been published in the Journal of Neural Engineering and describe 10-degree-of-freedom brain control of a prosthetic device, in which the participant used the arm and hand to reach for, grasp and place a variety of objects.


"Our project has shown that we can interpret signals from neurons with a simple computer algorithm to generate sophisticated, fluid movements that allow the user to interact with the environment," said senior investigator Jennifer Collinger, Ph.D., assistant professor, Department of Physical Medicine and Rehabilitation (PM&R), Pitt School of Medicine, and research scientist for the VA Pittsburgh Healthcare System.


The trial participant, Jan Scheuermann, had small electrode grids, each with 96 tiny contact points, surgically implanted in the regions of her brain that control right arm and hand movement. The electrode points picked up signals from individual neurons and relayed them to a computer, which identified the firing patterns associated with observed or imagined movements. This decoding technique was then used to direct the movements of a prosthetic arm developed by the Johns Hopkins University Applied Physics Laboratory.
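The article does not describe the decoding algorithm in detail. As a rough, hypothetical illustration of the general approach, the Python sketch below fits a simple linear map from the firing rates of 96 recorded contacts to an intended movement command during a calibration phase and then uses it to decode new activity; the function names, regression method and simulated data are assumptions for illustration, not the study's actual pipeline.

```python
import numpy as np

def fit_decoder(firing_rates, intended_velocity):
    """Fit a linear map from neural firing rates to intended movement.

    firing_rates      : (n_samples, n_units) spike counts per time bin
    intended_velocity : (n_samples, n_dims) cued or imagined movement
    """
    # Append a constant column so the fit can include a baseline rate per unit.
    X = np.hstack([firing_rates, np.ones((firing_rates.shape[0], 1))])
    weights, *_ = np.linalg.lstsq(X, intended_velocity, rcond=None)
    return weights

def decode(firing_rates, weights):
    """Turn one time bin of firing rates into a movement command."""
    x = np.append(firing_rates, 1.0)
    return x @ weights

# Toy example: 96 recorded units (one per electrode contact), 3-D output.
rng = np.random.default_rng(0)
rates = rng.poisson(5, size=(500, 96)).astype(float)  # simulated spike counts
cued = rng.normal(size=(500, 3))                       # simulated cued velocities
W = fit_decoder(rates, cued)
command = decode(rates[0], W)                          # 3-D command for the arm
```

In a setup like this, the decoder would be re-fit during calibration sessions of the kind described later in the article, when the participant watches or imagines cued movements while neural activity is recorded.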


Within just a week of the surgery, Ms. Scheuermann was able to reach in and out, left and right, and up and down with the arm, achieving 3D control. Within three months she could also flex the wrist back and forth, move it from side to side and rotate it clockwise and counterclockwise, as well as grip objects, adding up to 7D control. She ultimately mastered 10D control, shaping the robot hand into different positions while simultaneously controlling the arm and wrist.


To reach 10D control, the single pincer grip was replaced with four hand shapes: finger abduction, scoop, thumb opposition, and a pinch of the thumb, index and middle fingers. To train the system, Ms. Scheuermann watched animations of these movements and imagined performing them.
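As an illustration of what the ten controlled dimensions add up to, the sketch below groups the three reach dimensions, three wrist dimensions and four hand shapes into a single command vector. The field names and grouping are assumptions based on the description above, not the study's own terminology.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class ArmCommand:
    """Hypothetical 10-D command: 3 reach + 3 wrist + 4 hand-shape values."""
    translation: np.ndarray = field(default_factory=lambda: np.zeros(3))  # in/out, left/right, up/down
    wrist: np.ndarray = field(default_factory=lambda: np.zeros(3))        # flexion, deviation, rotation
    hand: np.ndarray = field(default_factory=lambda: np.zeros(4))         # abduction, scoop, thumb
                                                                          # opposition, three-finger pinch

    def as_vector(self) -> np.ndarray:
        """Flatten into the 10-D vector sent to the prosthetic controller."""
        return np.concatenate([self.translation, self.wrist, self.hand])

cmd = ArmCommand()
assert cmd.as_vector().shape == (10,)
```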


"Jan used the robot arm to grasp more easily when objects had been displayed during the preceding calibration, which was interesting," said co-investigator Andrew Schwartz, Ph.D., professor of Neurobiology, Pitt School of Medicine. "Overall, our results indicate that highly coordinated, natural movement can be restored to people whose arms and hands are paralysed."


Ms. Scheuermann is thrilled with the results of the study and feels that it has enriched her life, giving her new friends and co-workers.


Source: University of Pittsburgh Schools of the Health Sciences

