University of Houston researchers build brain machine interface to control prosthetic hand

A research team from the University of Houston has created an algorithm that allowed a man to use his thoughts to grasp objects with a prosthetic hand. The team demonstrated its technique on a 56-year-old man whose right hand had been amputated.

Researchers captured brain activity through non-invasive scalp monitoring to determine which parts of the brain are involved in grasping an object. With that information, they created a brain-machine interface (BMI) program that translated the amputee's intended movements into commands, allowing him to successfully grasp objects, including a water bottle and a credit card. The subject grasped the selected objects 80% of the time using a high-tech bionic hand fitted to his residual limb.

New University of Houston research has demonstrated that an amputee can grasp with a bionic hand, powered only by his thoughts.

Source: University of Houston


The results of the study were recently published in Frontiers in Neuroscience.

According to a press release, this work demonstrates for the first time EEG-based BMI control of a multi-fingered prosthetic hand for grasping by an amputee and could lead to the development of better prostheses. The researchers said the study offers a new understanding of the neuroscience of grasping and will be applicable to rehabilitation for other types of injuries, including stroke and spinal cord injury.

The researchers used a 64-channel active EEG cap, with electrodes attached to the scalp, to capture brain activity in five able-bodied, right-handed men and women, all in their 20s, as well as in the amputee. Brain activity was recorded in multiple areas and was found to occur between 50 and 90 milliseconds before the hand began to grasp, evidence that the brain predicted the movement rather than reflecting it, the researchers reported. They then used the recorded data to build decoders that translated neural activity into motor signals, which successfully reconstructed the grasping movements.
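To give a sense of what "decoding neural activity into motor signals" can look like in practice, here is a minimal sketch of a linear decoder mapping 64-channel EEG features to hand kinematics. This is an illustration only, not the study's actual method: the synthetic data, the ridge-regression decoder, and all names and parameters below are assumptions.

```python
import numpy as np

# Hypothetical sketch: decode grasp kinematics from EEG features with a
# linear (ridge-regression) decoder. Data here are synthetic; the real
# study's decoder and preprocessing details are not reproduced.

rng = np.random.default_rng(0)

n_samples, n_channels, n_outputs = 500, 64, 3  # 64-ch EEG, 3 kinematic outputs
true_W = rng.normal(size=(n_channels, n_outputs))  # unknown "true" mapping

X = rng.normal(size=(n_samples, n_channels))       # EEG features per time step
Y = X @ true_W + 0.1 * rng.normal(size=(n_samples, n_outputs))  # kinematics

def fit_ridge(X, Y, alpha=1.0):
    """Closed-form ridge regression: W = (X'X + alpha*I)^-1 X'Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)

W = fit_ridge(X, Y)
Y_hat = X @ W  # reconstructed grasp trajectories

# Correlation between decoded and actual trajectories, per output dimension
corrs = [np.corrcoef(Y[:, i], Y_hat[:, i])[0, 1] for i in range(n_outputs)]
print(corrs)
```

On this synthetic data the decoded trajectories correlate strongly with the true ones; real EEG decoding is far noisier and typically involves careful feature extraction and cross-validation.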

They then fitted the amputee with a computer-controlled neuroprosthetic hand and told him to observe and imagine himself controlling the hand as it moved and grasped the objects.

The researchers used the EEG data from the amputee as well as the information collected from the able-bodied volunteers to build the algorithm. They believe additional practice, along with refining the algorithm, could increase the success rate to 100%.


Contreras-Vidal JL, et al. Front. Neurosci. 2015; doi:10.3389/fnins.2015.00121.

Disclosure: The authors report the study was funded by the National Science Foundation.
