A brain cap that translates thoughts into motion sounds like the stuff of science fiction, but researchers at the University of Maryland are turning the inconceivable into reality. Designed by José ‘Pepe’ L. Contreras-Vidal, an associate professor of kinesiology, the noninvasive, sensor-lined cap could soon harness brain waves to control computers, prosthetic limbs, motorized wheelchairs, and even digital avatars. At present, however, Contreras-Vidal and his team are focusing on helping paralyzed or disabled people extend their range of motion. “We are doing something that few previously thought was possible,” says Contreras-Vidal.


The brain cap fits snugly over the head and uses electroencephalography (better known as EEG) to read brain waves. Those readings are analyzed while a person walks on a treadmill or performs some other activity to determine how the brain fires during specific movements. Over the past 18 months, Contreras-Vidal and company have published three papers, including one in the Journal of Neurophysiology that demonstrates how EEG brain signals can reconstruct the complex movements of the ankle, knee, and hip joints of a person in motion.
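Studies of this kind typically fit a lagged linear model that maps recent EEG samples onto joint angles. The sketch below illustrates that general idea with ridge regression on synthetic data; all names, dimensions, and signals here are illustrative assumptions, not the team’s actual pipeline, which works on recorded EEG and motion-capture data.

```python
# Hypothetical sketch of lagged linear decoding, the general approach
# behind EEG-based reconstruction of joint kinematics. Data are
# simulated; nothing here reproduces the Maryland team's actual model.
import numpy as np

rng = np.random.default_rng(0)

# Simulated recording: 1000 samples of 32 EEG channels.
n_samples, n_channels, n_lags = 1000, 32, 10
eeg = rng.standard_normal((n_samples, n_channels))

def lagged_features(x, lags):
    """Stack each sample with its previous `lags - 1` samples."""
    rows = []
    for t in range(lags - 1, len(x)):
        rows.append(x[t - lags + 1 : t + 1].ravel())
    return np.asarray(rows)

X = lagged_features(eeg, n_lags)  # shape: (991, 320)

# Synthetic "joint angle" that truly depends linearly on the EEG lags,
# plus a little noise, so the decoder has something to recover.
true_w = rng.standard_normal(X.shape[1])
angle = X @ true_w + 0.1 * rng.standard_normal(len(X))

# Ridge regression: w = (X'X + aI)^(-1) X'y
a = 1.0
w = np.linalg.solve(X.T @ X + a * np.eye(X.shape[1]), X.T @ angle)
pred = X @ w

# Decoding accuracy is usually reported as the correlation between
# the reconstructed and measured trajectories.
r = np.corrcoef(pred, angle)[0, 1]
```

On real data the correlation is far below the near-perfect value this toy setup produces, since actual EEG is noisy and only partially predictive of limb motion.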

Unlike other brain-computer interface technologies in development, many of which require electrodes to be implanted directly in the brain, Contreras-Vidal’s cap is noninvasive and requires little training to use. The researchers have also received a $1.2 million National Science Foundation grant to work with their counterparts from Rice University, the University of Michigan, and Drexel University on a thought-controlled prosthetic arm for amputees that allows them to feel whatever it touches—just like a regular one would.

“There’s nothing fictional about this,” says Rice University co-principal investigator Marcia O’Malley, an associate professor of mechanical engineering. “The investigators on this grant have already demonstrated that much of this is possible. What remains is to bring all of it—noninvasive neural decoding, direct brain control, and [touch] sensory feedback—together into one device.”



Brain-cap research could also help victims of stroke relearn how their bodies work. “By decoding the motion of a normal gait,” Contreras-Vidal says, “we can then try and teach stroke victims to think in certain ways and match their own EEG signals with the normal signals.”

Steven Graff, a first-year bioengineering doctoral student, envisions a virtual-reality game that matches EEG data with the characters on screen. “It gives us a way to train someone to think the right thoughts to generate movement from digital avatars,” he says. “If they can do that, then they can generate thoughts to move a device.”

Graff has congenital muscular dystrophy and uses a motorized wheelchair. The technology he’s helping bring to fruition could someday allow him the freedom of using both hands—say to dial a phone or put on a jacket—while propelling his chair forward with his mind.
