Tapping In: The Promise of Brain-Computer Interface

For decades, neuroscientists have sought to use electronics to communicate with the brain. Computing and surgical technique have now become sophisticated enough to implant devices directly into neural tissue. In this feature, researchers at Albany Medical Center and the Wadsworth Center at the New York State Department of Health reveal a world where mind and machine merge. Their cutting-edge devices translate brain signals into action, helping people with ALS and other disabilities regain the ability to communicate.


Wiring the Brain 

When neurologist Anthony Ritaccio enters the operating room, the procedure is well underway: two neurosurgeons have retracted a portion of John's* scalp and removed a section of skull about the size of a playing card. "We're getting close," Ritaccio says, leaning over the pale, pulsing brain to inspect the handiwork. In minutes, the surgeons will carefully slip a plastic sheet of electrodes onto the tissue and stitch the incision around the eight attached wires. The next day, the wires will be linked to an experimental computer system that will, in a way, read John's mind.

Ritaccio and his colleagues at Albany Medical Center in Albany, New York, are assembling a next-generation brain-computer interface, or BCI. By recording John's brain activity as he concentrates on various words, researchers hope to develop a system that will let people communicate without speaking.

"If we can tell from the brain what words a person wants to say to their loved ones, that will be a boon for people who are disabled and otherwise cannot communicate," says Dr. Gerwin Schalk, a collaborator on the project from the Wadsworth Center at the New York State Department of Health. While a few simple, nonsurgical BCI systems are already in the homes of people who need them, the Albany Medical Center operation represents a new frontier of the technology-direct connection with the brain. "As the technology gets better and better, these techniques may actually be useful for people who are less disabled or aren't even disabled at all," says Schalk.

BCI research has snowballed in the past 15 years, expanding from a handful of laboratories to hundreds of projects all over the world. Like all BCI efforts, the Albany team's is founded on the fact that both brains and computers communicate via electricity. Each brain cell, or neuron, transmits electrical pulses, which can be measured, to other neurons. This transmission generates complex patterns across the neural network that change from second to second.

For decades, scientists have been trying to record these shifting electric signals with electronic devices. Until recently, progress has been slow. "In the 1930s, electrodes were glued to the outside of the scalp," says Ritaccio. "And that's essentially the way most folks are doing it now."

A typical simple BCI system today employs a snug cap embedded with electroencephalography (EEG) sensors, which detect electrical signals that emanate through the scalp. An amplifier boosts the signals, and a computer processes them. Over the last two decades, the Albany BCI effort, which brings together neuroscientists, electrical engineers, computer scientists, and clinicians, has been among the first to get EEG-based systems out of the lab and into home use. The six current users are almost totally paralyzed because of ALS (amyotrophic lateral sclerosis, or Lou Gehrig's disease). The BCI allows them to send messages to other people using their minds alone. A computer rapidly displays groups of letters, and the user pays attention to the one she wants. The computer looks for the pattern of brain signals that underlies the user's flash of recognition of her chosen letter and then types it.
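The selection loop described above can be sketched in simplified form. This is a hypothetical illustration, not the Albany team's software: a real system scores each flash with a classifier trained on the user's EEG recordings, so the `recognition_score` function below is a made-up stand-in for that signal-processing step.

```python
import random

LETTERS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def recognition_score(letter, target):
    """Stand-in for the EEG classifier: returns a noisy score that is
    higher when the flashed letter is the one the user is attending to."""
    base = 1.0 if letter == target else 0.0
    return base + random.gauss(0, 0.2)

def select_letter(target, repetitions=10):
    """Flash every letter several times, accumulate the classifier's
    scores, and "type" the letter with the strongest summed response."""
    scores = {ch: 0.0 for ch in LETTERS}
    for _ in range(repetitions):
        for ch in LETTERS:          # one "flash" of each letter
            scores[ch] += recognition_score(ch, target)
    return max(scores, key=scores.get)

print(select_letter("H"))  # usually "H", given enough repetitions
```

Repeating the flashes and summing the scores is what makes the scheme workable despite noisy signals, and it is also why such systems are slow: each typed letter costs many flash cycles.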

The users also help the researchers test the systems. Catherine Wolf was a psychologist at IBM researching technology to help people interface with computers when she noticed her first signs of ALS in 1996. Wolf gradually lost control of all her muscles except a few in her face and eyes, and now she relies on an EEG-based BCI to communicate. Wolf can convey only one or two words a minute, and the systems are not always reliable or accurate, so her feedback helps troubleshoot the devices. Still, says Wadsworth's Jonathan Wolpaw, the BCI's chief architect, "if you can restore independence, even with a system that's very simple and may be very slow, it can be very valuable. People can become more productive and lead more enjoyable lives." Within the next year, the team will expand the BCI's reach to 25 people through Veterans Administration hospitals around the country, and it is now seeking to improve the system with the means to move a cursor and access the Web.

One approach to improving the speed and detail of BCI systems is to get closer to the neurons. "Brain electrical activity is very weak and has to sift through many layers," including scalp, bone, and the meningeal membranes, says Ritaccio. "By the time this weak electrical activity bubbles up to the surface, much of it has been lost," he says. Some of the most cutting-edge BCI projects are now using a method called electrocorticography, or ECoG, which reads electrical activity directly from the brain surface.

"ECoG recordings are typically used in people with epilepsy or brain tumors to locate the source of seizures," explains Schalk. The Albany BCI team is now using ECoG arrays implanted for epilepsy investigations as prototypes for sensitive BCI's that promise to do much more than EEG-based ones. Patients participate in the BCI study during their one-week postoperative stay, performing experiments that help the researchers develop a system that works.

John is one of those patients. In a hospital room after the surgery, with his head swathed in bandages, John prepares to experiment with Schalk. "The brain produces different types of activity for different types of behavior," says Schalk, as he connects the wires from John's electrode array to a computer. "At the most basic level, we can differentiate [some] behaviors simply by judging where activity changes in the brain," says Schalk. "For example, moving my hand will produce activity changes in [one area], whereas speaking different types of words will produce changes in different areas."

When John speaks the sound fragments that make up words (called phonemes), the computer senses the electrical activity that initiates physical changes in the larynx and other body parts that help him form the word. The activity emerges on the monitor as a pattern of red dots plotted on a map of John's brain. The bigger the dot, the more active the area.

John's brain produces these patterns even if he only imagines the word and doesn't say it aloud. So, after recording the patterns associated with particular words, the ECoG-based system can then reverse the process: analyze a pattern to decipher what word John is thinking. In some cases, the system can even distinguish words such as "set" and "sat."
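At its core, the "reverse" step the article describes is pattern matching: compare a newly recorded activity pattern against templates recorded earlier for each word, and pick the closest. The toy sketch below assumes patterns are simple numeric vectors and uses nearest-neighbor matching; actual ECoG decoding works on high-dimensional signals with trained statistical models, and the template values here are invented for illustration.

```python
import math

# Hypothetical per-word activity templates recorded during training.
templates = {
    "set": [0.9, 0.1, 0.4],
    "sat": [0.7, 0.6, 0.2],
}

def decode(pattern):
    """Return the stored word whose template lies nearest (by Euclidean
    distance) to the new activity pattern."""
    return min(templates, key=lambda w: math.dist(templates[w], pattern))

print(decode([0.85, 0.15, 0.35]))  # closest to the "set" template
```

Distinguishing "set" from "sat" is hard precisely because their templates sit close together, so small recording noise can push a new pattern toward the wrong neighbor.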

The ECoG-based system is still in the experimental phase, but it already has far greater resolution and speed than an EEG cap. And this experiment is just the beginning. Beyond helping the disabled, researchers recognize that BCI technology may someday achieve our wildest imaginings, from pilots controlling airplanes with their thoughts to soldiers communicating without spoken words behind enemy lines. "Certainly, this type of research requires some ethical scrutiny," says Ritaccio. "But these are the kind of things that are not too far out in the distant future."

*Not his real name.