Mind over machine

Scientists are developing ways to translate thoughts into the actions of machines, and a lot of scientists are working on it: during the annual meeting of the American Association for the Advancement of Science (AAAS) this past week, I could have spent two straight days at symposia, lectures, and demonstrations devoted to robotics, neuroprosthetics, and other aspects of mind and machine alone. While I’d heard of most of the findings and projects before, seeing them in action during a press conference on Friday brought home how far we’ve come.

One approach starts at the nerve endings severed by amputation, rerouting their signals to intact muscles and using sensors on the prosthesis to translate the new signal into a familiar movement. “We’re taking the natural ‘move’ signals from the [nerves that once led to the] hand and attaching them to another muscle that can signal the prosthesis,” said Todd Kuiken, director of the Center for Bionic Medicine and director of amputee services at the Rehabilitation Institute of Chicago, who developed the surgical technique. “The muscle becomes the biological amplifier,” he said, offering the user intuitive control of the prosthesis.
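For readers curious about the software side: the article doesn’t describe the Center’s actual code, but a pattern-recognition control loop of the general kind used with reinnervated muscle might look like this rough Python sketch. The channel count, feature choice, and movement templates here are all invented for illustration.

```python
# Hypothetical sketch of the control loop behind targeted muscle
# reinnervation, not the Center for Bionic Medicine's actual software.
# Assumes surface-EMG electrodes over the reinnervated muscle and a
# prosthesis that accepts simple movement commands.
import numpy as np

def mean_absolute_value(window: np.ndarray) -> np.ndarray:
    """A common EMG feature: mean absolute value per channel."""
    return np.mean(np.abs(window), axis=0)

def classify_intent(features: np.ndarray, templates: dict) -> str:
    """Match the EMG feature vector to the closest stored movement
    template (e.g., one recorded during a calibration session)."""
    return min(templates, key=lambda move: np.linalg.norm(features - templates[move]))

# Illustrative calibration templates for three movements (made up).
templates = {
    "close_hand": np.array([0.8, 0.2, 0.1, 0.6]),
    "open_hand":  np.array([0.2, 0.7, 0.5, 0.1]),
    "rest":       np.array([0.05, 0.05, 0.05, 0.05]),
}

window = np.random.rand(200, 4) * 0.1   # 200 samples x 4 EMG channels
print(classify_intent(mean_absolute_value(window), templates))
```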

Glen Lehman, who lost his hand, forearm, and elbow while serving with the U.S. Army in Iraq, had the rerouting surgery, called targeted muscle reinnervation. At AAAS, Lehman was using a “travel prosthetic” that he’d worked with only a few times, but already he could direct it to make basic movements. “Believing I’m moving my phantom limb and my thumb” moves the thumb of the prosthesis, he said.

(Credit: Center for Bionic Medicine, The Rehabilitation Institute of Chicago)

The Rehabilitation Institute is working with the Walter Reed Army Medical Center in Washington, D.C., and the Sam Houston Military Medical Center in San Antonio to extend the technique’s use to more wounded veterans.

Instead of starting at the nerve endings, Andrew Schwartz of the University of Pittsburgh and his team are using an electrode array on the surface of the cortex to send movement signals to a prosthetic arm. The technique has worked in monkeys (controlling robot arms not attached to their bodies), and this summer the team will begin a three-year project testing the system in people with spinal cord injuries. (See also a Dana Report on Progress story on Schwartz’s monkeys, with photos.)

“Our goal is to re-establish as much capability of the human arm as we can,” said Schwartz, including developing a feedback loop for sensation (see also the Dana story “Engineering the Sense of Touch”). “We know how to do it,” Schwartz said, and now with better-performing sensors and prosthetics, “we have the opportunity.”
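The article doesn’t say which decoding algorithm the Pittsburgh team uses, but a classic method from this line of research is the population vector: each recorded neuron gets a “preferred direction,” and the intended movement is read out as the firing-rate-weighted sum of those directions. A rough sketch, with all numbers invented:

```python
# Minimal sketch of population-vector decoding, a classic approach in
# cortical motor decoding; the article does not specify the Pittsburgh
# team's actual algorithm, so everything here is illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 96                      # roughly one electrode array's worth of units

# Each neuron's preferred movement direction (unit vectors in 3-D).
preferred = rng.normal(size=(n_neurons, 3))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

def decode_direction(firing_rates: np.ndarray, baseline: np.ndarray) -> np.ndarray:
    """Estimate intended movement direction as the sum of preferred
    directions weighted by each neuron's rate above baseline."""
    weights = firing_rates - baseline
    vector = weights @ preferred    # (n_neurons,) @ (n_neurons, 3) -> (3,)
    return vector / np.linalg.norm(vector)

baseline = np.full(n_neurons, 10.0)   # resting rate in spikes/s (illustrative)
# Simulate cosine-tuned responses to an intended movement along +x.
rates = baseline + 5.0 * preferred @ np.array([1.0, 0.0, 0.0])
print(decode_direction(rates, baseline))   # approximately [1, 0, 0]
```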

Schwartz’s monkeys began to treat their “third arm” as if it were their own, even grooming it. Understanding how the body is represented in the brain, and how we might extend our body image to include unconnected or “foreign” objects, is important to helping humans adapt to artificial limbs, said Olaf Blanke of the Ecole Polytechnique Fédérale de Lausanne in Switzerland. Two of Blanke’s assistants demonstrated his experiment showing that virtual-reality goggles can change where people think they are in space: subjects see themselves a few feet ahead of their actual position. In the demo, the goggled subject nearly walked into the first row of spectators. (See also the Dana story “Video devices further research into out-of-body experiences” and the EPFL video “Using EEG and Virtual Reality to study consciousness.”)

[Photo: the view through the virtual-reality goggles]

[Photo: a goggled subject during the demonstration]

“If you want to restore functionality, patients need to feel the prosthetic device as their own body,” agreed José del R. Millán, also of the Ecole Polytechnique Fédérale de Lausanne. Part of that is the ability to set it on autopilot, lifting a cup of coffee while paying attention to a conversation with a friend, for example.

Virtual keyboards and brain-powered wheelchairs are already available, and a lab in Germany has a prototype system that can give turning commands to a car. But maintaining the concentration needed to control these devices is taxing; most people are exhausted after only an hour, Millán said. His lab is working on an interface that uses probability theory and sensors to share the work of movement, in this case a Roomba-like robot driven by a laptop computer.
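Millán didn’t spell out the math, but “probability theory and sensors sharing the work” suggests blending the decoder’s belief over commands with the robot’s own obstacle readings. A hypothetical sketch of that kind of blending (the weighting scheme and command names are my invention):

```python
# Hypothetical sketch of shared control: blend the decoder's belief over
# steering commands with sensor-based obstacle avoidance. The article says
# Millan's lab uses "probability theory and sensors" but gives no details,
# so this weighting scheme is an assumption for illustration.

def shared_control(decoder_probs: dict, obstacle_left: float, obstacle_right: float) -> str:
    """decoder_probs: P(command) from the brain-signal classifier.
    obstacle_*: normalized proximity readings in [0, 1] (1 = very close)."""
    # Penalize turns toward obstacles; the robot "shares the work"
    # by down-weighting commands its sensors say are unsafe.
    safety = {
        "left":    1.0 - obstacle_left,
        "right":   1.0 - obstacle_right,
        "forward": 1.0 - max(obstacle_left, obstacle_right) * 0.5,
    }
    blended = {cmd: decoder_probs.get(cmd, 0.0) * safety[cmd] for cmd in safety}
    return max(blended, key=blended.get)

# The user weakly intends "left", but something (a foot, say) is close on that side.
print(shared_control({"left": 0.5, "forward": 0.4, "right": 0.1},
                     obstacle_left=0.9, obstacle_right=0.0))   # -> "forward"
```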

[Photo: the Roomba-like robot]

The robot interface “senses” the subject’s mental intention and continues acting on it until it senses a change; that way, the person doesn’t have to concentrate on maintaining “go straight” or “keep going.” In the press room, a researcher sitting on the platform controlled the machine on the floor beside me, and it easily avoided my outstretched foot (see also the EPFL video “Multitasking with BCI machines”).

[Photo: the brain-signal readout controlling the Roomba-like robot]

“We can determine when the person wants to deliver robot commands, and when he does not, and when he wants to regain control,” Millán said. His test subjects read while “driving” the device. “They can keep mental control while doing other things.”
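That “hold the intention until it changes” behavior maps naturally onto a simple latching controller: keep executing the last confident command and ignore low-confidence decoder output while the user reads or chats. A sketch under that assumption (the confidence threshold and command names are invented):

```python
# Hypothetical latching controller for the behavior described above, not
# the EPFL lab's actual implementation. The robot keeps executing the last
# confident mental command and ignores the decoder until a new intention
# is clear; threshold and commands are invented for illustration.

class IntentionLatch:
    def __init__(self, confidence_threshold: float = 0.8):
        self.threshold = confidence_threshold
        self.current_command = "stop"

    def update(self, command: str, confidence: float) -> str:
        # Only a sufficiently confident decoding counts as a new intention;
        # low-confidence output (reading, chatting) leaves the latch alone.
        if confidence >= self.threshold:
            self.current_command = command
        return self.current_command

latch = IntentionLatch()
stream = [("forward", 0.95), ("left", 0.40), ("left", 0.35), ("right", 0.90)]
for cmd, conf in stream:
    print(latch.update(cmd, conf))   # forward, forward, forward, right
```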

(Photo Credits: Nicky Penttila)

–Nicky Penttila
