Bionics

In October 2006 Kuiken set about rewiring Amanda Kitts. The first step was to salvage major nerves that once went all the way down her arm. “These are the same nerves that work the arm and hand, but we had to create four different muscle areas to lead them to,” Kuiken says. The nerves started in Kitts’s brain, in the motor cortex, which holds a rough map of the body, but they stopped at the end of her stump—the disconnected telephone wires. In an intricate operation, a surgeon rerouted those nerves to different regions of Kitts’s upper-arm muscles. For months the nerves grew, millimeter by millimeter, moving deeper into their new homes.

“At three months I started feeling little tingles and twitches,” says Kitts. “By four months I could actually feel different parts of my hand when I touched my upper arm. I could touch it in different places and feel different fingers.” What she was feeling were parts of the phantom arm that were mapped into her brain, now reconnected to flesh. When Kitts thought about moving those phantom fingers, her real upper-arm muscles contracted.

A month later she was fitted with her first bionic arm, which had electrodes in the cup around the stump to pick up the signals from the muscles. Now the challenge was to convert those signals into commands to move the elbow and hand. A storm of electrical noise was coming from the small region on Kitts’s arm. Somewhere in there was the signal that meant “straighten the elbow” or “turn the wrist.” A microprocessor housed in the prosthesis had to be programmed to fish out the right signal and send it to the right motor.
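The article does not disclose the actual decoding algorithm in the prosthesis, but the idea of fishing a command out of electrical noise can be sketched in a few lines. The following toy Python example (all electrode counts, template values, and movement names are hypothetical) summarizes each electrode channel with a single feature and matches the result against calibrated templates for known movements:

```python
# Illustrative sketch only -- not the prosthesis's real firmware.
# A toy "pattern recognizer": each electrode channel is reduced to its
# mean absolute value, and the resulting feature vector is matched
# against calibrated templates for known movements.

def feature_vector(samples_per_channel):
    """Mean absolute value per channel -- a classic, simple EMG feature."""
    return [sum(abs(s) for s in ch) / len(ch) for ch in samples_per_channel]

def classify(features, templates):
    """Return the movement whose template is closest (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(templates, key=lambda move: dist(features, templates[move]))

# Hypothetical calibration: four electrode sites, two learned movements.
templates = {
    "straighten_elbow": [0.8, 0.1, 0.1, 0.1],
    "turn_wrist":       [0.1, 0.1, 0.9, 0.2],
}

# One short window of (made-up) samples from the four electrode sites.
window = [[0.7, -0.9, 0.8], [0.1, -0.1, 0.1], [0.0, 0.2, -0.1], [0.1, 0.0, -0.2]]
command = classify(feature_vector(window), templates)
print(command)  # -> straighten_elbow
```

A real controller must also do this continuously, in real time, and reject the "storm" of noise the article describes; the sketch only shows the matching step.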

Finding these signals has been possible because of Kitts’s phantom arm. In a lab at the RIC, Blair Lock, a research engineer, fine-tunes the programming. He has Kitts slide off the artificial arm so that he can cover her stump with electrodes. She stands in front of a large flat-panel TV screen that displays a disembodied, flesh-colored arm floating in blue space—a visualization of her phantom. Lock’s electrodes pick up commands from Kitts’s brain radiating down to her stump, and the virtual arm moves.

In a hushed voice, so as not to break her concentration, Lock asks Kitts to turn her hand, palm in. On-screen, the hand turns, palm in. “Now extend your wrist, palm up,” he says. The screen hand moves. “Is that better than last time?” she asks. “Oh yeah. Strong signals,” Lock says. Kitts laughs. Now Lock asks her to line up her thumb alongside her fingers. The screen hand obliges. Kitts opens her eyes wide. “Wow. I didn’t even know I could do that!” Once the muscle signals associated with a particular movement are identified, the computer in the arm is programmed to look for them and respond by activating the correct motor.

Kitts practiced using her arm one floor below Kuiken’s office in an apartment set up by occupational therapists with everything a newly equipped amputee might ordinarily use. It has a kitchen with a stove, silverware in a drawer, a bed, a closet with hangers, a bathroom, stairs—things people use every day without a second thought but that pose huge obstacles to someone missing a limb. Watching Kitts make a peanut butter sandwich in the kitchen is a startling experience. With her sleeve rolled back to reveal the plastic cup, her motion is fluid. Her live arm holds a slice of bread, her artificial fingers close on a knife, the elbow flexes, and she swipes peanut butter back and forth.

“It wasn’t easy at first,” she says. “I would try to move it, and it wouldn’t always go where I wanted.” But she worked at it, and the more she used the arm, the more lifelike the motions felt. What Kitts would really like now is sensation. That would be a big help in many actions, including one of her favorites—gulping coffee.

“The problem with a paper coffee cup is that my hand will close until it gets a solid grip. But with a paper cup you never get a solid grip,” she says. “That happened at Starbucks once. It kept squeezing until the cup went ‘pop.’”

There’s a good chance she’ll get that sensation, says Kuiken, again thanks to her phantom. In partnership with bioengineers at the Johns Hopkins University Applied Physics Laboratory, RIC has been developing a new prototype for Kitts and other patients that not only has more flexibility—more motors and joints—but also has pressure-sensing pads on the fingertips. The pads are connected to small, piston-like rods that poke into Kitts’s stump. The harder the pressure, the stronger the sensation in her phantom fingers.

“I can feel how hard I’m grabbing,” she says. She can also tell the difference between rubbing something rough, like sandpaper, and smooth, like glass, by how fast the rods vibrate. “I go up to Chicago to experiment with it, and I love it,” she says. “I want them to give it to me already so I can take it home. But it’s a lot more complicated than my take-home arm, so they don’t have it completely reliable yet.”

Eric Schremp, unlike Kitts, doesn’t need artificial hands. He just needs his natural ones to work. They haven’t done that on their own since Schremp broke his neck in 1992, leaving him a quadriplegic. Now, however, the 40-year-old Ohio man can grip a knife or a fork.

He can do this because of an implanted device developed by Hunter Peckham, a biomedical engineer at Case Western Reserve University in Cleveland. “Our goal is to restore hand grasping,” Peckham says. “Hand use is key to independence.”

Schremp’s finger muscles and the nerves that control them still exist, but the signals from his brain have been cut off at the neck. Peckham’s team ran eight micro-thin electrodes from Schremp’s chest under the skin of his right arm, ending at the finger muscles. When a muscle in his chest twitches, it triggers a signal that’s sent via a radio transmitter to a small computer hanging from his wheelchair. The computer interprets the signal and radios it back to a receiver implanted in his chest, where the signal is sent by wires down Schremp’s arm to his hand. There the signal tells his finger muscles to close in a grip—all in a fraction of a second.
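The relay chain described above can be modeled as three stages passing a command along. This toy Python sketch (stage names, thresholds, and stimulation values are all invented for illustration; Case Western's actual system is far more sophisticated) mirrors that flow:

```python
# Illustrative sketch only -- a toy model of the relay chain, not the
# real firmware. A chest-muscle twitch becomes an event, the wheelchair
# computer turns it into a stimulation pattern, and the implanted
# receiver drives the electrodes at the finger muscles.

def chest_twitch(strength):
    # Stage 1: chest-muscle EMG crosses a (hypothetical) threshold.
    return "grip" if strength > 0.5 else None

def wheelchair_computer(event):
    # Stage 2: interpret the event and radio back a stimulation pattern.
    # Values are hypothetical pulse amplitudes for four finger-muscle sites.
    patterns = {"grip": [1.0, 1.0, 1.0, 0.8], "release": [0.0] * 4}
    return patterns.get(event)

def implanted_receiver(pattern):
    # Stage 3: drive each electrode with its amplitude.
    return [f"electrode {i}: {amp:.1f} mA" for i, amp in enumerate(pattern)]

event = chest_twitch(0.9)
stim = implanted_receiver(wheelchair_computer(event))
print(stim[0])  # -> electrode 0: 1.0 mA
```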

“I can grab a fork and feed myself,” Schremp says. “That means a lot.”

About 250 people have been treated with this technique, which is still experimental. But another bionic device has shown that the marriage of mind and machine can be both powerful and enduring, having been implanted in nearly 200,000 people around the world during the past 30 years. That device is the cochlear implant, and Aiden Kenny is among the latest recipients. Tammy Kenny, his mother, remembers when, a year ago, she learned that her baby was beyond the help of hearing aids.

“I would just hold him in my arms and cry,” she says, “knowing he couldn’t hear me. How would he ever get to know me? One time, my husband banged pots together, hoping for a response.” Aiden never heard the noise.

He hears banging pots now. In February 2009 surgeons at Johns Hopkins Hospital snaked thin lines with 22 electrodes into each cochlea, the part of the inner ear that normally detects sound vibrations. In Aiden, a microphone picks up sounds and sends signals to the electrodes, which pass them directly to the nerves.
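The electrodes work because each region of the cochlea responds to a different pitch, so the implant splits sound into frequency bands, one per electrode. This toy Python sketch (band edges and the logarithmic mapping are simplifying assumptions, not the device's published specification) shows the basic idea of assigning a frequency to one of 22 channels:

```python
# Illustrative sketch only: assign a pure tone to one of 22 hypothetical
# electrode channels on a logarithmic frequency scale, roughly mirroring
# how the cochlea maps low pitches to one end and high pitches to the other.
import math

def electrode_for_frequency(freq_hz, n_electrodes=22,
                            low_hz=200.0, high_hz=8000.0):
    """Map a frequency to an electrode index (0 = lowest band)."""
    freq_hz = min(max(freq_hz, low_hz), high_hz)  # clamp to the band range
    frac = (math.log(freq_hz) - math.log(low_hz)) / (
        math.log(high_hz) - math.log(low_hz))
    return min(int(frac * n_electrodes), n_electrodes - 1)

print(electrode_for_frequency(200))   # -> 0   (lowest band)
print(electrode_for_frequency(8000))  # -> 21  (highest band)
```

A real processor extracts the energy in every band from the microphone signal simultaneously; this sketch shows only the frequency-to-electrode assignment.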

“The day they turned on the implant, a month after surgery, we noticed he responded to sound,” Tammy Kenny says. “He turned at the sound of my voice. That was amazing.” Today, she says, with intensive therapy, he’s picking up language, quickly catching up to his hearing peers.

Bionic eyes may soon follow bionic ears. Jo Ann Lewis lost her sight years ago to retinitis pigmentosa (RP), a degenerative disease that destroys light-detecting cells in the eyes called rods and cones. Lately, however, she has partially regained her vision as a result of research by Mark Humayun, an ophthalmologist at the University of Southern California, and a company called Second Sight.

As is common with this disease, part of an inner layer of her retina had survived. This layer, filled with bipolar and ganglion cells, normally gathers signals from outer rods and cones and passes them to fibers that fuse into the optic nerve. No one knew what language the inner retina spoke or how to feed it images it could understand. But in 1992, Humayun began briefly placing a tiny electrode array on the retinas of RP patients undergoing surgery for other reasons.

“We asked them to follow a dot, and they could,” he says. “They could see rows, and they could see columns.” After another decade of testing, Humayun and his colleagues developed a system they dubbed Argus. (Greek mythology. A giant. Hundreds of eyes.) Patients got a pair of dark glasses with a tiny video camera mounted on them, along with a radio transmitter. Video signals were beamed to a computer worn on a belt, translated to electrical impulse patterns understood by ganglion cells, and then beamed to a receiver resting behind the ear. From there a wire took them inside the eye, to a square array of 16 electrodes gently attached to the retinal surface. The impulses triggered the electrodes. The electrodes triggered the cells. Then the brain did the rest, enabling these first patients to see edges and some coarse shapes.
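The camera-to-electrode step in the pipeline above amounts to shrinking a video frame down to the array's resolution. This toy Python sketch (frame size, grid size, and brightness values are all invented; it matches the 16-electrode count of the early array purely for illustration) averages an image into a 4×4 electrode map:

```python
# Illustrative sketch only: a toy version of the image-to-electrode step
# in an Argus-style system. A brightness frame is averaged into a 4x4
# grid, and each cell's value becomes one electrode's stimulation level.

def downsample_to_grid(frame, grid=4):
    """Average an NxN brightness frame into a grid x grid electrode map."""
    n = len(frame)
    block = n // grid  # pixels per electrode cell, per axis
    levels = []
    for gy in range(grid):
        for gx in range(grid):
            cells = [frame[y][x]
                     for y in range(gy * block, (gy + 1) * block)
                     for x in range(gx * block, (gx + 1) * block)]
            levels.append(sum(cells) / len(cells))
    return levels  # one stimulation level per electrode

# Hypothetical 8x8 frame: dark on the left, bright on the right --
# a vertical edge like the ones the first patients could see.
frame = [[0.0] * 4 + [1.0] * 4 for _ in range(8)]
levels = downsample_to_grid(frame)
print(levels)  # left-hand electrodes stay at 0.0, right-hand ones at 1.0
```

With only 16 (or, later, 60) electrodes, every frame collapses to a handful of brightness levels, which is why the first patients saw edges and coarse shapes rather than detail.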

In the fall of 2006 Humayun, Second Sight, and an international team increased the electrodes in the array to 60. Like a camera with more pixels, the new array produced a sharper image. Lewis, from Rockwall, Texas, was among the first to get one. “Now I’m able to see silhouettes of trees again,” she says. “That’s one of the last things I remember seeing naturally. Today I can see limbs sticking out this way and that.”

Pushing the neural prosthetic concept further, researchers are beginning to use it on the brain itself. Scientists behind a project called BrainGate are attempting to wire the motor cortex of completely immobilized patients directly into a computer so that patients can move remote objects with their minds. So far, test subjects have been able to move a cursor around a computer screen. Researchers are even planning to develop an artificial hippocampus, the part of the brain that helps form memories, with the intent of implanting it in people with memory loss.

Not everything will work perfectly. One of the four initial BrainGate patients decided to have the plug removed because it interfered with other medical devices. And Jo Ann Lewis says her vision isn’t good enough for her to safely cross a street. Today, however, Kitts has a new, more elastic cup atop her arm that better aligns electrodes with nerves that control the arm.

“It means I can do a lot more with the arm,” she says. “A new one up in Chicago lets me do lots of different hand grasps. I want that. I want to pick up pennies and hammers and toys with my kids.” These are reasonable hopes for a replacement part, Kuiken says. “We are giving people tools. They are better than what previously existed. But they are still crude, like a hammer, compared with the complexity of the human body. They can’t hold a candle to Mother Nature.”

Still, at least the people using the tools can grab the candle. And some can even see it flicker in the dark.

Josh Fischman is a senior editor for research and technology at the Chronicle of Higher Education. Mark Thiessen is a Geographic staff photographer.