
Sweet Sensation

December 18, 2018

When graduate student Luke Osborn needed to test the fingertip sensors he’d spent years developing for prosthesis wearers, he didn’t have far to look. The ensuing collaboration and results hold big promise for amputees.

In a Clark Hall lab one day in the fall of 2016, Luke Osborn, MS ’14, PhD ’18, attached two tiny beryllium copper probes to the left arm of his fellow graduate student György Lévay, MS ’17.

Lévay had lost most of that arm—along with his right hand and both feet—six years earlier, when a severe systemic infection turned his extremities necrotic. Now, he and Osborn were both students in the lab of Nitish Thakor, a professor of biomedical engineering and one of the world’s foremost innovators in prosthetic devices.

Osborn’s copper probes were part of an audacious project—one that he and a group of fellow graduate students had been working on for three years. The goal: to give amputees the ability to feel sensations in the fingertips of their prosthetic hands.

Under Thakor’s guidance, Osborn had meticulously developed fingertip sensors that mimic the architecture of the neurons in human skin. He had tested the sensors on robotic benchtop prosthetic hands for months, training them to respond to various stimuli.

The study—which was published in Science Robotics last June—marks a breakthrough in providing sensory inputs for prosthesis users. For Osborn, it was the culmination of five years of work.

“Part of the beauty of the field of biomedical engineering,” he says, “is that it’s getting stronger on both the medical and the engineering sides. This is exactly the kind of work that I hoped to do when I came to Hopkins.”

Thakor calls Osborn’s project one of the most impressive he has ever supervised. “Luke built the sensors, wrote the algorithms, and designed the experiment,” he says. “He had the experience and the competence to do all of this, and he put it all toward solving an exciting problem. It came together beautifully in the end.”

Three students and a professor work in the lab on a prosthetic arm.
Luke Osborn, center, and Nitish Thakor, right, in the lab, where testing of the prosthesis began in 2016. (Image: Will Kirk / Homewood Photography)

A HARD NUT TO CRACK

Osborn arrived in Thakor’s lab in the fall of 2012. As an undergraduate at the University of Arkansas, Osborn had primarily been interested in pure robotics. But by the time he started graduate school, he wanted to do work that had medical applications. Thakor’s lab seemed like the perfect fit. For more than 25 years, Thakor has worked on developing better methods for controlling prosthetic limbs, both at Johns Hopkins and at a companion lab in Singapore.

Osborn quickly turned his attention to the problem of sensation. There have been many improvements in control systems for prosthetic devices during the last decade, but few attempts have been made to allow amputees to feel touch signals from their prosthetic limbs.

“Touch is really interesting,” says Paul Marasco, a sensory neurophysiologist who works on bionic prosthetic devices at Cleveland Clinic. (He was not involved in Osborn’s project.) “The individual touch sensors in the skin don’t really provide you with a cohesive perception of touch. The brain has to take all that information and put it all together and make sense of it. The brain says, ‘Well, OK, if I have a little bit of this and a little bit of that, and there’s a little bit of this sprinkled in, then that probably means that this object is slippery. Or this object is made out of metal, or it feels soft.’ None of those perceptions map simply onto a single type of sensory neuron. All of them require the brain to integrate data from several different types of neurons. That’s one reason why sensation has been such a hard nut to crack and why there are so few labs doing it.”

Osborn was determined to try. He began his project in 2013 by looking for sensor materials that would be flexible enough to fit smoothly over the fingertips of a prosthesis but tough enough to withstand repeated contact with diverse objects. After several rounds of trial and error, he developed a rubber fabric that encases two layers of small silicone-based piezoresistive sensors. The team calls this fabric the “e-dermis.”

The two layers of the e-dermis mimic the two primary layers of human skin: the epidermis and the dermis. On the outer, “epidermal” layer, Osborn’s team designed certain sensors to behave like human nociceptors, which detect noxious, painful stimuli. In the deeper, “dermal” layer, the sensors mirror four types of neurons known as mechanoreceptors, which variously detect light touch and sustained pressure.

“It’s a pattern that’s biomimetic—a sensor array that matches what our nerve endings are used to,” Thakor says. “Luke’s team made a meticulous effort here to get the patterns right.”
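The article doesn’t reproduce the team’s code, but the layered layout it describes is easy to picture in software. The sketch below is a minimal, hypothetical illustration—the names, types, and values are invented, not drawn from the study—of how readings from the e-dermis might be organized by layer and receptor type.

```python
from dataclasses import dataclass

# A hypothetical sketch (not the team's code) of the e-dermis layout:
# an outer "epidermal" nociceptor-like element over deeper "dermal"
# mechanoreceptor-like elements. All names and values are invented.

@dataclass
class TactileChannel:
    layer: str           # "epidermal" or "dermal"
    receptor_type: str   # "nociceptor" or "mechanoreceptor"
    pressure_kpa: float  # reading from one piezoresistive element

def group_by_layer(channels):
    """Group sensing elements the way the e-dermis layers skin receptors."""
    grouped = {"epidermal": [], "dermal": []}
    for ch in channels:
        grouped[ch.layer].append(ch)
    return grouped

# One fingertip frame: a single outer element over two deeper ones,
# echoing the layering described above.
frame = [
    TactileChannel("epidermal", "nociceptor", 38.0),
    TactileChannel("dermal", "mechanoreceptor", 12.5),
    TactileChannel("dermal", "mechanoreceptor", 11.8),
]
print(group_by_layer(frame))
```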

As he developed the fingertip sensors, Osborn initially performed benchtop experiments using a prosthetic hand that was not attached to a human user. In these purely robotic trials, he developed two reflex responses that mimic human spinal reflexes. First, he trained the hand to tighten its grip if the fingertip sensors told it that an object was slipping. Second, he trained the hand to automatically drop a painful object.

The challenge here was speed: Human spinal reflexes operate within 100 to 200 milliseconds—think of how fast you react to a hot stove—and Osborn’s team wanted to match that speed. To accomplish that, the prosthetic hand had to correctly determine within just 70 milliseconds that it was grasping something painful.

“We were able to achieve that quick decision by looking at a few key features of the pressure signal from the e-dermis,” Osborn says. “These features include information such as where the pressure is located, how large the pressure is, and how quickly the pressure changes.”
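The published classifier isn’t spelled out in the article, but the idea—deciding quickly from a few pressure features—can be sketched. The toy function below flags a grasp as painful from invented thresholds on magnitude and rate of change (the third feature Osborn names, location, would come from which sensing element fired) and returns the matching reflex. Every name and number here is hypothetical.

```python
import numpy as np

# A toy sketch (not the published algorithm) of a reflex decision driven
# by e-dermis-style pressure features. Thresholds and names are invented.

PAIN_PRESSURE_KPA = 300.0   # assumed magnitude threshold for "noxious"
PAIN_SLOPE_KPA_MS = 5.0     # assumed rate-of-change threshold
SLIP_DROP_KPA_MS = -2.0     # assumed pressure drop suggesting slip

def classify_contact(pressure_trace_kpa, dt_ms=1.0):
    """Return 'painful', 'slipping', or 'stable' from a short pressure window.

    pressure_trace_kpa: recent samples from one fingertip sensing element,
    sampled every dt_ms milliseconds (e.g., a window inside the 70 ms budget).
    """
    trace = np.asarray(pressure_trace_kpa, dtype=float)
    magnitude = trace.max()
    slope = np.diff(trace).max() / dt_ms  # fastest rise in the window

    if magnitude > PAIN_PRESSURE_KPA or slope > PAIN_SLOPE_KPA_MS:
        return "painful"    # reflex: release the object
    if np.diff(trace).min() / dt_ms < SLIP_DROP_KPA_MS:
        return "slipping"   # reflex: tighten the grip
    return "stable"

# Example: a sharp, fast-rising contact should trip the pain reflex.
sharp_poke = [0, 40, 120, 260, 360]   # kPa over a 5 ms burst
print(classify_contact(sharp_poke))   # -> 'painful'
```

In the real system, any such window and decision would have to fit inside the 70-millisecond budget described above.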

AN HOUR HERE, AN HOUR THERE

György Lévay
György Lévay, at Johns Hopkins on a Fulbright, was happy to serve as a test subject for Luke Osborn’s experiments.

By late 2016, with his benchtop studies complete, Osborn was ready to begin testing the e-dermis on human participants. He turned first to fellow grad student Lévay, who had arrived at Johns Hopkins in 2015 on a Fulbright scholarship. As a master’s degree student in biomedical engineering, Lévay worked on pattern recognition systems that give prosthesis users better control over their limbs’ movements. (Lévay is just one of several prosthetic limb users who have studied in Thakor’s lab over the years.)

Osborn asked Lévay if he might be willing to be a test subject for his study of painful stimuli. Lévay said he was absolutely game—particularly since Osborn wasn’t planning to implant electrodes in Lévay’s skin, an approach that some labs have used with other prosthesis users.

Lévay volunteered dozens of hours of his time—an hour here, an hour there—during the final semester of his own master’s degree program.

The first step was an extended period of sensory mapping. Osborn needed to discover exactly the right locations to place the probes on Lévay’s residual limb. At most locations, Lévay simply felt electrical irritation or stinging on the residual limb itself and didn’t perceive any sensations from his prosthetic hand. But at a few sweet spots, which Osborn discovered through many hours of trial and error, the electrical stimulation registered in Lévay’s residual nerves as sensation in the phantom hand itself.

This is possible, Osborn explains, because the electrical signals he uses are very gentle. “The current that we’re using is small enough that it wouldn’t typically be perceived by the surface of the skin at the site of stimulation”—that is, the point where the probes are attached to Lévay’s residual arm, he says. “But some of the nerves underneath the skin do detect the signal, and they interpret it as a signal from the hands that they’re going to send upstream to the brain.”

Once the sensory mapping was complete, Osborn’s team was able to start working on the heart of the study. As Lévay’s prosthetic hand grasped smooth and pointed objects, Osborn adjusted the programming of the system, assessing how and where Lévay was perceiving pain sensations. (The desk was set up so that Lévay couldn’t see what his prosthetic hand was doing. He didn’t have any visual cues about whether he was grasping smooth objects or sharp ones.)

Luke Osborn works on the prosthetic arm.
For Osborn, five years of work have culminated in artificial fingertips that allow prosthesis users like Lévay to discern sharp from smooth objects and to feel pain.

Osborn could adjust three primary variables: frequency, amplitude, and pulse width. The goal was to create a “neuromorphic” signal that mirrors the complexity of our perception systems.
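The article names the three knobs but not the waveform itself, so as a rough illustration, the sketch below builds a simple rectangular pulse train from frequency, amplitude, and pulse width. The function and all parameter values are hypothetical, not taken from the study.

```python
import numpy as np

# A minimal sketch of a stimulation pulse train parameterized by the three
# variables named above: frequency, amplitude, and pulse width. All values
# are illustrative only, not those used in the study.

def pulse_train(freq_hz, amplitude_ma, pulse_width_us,
                duration_ms=100.0, sample_rate_hz=100_000):
    """Return (time_ms, current_ma) for a rectangular monophasic pulse train."""
    n = int(duration_ms / 1000.0 * sample_rate_hz)
    t = np.arange(n) / sample_rate_hz      # seconds
    period_s = 1.0 / freq_hz
    width_s = pulse_width_us * 1e-6
    phase = t % period_s                   # time since the last pulse onset
    current = np.where(phase < width_s, amplitude_ma, 0.0)
    return t * 1000.0, current

# Example: a 20 Hz train of 2 mA, 200 µs pulses over a 100 ms window.
t_ms, i_ma = pulse_train(freq_hz=20, amplitude_ma=2.0, pulse_width_us=200)
print(f"duty cycle: {(i_ma > 0).mean():.3%}")  # fraction of time current flows
```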

By the end of the study, Lévay says, he was able to perceive a wide array of touch sensations in his phantom hand. “Some of them were like someone was pressing on my hand or like a pulsating of blood. Some of them were very interesting stuff.”

A close-up shot shows the e-dermis on the finger of a prosthetic arm.
Osborn and his team developed an “e-dermis” for the artificial fingertips that mimics the two primary layers of human skin.

Over the course of more than 150 trials, Osborn developed a complex algorithm that gave Lévay a reasonably accurate set of pain perceptions from the prosthetic device. The pain was never as precisely localized as it would be in an intact hand—nor was it ever expected to be. But Lévay could correctly report whether the pain was occurring along the median nerve (the region of the thumb and index finger) or the ulnar nerve (the pinkie). Electroencephalogram studies confirmed that the signals were activating regions of Lévay’s brain that corresponded to the phantom hand.

Throughout the project, Osborn checked in with Thakor at weekly research meetings. The team also included Andrei Dragomir, a senior research fellow at the National University of Singapore; Whiting School doctoral students Joseph Betthauser ’14 and Christopher Hunt ’14; and Harrison Nguyen ’18, who helped design and test the final iterations of the fingertip sensors.

As the youngest member of the team, Nguyen says that he had a valuable experience working with Osborn and Thakor. “Depending on where you are in your training, Luke can be very supportive and hands-on,” he says. “And once you’re more capable, he’s glad to be more hands-off. He’s always willing to talk through problems in the lab.”

WAVE OF THE FUTURE

You might imagine that the dozens of hours they spent sitting together in the lab would have allowed Osborn and Lévay to talk shop and to exchange ideas about their mutual interest in improving prosthetic devices. But it wasn’t quite like that: To maintain the integrity of the experiment, it was crucial for Lévay to be blinded to many of the questions Osborn was trying to answer. When Lévay described what a stimulus felt like, Osborn wanted that description to be based purely on what Lévay was feeling, not biased by any knowledge of how Osborn was programming the system.

“Luke went to surprisingly painstaking lengths to make sure I didn’t know what he was up to,” Lévay says. “For instance, the stimulator had a little red light on it that blinked every time a stimulation was sent. So if I’d really watched it, I could have deduced the frequency of the stimulation. Luke taped it off so that I couldn’t see it. His screens were always hidden away, and I could only look in a certain direction. So, yeah, it was hard, because I was really interested in what was happening.”

This was particularly frustrating, Lévay says, “when there were sensations that I liked a lot. I would be like, ‘What were these?’ and Luke would say, ‘I can’t tell you.’ This lasted for more than a year while I knew basically nothing about what was happening. Of course, we were working in the same lab, so it was that much more difficult. We made sure that we worked on opposite sides of the lab so that I wouldn’t overhear anything accidentally.”

Since completing their work with Lévay, Osborn’s team has tested sensory perceptions with several other amputees in the Clark Hall lab. To varying degrees, they have all been able to perceive accurate sensory signals from their phantom limbs. One question going forward will be how much the nature of the initial injury affects prosthetic sensory systems. A person who loses a limb in a military conflict, for example, might have different kinds of nerve damage in the residual limb than a person who loses a limb from septicemia or from a motor vehicle accident.

“Some of the crucial factors,” Lévay says, “are the level of skin degradation that occurred. Is the skin that remains on the individual sensitive? Is it well-vascularized? Did the nerves grow back into the muscles?”

Osborn, who completed his PhD last summer, hopes to continue working on prosthetic technologies throughout his career. “Luke’s work on sensory input is absolutely the way of the future,” says Rahul Kaliki, the CEO of Infinite Biomedical Technologies, a prosthetics-centered firm that spun off from Thakor’s lab in 1997 in partnership with his former student and co-founder, Ananth Natarajan MSE ’98. “Sensory feedback is one of the crucial things that has been missing from prosthetic limbs.”

A drawing of the layers of the e-dermis
The multilayered e-dermis is made up of conductive and piezoresistive textiles encased in rubber. A dermal layer of two piezoresistive sensing elements is separated from the epidermal layer, which has one piezoresistive sensing element, by a 1-mm layer of silicone rubber. The e-dermis was fabricated to fit over the fingertips of a prosthetic hand.

Students in Thakor’s Johns Hopkins lab are working on a wide variety of strategies for improving prosthetic devices. In partnership with Infinite Biomedical Technologies, they are developing high-density electrodes for sensing muscle signals and radio-frequency identification systems that allow prostheses to recognize tagged objects—like the user’s personal coffee cup—and to automatically prepare to grasp them. The lab was recently awarded a major grant from the National Science Foundation to develop sensors and algorithms for discrimination of texture and shape.

“Lots of robotics labs have developed sensors in the last few years,” Thakor says. “And in their proposals, they always say, ‘This could have applications for prosthetics.’ But they almost never actually do the work to make the sensors useful for amputees. That’s one reason I’m so pleased with what Luke has done.”

Osborn, for his part, says he is grateful for the many hours volunteered by Lévay and the other participants in his studies. “None of what we do would be possible without the interest, dedication, and willingness of participants to come and work with us,” he says.

Today, back in his native Hungary, Lévay works remotely as a research director for Infinite Biomedical Technologies. He is continuing to refine his pattern recognition systems for improving users’ control of prosthetic arms.

He knows from personal experience how high the stakes are. “For people who have lost limbs, the expectation is very high,” he says. “If someone gets a prosthesis, what they want is what they lost. And we’re quite a distance away from that. People are not happy at all with the products we have. But that’s what’s prompting further development and research—and results like what Luke has achieved.”


– David Glenn // Illustrations by Mark Allen Miller
