February 14, 2019

Trayanova selected to receive HRS Distinguished Scientist Award

Natalia Trayanova, the Murray B. Sachs Professor of Biomedical Engineering at Johns Hopkins University, has been selected to receive the Heart Rhythm Society’s (HRS) Distinguished Scientist Award. Trayanova will be honored at the Society’s 40th Scientific Sessions in San Francisco this May.

The HRS, a leading resource on cardiac pacing and electrophysiology, is dedicated to improving the care of patients by promoting research, education, and optimal health care policies and standards. The society represents specialty organizations and medical, allied health, and science professionals from more than 70 countries who specialize in cardiac rhythm disorders. The HRS’s Distinguished Scientist Award recognizes individuals who have made major contributions to the advancement of scientific knowledge in cardiac pacing or cardiac electrophysiology.

Trayanova has pioneered the use of 3-D virtual models of the heart and its electrical function to improve patient care for individuals who suffer from ventricular or atrial fibrillation, two types of irregular heartbeats. By personalizing her models with data from individual patients, Trayanova is developing innovative methods to predict the risk of sudden death or stroke from these conditions, as well as determine optimal treatments. Last fall, Trayanova partnered with Hugh Calkins, professor of cardiology at the Johns Hopkins School of Medicine, to launch the Alliance for Cardiovascular Diagnostic and Treatment Innovation, or ADVANCE, a research effort that aims to improve the diagnosis and treatment of heart rhythm disorders by leveraging innovations in cardiac imaging, computer simulations, and data science.

“I feel very honored by this award,” said Trayanova. “It represents a recognition of a biomedical engineering contribution to clinical cardiac electrophysiology, an acknowledgement of the promise that engineering holds for the future of cardiac patient care.”

February 6, 2019

New computer program reduces spine surgery errors linked to “wrong level” labeling

Researchers at Johns Hopkins Medicine report that a computer program they designed may help surgeons identify and label spinal segments in real time during operating room procedures, avoiding the costly and potentially debilitating consequences of operating on the wrong segment.

The current study builds on previously described work—published in April 2015 and March 2016—on the algorithm dubbed LevelCheck, which was designed and developed by Jeffrey Siewerdsen, professor of biomedical engineering, computer science and radiology at the Johns Hopkins University School of Medicine and founder of the school’s Imaging for Surgery, Therapy and Radiology Laboratory. Details of the current findings were published last fall in the Annals of Biomedical Engineering.

“Operating on the wrong part of a spine is rare, but even once is too much for a patient and a surgeon,” says Amir Manbachi, who was a research associate in Siewerdsen’s laboratory when the research was completed and is now first author on the study and an assistant research professor of biomedical engineering at Johns Hopkins University. “LevelCheck is designed to help make such errors ‘never’ events.”

The researchers say current estimates indicate that spinal surgeons operate on the wrong spinal segment only about once in every 3,100 surgeries. The consequences, however, are severe, potentially leading to paralysis, additional surgeries, and large increases in health care costs.

Most humans have the same number of spinal segments, which are labeled L1, L2 and so on. Currently, surgeons identify the correct target spinal segment, or “level,” by using X-rays of the patient taken in the operating room at the time of the surgery, and counting up or down the spinal segments on the X-rays to identify and verify the correct one.

These intraoperative X-rays sometimes can be difficult to read on the spot due to poor image quality, the patient’s position or weight, or atypical spinal anatomy. All of these issues potentially lead to surgeon error in identifying the correct spinal level on which to operate for such conditions as herniated discs.

Some surgeons also physically mark the correct spinal segment with a metal marker or surgical cement during a preliminary procedure, but with this approach, patients face additional surgical risks.

The LevelCheck program uses a patient’s MRI or CT scan images taken before the operation. Its algorithms compare anatomical landmarks between the preoperative scan and the digital X-ray taken in the operating room, line them up, and transfer the digital label of each spinal segment from the scan to the X-ray.

The LevelCheck-verified spine segments are then presented to the surgeon to inform assessment of the correct spinal segment for surgery.
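The label-transfer step described above can be pictured as projecting labeled 3-D points from the preoperative scan into the 2-D X-ray image. The sketch below is a minimal illustration, not the study’s algorithm: the vertebra names, coordinates, and camera parameters are hypothetical, and it assumes the hard part of registration (solving for the X-ray system’s pose) is already done.

```python
import numpy as np

# Hypothetical labeled vertebra centroids from a preoperative CT, in mm.
# Names and coordinates are illustrative, not data from the study.
ct_landmarks = {
    "T12": np.array([0.0, -30.0, 400.0]),
    "L1":  np.array([0.0,   0.0, 400.0]),
    "L2":  np.array([0.0,  30.0, 400.0]),
}

# Assumed pinhole model of the intraoperative X-ray system:
# focal length f (pixels) and principal point (cx, cy).
f, cx, cy = 1000.0, 256.0, 256.0

def project(p3d):
    """Project a 3-D point onto the 2-D X-ray image plane."""
    x, y, z = p3d
    return np.array([f * x / z + cx, f * y / z + cy])

# Transfer each digital label from the CT to the X-ray image.
xray_labels = {name: project(p) for name, p in ct_landmarks.items()}
for name, (u, v) in xray_labels.items():
    print(f"{name}: pixel ({u:.1f}, {v:.1f})")
```

In the real system, the projection is not known in advance; it is estimated by optimizing the alignment between the projected landmarks and the intraoperative image, which is what makes the method robust to patient positioning.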

For the current research, the scientists set up a mock operating room. From 364 past spinal surgeries involving long segments of the spine, performed at The Johns Hopkins Hospital between 2012 and 2016, they selected 62 cases, specifically choosing the X-ray images that were the most difficult to read and label.

A neuroradiologist had previously labeled all of the X-rays to establish where the correct surgical sites were on the images.

The researchers then asked five surgeons to label the same X-rays in two ways: with LevelCheck assistance shown while they labeled the segments, and with LevelCheck used to confirm their labeling after they had marked the segments without the program’s assistance.

They also randomly presented some of the same cases to the surgeons multiple times to account for fatigue or waning attention.

Without LevelCheck assistance, the surgeons labeled the target spinal segment in these difficult cases incorrectly in a median of 14 out of 46 trials.

However, when the surgeons used LevelCheck either before or after labeling the segments, the error rate dropped to a median of one out of 46 trials.

Next, the researchers tested LevelCheck’s labeling during 20 real-time operations at The Johns Hopkins Hospital, after surgeons had labeled the segments without the aid of LevelCheck. Both the surgeons’ initial labeling and LevelCheck’s results were correct in all 20 operations, which were not selected for difficulty; the goal was to determine whether LevelCheck could be integrated into the real-world surgical workflow.

The scientists found that LevelCheck took an average of 17 to 72 seconds to deliver its labeling results, close to the 20- to 60-second range that surveyed surgeons said they were willing to wait for the results.

“A surgeon may say, ‘I don’t need this, I always get it right,’” says Siewerdsen, senior author of the study. “This algorithm actually improves surgeons’ rates of getting it right.”

Before and after each of the 62 mock operating room cases, including the repeated cases, the researchers gave questionnaires to the five surgeons, for a total of 410 questionnaires. The researchers found that LevelCheck improved the surgeons’ confidence in labeling 91 percent of the time (373 out of 410 times). Another 5.8 percent of the time (24 out of 410), surgeons said it had no impact on their confidence, while 3 percent of the time (13 out of 410) surgeons reported that the program reduced their confidence.

In the 20 cases in the real-time operating room setting, the surgeons said LevelCheck improved their confidence in 16 of the 20 cases and had no impact in the remaining four cases.

Although the researchers say they have not determined the cost of LevelCheck at this stage of development, they say it requires a computer with a graphics card and, at this point, an engineer to operate the software. They hope to further automate the system so that surgeons can use it without an engineer present. The researchers aim to conduct more trials of the program at other institutions.

January 29, 2019

Hopkins BME student awarded AHA predoctoral fellowship

Chen Zhao, a biomedical engineering Ph.D. candidate at Johns Hopkins University, has been awarded a two-year American Heart Association predoctoral fellowship, which provides funding to students pursuing research related to cardiovascular function and disease.

Working in the lab of Aleksander Popel, professor of biomedical engineering, Zhao is building multi-scale computational models to understand the role of macrophages, a type of white blood cell, in the regulation of blood vessel formation and inflammation in peripheral arterial disease (PAD). He plans to use these models to identify novel therapies with the potential to improve blood circulation in patients with PAD.

“This is an exciting acknowledgement of my research in translational systems biology,” said Zhao. “The application and writing process is really a valuable training experience for me as a student and young investigator.”

Submit your clinical and research needs to BME Design Team

The Johns Hopkins Department of Biomedical Engineering is soliciting clinical needs for its undergraduate Design Team program. Faculty, researchers, industry professionals, and healthcare providers who are interested in harnessing the collective skill, energy, and talent of the BME student design teams have until February 13 to submit a design challenge for the next round of projects. If selected, a team of five to eight undergraduate biomedical engineering students will work with a group of medtech design and commercialization experts over the course of a year to design and test a solution.

Each year, approximately 15 teams of undergraduate students work with clinical sponsors and faculty mentors, as well as professional designers, engineers, and scientists, to define and implement an impactful project addressing a clinical or research need. Now in its 21st year, the Design Team program offers students the opportunity to create solutions to a variety of clinical challenges. This teamwork has been the basis for publications, patents, products, licensing agreements, and start-up companies.

Design teams will select their projects for the 2019-20 academic year starting this spring. They will then work toward a design solution over the next 18 months, supported by engineering, clinical, and professional mentors. Throughout the course of their projects, teams will have access to the Department of Biomedical Engineering’s state-of-the-art Design Studio, providing the equipment and resources they need to build and test early-stage design solutions.

Learn more about the submission process or the benefits of sponsoring a BME Design Team project here.

Submit Your Clinical Challenge

January 16, 2019

Study defines differences among brain neurons that coincide with psychiatric conditions

It’s no surprise to scientists that variety is the very essence of biology, not just the seasoning. Yet most previous studies of key brain cells have found little variability in a common cell process that governs how genetic information is read and acted on.

The process, called epigenetics, involves chemical or structural “tweaks” to gene activity that don’t affect the underlying genetic code itself, but do affect when and how a gene becomes available to be read for its protein-encoding instructions. When epigenetic changes strike at the wrong time or place, the process turns genes on or off at the wrong time and place, too.

Now, in a new study focusing on four regions of normal human brain tissue, Johns Hopkins scientists have found about 13,000 regions of epigenetic differences between neurons in different brain regions that vary by at least 10 percent. Using whole genome sequencing and computational statistical tools, they also found that the location of those epigenetic changes — covering about 12 million bases in the genome — co-locate with the genetic signal contributing to addictive behavior, schizophrenia and neuroses such as bipolar disorder.

“We believe we have figured out what parts of the neuronal genome are epigenetically different among these four brain regions,” says Andrew Feinberg, M.D., the Bloomberg Distinguished Professor of Medicine, Oncology and Molecular Biology and Genetics. “And these areas are enriched with inherited genetic variants linked to certain psychiatric conditions.”

Scientists have long suspected that epigenetics plays a significant role in psychiatric conditions, other neurologic diseases such as Alzheimer’s, and a long list of other human ailments, including cancer. The current study does not definitively prove an epigenetics link to psychiatric conditions, but provides a road map to further study epigenetic diversity in the gene locations identified by the Johns Hopkins team, Feinberg says.

“We do know that both epigenetic and genetic changes contribute to the problem of cells not doing what they’re supposed to do,” adds Feinberg, who has studied epigenetics for decades. Results of the study are described online Jan. 14 in Nature Neuroscience.

Biostatistician Kasper Hansen, Ph.D., who co-led the study with Feinberg, says one of the main differences between their study and previous attempts to look at epigenetic diversity is that the Johns Hopkins scientists used a strong experimental design focused on different cell populations, including neurons. Other studies did not separate neurons from brain glial cells, which support neurons, acting as scaffolding, cleaners and nutrient suppliers.

The Johns Hopkins scientists, including first authors Lindsay Rizzardi and Peter Hickey, began their research with 45 brain tissue samples taken from six people (three males and three females, ages 37–57) who were not diagnosed with psychiatric or neurologic conditions and, upon their death, had donated their brains to biobanks at the National Institutes of Health and University of Maryland.

The samples were taken from four regions of the brain: the dorsolateral prefrontal cortex, which controls decision-making and social behaviors; the anterior cingulate gyrus, known for its link to emotions and behavior; the hippocampus, which is responsible for learning and memory; and the nucleus accumbens, the site for processing reward behavior. By comparing samples from the same individual across different brain regions and cell populations, it is possible to rule out the confounding effect of genetics and many environmental exposures, such as smoking, says Hansen.

The scientists purified the brain tissue samples to isolate neurons and glia, sequenced the neurons’ genome and compared the sequencing results of neurons in each brain region. Looking at the distribution of epigenetic changes across the genome, the scientists found more epigenetic diversity in 12 million base pairs (out of 3 billion) of the genome than what would normally occur in those regions by chance alone. They found that most of the differences in epigenetics occurred in neurons of the nucleus accumbens, the brain’s reward center.

Using statistical tools to evaluate the genomic sequencing results, the researchers found that at least one of eight types of epigenetic changes was positively correlated with known genetic code changes among nearly half (13 of 27) of traits linked to heritable forms of addictive behavior, schizophrenia and neuroticism. Epigenetic changes were not linked to genetic differences among heritable, non-brain-related traits such as body mass index and height.

Hansen, who is an associate professor of biostatistics at the Johns Hopkins Bloomberg School of Public Health and the McKusick-Nathans Institute of Genetic Medicine, explains that the strong experimental design helps eliminate differences between individuals by comparing multiple samples from different brain regions from the same individual. “Furthermore, the strength of the genetic association is also determined by existing results on the genetic architecture of these traits, which have been established from tens to hundreds of thousands of samples,” says Hansen.

“Epigenetic changes may alter cells’ identity as well as their function,” suggests Feinberg, who also is a professor of biomedical engineering, biostatistics and psychiatry and behavioral science at Johns Hopkins. “To reveal how epigenetics is linked to psychiatric conditions, the next step is to develop customized genomic arrays that capture the areas of the genome that we identified and compare them to more samples of people with and without psychiatric disease.”

The Johns Hopkins team of researchers also includes Varenka Rodriguez DiBlasi, Rakel Tryggvadóttir, Colin M. Callahan and Adrian Idrizi.

January 11, 2019

Five BME students receive Provost Undergraduate Research Award

Five undergraduate students studying biomedical engineering at Johns Hopkins have been selected to receive the Provost Undergraduate Research Award (PURA).

Established in 1993, PURA supports undergraduates as they pursue independent research, design, entrepreneurship, scholarly, and creative projects over the academic year.  With the help of a $3,000 fellowship, PURA recipients conceive and drive projects under the guidance of a mentor from any Hopkins division, center, or institute. Students will present their projects this spring at DREAMS, an annual celebration of undergraduate research at the Homewood campus.

The five students from the Department of Biomedical Engineering are:

Shubhayu Bhattacharyay (Mentor: Robert Stevens)
Project: A feature-based approach to quantify motor activity in critically ill neurological patients using an unobtrusive wearable sensor matrix

Jialiu (Annie) Liang (Mentor: Natalia Trayanova)
Project: Parameters influencing arrhythmia inducibility in the post-myocardial infarcted (MI) heart arising from treatment with PSC-CM cell sheets

Dante Navarro (Mentor: Peter Searson)
Project: Using wearable technology to measure change in functional capacity following interventional pain procedures

Shivani Pandey (Mentor: Geran Kostecki)
Project: Development of an optogenetic system for point-pacing cardiomyocytes in vitro

Yueqi (Bill) Zhang (Mentor: Nicola Heller)
Project: Monocytes as cellular immunotherapy for allergic asthma

Pathology’s Digital Future

David West ’16 has long been an entrepreneur. As early as seventh grade, West and his friends in the Philadelphia suburb where he grew up started a business converting VHS videos to DVDs.

“Someone actually trusted me to handle their wedding video as a middle schooler,” West remembers. “I caught the bug for business at that point.”

West applied early decision to Johns Hopkins’ biomedical engineering program to marry his interests in technology and medicine. Once at the Whiting School, he soon became involved in the Kairos Society, a global organization of college entrepreneurs. While juggling his academic load during his second year, West worked with members of the group to form their own company, Karcinex, which developed a device that increased the sensitivity of urine cytology tests for bladder cancer.

That company served as a starting point for his next venture. Together with childhood friend Coleman Stavish and fellow Johns Hopkins engineering student Nathan Buchbinder ’15, he formed a new company they named Proscia in 2014. West signed the closing documents for the company’s first round of financing during his Whiting School graduation ceremony in 2016.

Proscia was inspired by the many talks that West had with Johns Hopkins urologists while launching Karcinex. Several of these mentors told him that digital pathology—moving the images of tissue samples gathered to diagnose disease onto computers—would be the wave of the future. Such technology would not only allow experts to share results and gather opinions but could also eventually take human bias out of pathology by allowing artificial intelligence algorithms to analyze results.

Proscia aims to do both. Since the company’s launch, it has amassed thousands of physician users across 130 major hospitals and labs. Using Proscia’s tools, physicians can gain access to these images either on the cloud or on their institution’s electronic medical record system, allowing shared access among multiple individuals at the same institution and beyond.

The company has also developed additional tools to automatically analyze images for characteristics important to a variety of diseases.

“We call it digital pathology now, but this is shifting to the standard of care,” West says. “Eventually, this will just be called pathology, and our company will be leading the way.”

– Christen Brownlee

January 7, 2019

How the brain decides whether to hold ’em or fold ’em

Picture yourself at a Las Vegas poker table, holding a bad hand – one with a very low chance of winning. Even so, the sight of the large stack of chips that piled up during a recent lucky streak nudges you to place a large bet anyway.

Why do people make high-risk choices – not only in casinos, but also in other aspects of their lives – even when they know the odds are stacked against them?

A team led by a Johns Hopkins biomedical engineer has found that the decision to “up the ante” even in the face of long odds is the result of an internal bias that adds up over time and involves a “push-pull” dynamic between the brain’s two hemispheres.

Whether you are suffering from a losing streak or riding a wave of wins, your cumulative feelings from each preceding hand all contribute to this nudge factor, they say. A paper on the study is to be published online the week of Jan. 7 by the journal Proceedings of the National Academy of Sciences.

Insights from the research have the potential to shed light on how soldiers in high-risk combat situations make decisions and to facilitate more effective brain training to change or “rewire” long-term behavior or habits, the researchers suggest.

“What we learned is that there is a bias that develops over time that may make people view risk differently,” said senior author Sridevi Sarma, a biomedical engineering professor at the Johns Hopkins University Whiting School of Engineering and member of its Institute for Computational Medicine. Pierre Sacré, a postdoctoral fellow at Johns Hopkins, co-led the study.

Sarma’s group sought to understand why people tend to take risks even when the odds are against them or avoid risk even when the odds are favorable. They also wanted to learn where in the human brain such behavior originates. They asked patients at the Cleveland Clinic’s Epilepsy Monitoring Unit to play a simple card game involving risk taking.

The patients had undergone stereoelectroencephalography, a procedure in which doctors implanted multiple deep-seated electrodes in their brains to locate the source of seizures for future surgical treatment. Each of these depth electrodes has 10 to 16 channels that record voltage signals from the surrounding neurons. The electrodes also gave Sarma and her team an intimate, real-time look at the patients’ brains as they made decisions while gambling against a computer in a card game.

The game was simple: the computer dealt from an infinite deck of cards with only five values: 2, 4, 6, 8, and 10. Each value was equally likely to be dealt in any round, and after every round the cards went back into the deck, leaving the odds unchanged.

Participants were shown two cards on a computer screen, one faceup and the other facedown. (The faceup card was the player’s, and the facedown card was the computer’s.) Participants were asked to bet low ($5) or high ($20) that their card had a higher value than the computer’s facedown one.

When dealt a 2, 4, 8, or 10, participants bet quickly and instinctively, the research team found. When dealt a 6, however, they wavered and were nudged into betting higher or lower depending on their bias – even though the chances of picking a higher or lower card were the same as before. In other words, participants’ betting behavior was based on how they fared on past bets even though those results had no bearing on the outcome of the new bets.
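The special status of the 6 follows directly from the deck’s composition. This short sketch (illustrative, not from the study) computes the win, tie, and lose probabilities for each face-up card value:

```python
from fractions import Fraction

deck = [2, 4, 6, 8, 10]  # each value equally likely every round

def outcome_probs(card):
    """Win/tie/lose probabilities that `card` beats the computer's
    hidden card, drawn uniformly from the five-value deck."""
    n = len(deck)
    win = Fraction(sum(c < card for c in deck), n)
    tie = Fraction(sum(c == card for c in deck), n)
    lose = Fraction(sum(c > card for c in deck), n)
    return win, tie, lose

for card in deck:
    w, t, l = outcome_probs(card)
    print(f"card {card}: win {w}, tie {t}, lose {l}")
# A 6 gives win = lose = 2/5, the only value where the odds offer
# no guidance either way, so internal bias decides the bet.
```

For every other value the odds clearly favor one bet (a 2 wins with probability 0, a 10 with probability 4/5), which is consistent with the quick, instinctive betting the team observed on those cards.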

On examining neural signals recorded during all four stages of the game, Sarma’s team found a predominance of high-frequency gamma brain waves. They were even able to localize these signals to particular structures in the brain. It turns out that these regions – excluding any implicated in drug-resistant epilepsy – were associated positively or negatively with risk-taking behavior.

“When your right brain has high-frequency activity and you get a gamble, you’re pushed to take more of a risk,” said Sacré, who expressed surprise at the symmetry of the patients’ brain reactions under these conditions. “But if the left side has high-frequency activity, it’s pulling you away from taking a risk. We call this a push-pull system.”

To assess that internal bias, the researchers developed a mathematical equation that successfully calculated each patient’s bias using only their past wagers.

“We found that if you actually solve for what this looks like over time, the players are accumulating all the past card values and all the past outcomes, but with a fading memory,” Sarma says. “In other words, what happened most recently weighs on a person more than older events do. This means that based on the history of a participant’s bets, we can predict how that person is feeling as they gamble.”
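The fading-memory accumulation Sarma describes is, in effect, an exponentially weighted sum of past outcomes, in which recent rounds count more than older ones. The sketch below is a minimal illustration with a hypothetical decay factor; the paper’s actual model and parameter values are not reproduced here.

```python
def biased_internal_state(outcomes, decay=0.8):
    """Accumulate past game outcomes with a fading memory.

    `outcomes` is a sequence of per-round results (e.g. +1 for a win,
    -1 for a loss). `decay` in (0, 1) is an assumed forgetting factor,
    not a value from the study: older outcomes are geometrically
    down-weighted each round.
    """
    state = 0.0
    for x in outcomes:
        state = decay * state + x
    return state

# Three early wins followed by two recent losses: the losses dominate
# the accumulated state even though wins outnumber them.
print(biased_internal_state([+1, +1, +1, -1, -1]))
```

Because the weight on an outcome shrinks by the decay factor every round, the most recent bets dominate the state, matching the observation that "what happened most recently weighs on a person more than older events do."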

Study co-authors included Ernst Niebur, Kevin Kahn, Matthew S. D. Kerr, and Sandya Subramanian from the Johns Hopkins University; and Jorge A. González-Martínez, Zachary Fitzgerald, and Matthew A. Johnson from the Cleveland Clinic. Uri T. Eden from Boston University and John T. Gale from Emory University also were co-authors.

The National Science Foundation and the Kavli Neuroscience Discovery Institute at Johns Hopkins University funded the study.

December 18, 2018

From Hopkins to Silicon Valley

Bugrahan Cigdemoglu graduated from Johns Hopkins in 2017 with his bachelor’s degree in biomedical engineering and computer science. Now working as a software engineer in Silicon Valley, Cigdemoglu offers career advice for current students looking to prepare for their future.

Bugrahan Cigdemoglu, WSE ’17, had two major goals before coming to Johns Hopkins University: work in Silicon Valley and achieve immortality. He accomplished his first goal when he turned his summer internship at DNAnexus, a biomedical informatics and data management company in Mountain View, into a full-time offer right before his senior year. He’s currently a software engineer for the company.

“The skillset that I gained from Hopkins got me a job in Silicon Valley,” he says. “Being a [Johns] Hopkins graduate, you’re ahead of some people. It developed my character in a way that allowed me to succeed in the United States, which is very competitive for international students.” Now, Bugrahan is recruiting for DNAnexus at Johns Hopkins, where he loves meeting with younger students and finding great candidates.


How I Got my Job at DNAnexus

I got my DNAnexus internship through the Career Fair at Hopkins. I handed my resume to Sean King, WSE ’14, who is now my manager. Michael Beer, associate professor of biomedical engineering, told me it was a growing company in a good industry. He gave me perspective of where the genomics industry was going, and now I see that opportunities are going to grow exponentially. At the end of my internship I asked Sean to give me a recommendation as an employee, which led to a full-time job offer.

What My Job is Like

I use skills from both my majors. My education in biomedical engineering helps me understand the company’s vision and product, and computer science helps me develop the product. DNAnexus is a cloud-based biomedical informatics and data management platform. Companies upload multi-omic (genomes, epigenomes, etc.) and clinical information and run analyses. As a back-end software engineer, I make sure the platform is secure, fast, clinically compliant, and that we provide new features for customers pursuing genomic-based approaches to health. The majority of work done with our platform goes into managing a global and collaborative network to accelerate science and discovery across a spectrum of industries: pharmaceutical, bioagricultural, clinical diagnostics, medical centers, and government.

What Makes my Work Rewarding

Most Mondays it’s as if the industry has evolved completely. I love that! In the coding world, you build daily upon code that other people wrote. It grows exponentially and you have to keep up with it. Once, a customer requested a feature to do a test that couldn’t be done on any other genomic analysis platform. When we created our initial object to solve their problem, I wrote its ID number on a sticker, and I’ve kept it on my desk. Knowing that, using that object, DNAnexus will provide access to massive genomic datasets that will help cure incurable diseases or help babies be born healthy was a rewarding feeling.

What JHU students Can Do to Prepare for Internships and Jobs

  • Reach out to alumni. Use social media platforms to job search, and get in touch with people who are currently working at companies. That’s how I got my internship, and students reach out to me, too. Cathy Jancuk, the biomedical engineering undergraduate program manager, is a great resource for students to get in touch with alumni.
  • Follow up with the alumni. The follow-up shows who is really passionate about the job, and who is just applying everywhere. Even after I got my full-time offer, I did my best to keep up the relationship with my manager.
  • Pick one club and make an impact. Rather than listing 20 clubs, it’s a good idea to have one solid experience with one club: leadership, accounting, or some other evident impact on the club.
  • Use your club to make connections. The number one thing I did when looking for internships was reaching out to alumni of the Turkish Students Association. Now I’m consulting for friends and current students, and introducing them to people who might be able to answer their questions about a specific company.
  • Make time to form better bonds with professors. Eventually you will need to consult them on their research and their expertise in their fields.

How to Make the Most of Internships

Be a true team member
Learn to work as a team member. We always emphasize teamwork at Hopkins, but it’s hard to understand its significance while doing short-term assignments. Once you join a team during an internship, observe what other people are doing, see how each person contributes, and figure out how you can fit in. For example, I am on the social side, so I like to keep the team motivated around the goal.

Work specifically on gaining skills
Learn as many tools as possible. Ask questions about what the tools do and what the tech does. It’s a great way to get a head start. Your impact becomes more important than skills once you become full-time, but as an intern, focus on developing your skillset.

What type of JHU students stand out to you as a recruiter?

At the Fall Career Fair this year, I talked nonstop with amazing candidates from Hopkins. The students who were prepared for the fair, who knew about the company, performed much better. The candidates we’re interviewing now already knew about the company and what skills they would need, even if only in a vague way. One student had already used the DNAnexus platform for 5 to 10 minutes; I knew right away that student would get an interview. That student had also attended a Pre-Fair Prepare session from the Career Center. Those students knew what to do—their resumes were well organized.

This article first appeared on the Johns Hopkins Student Affairs website.

Sweet Sensation

When graduate student Luke Osborn needed to test the fingertip sensors he’d spent years developing for prosthesis wearers, he didn’t have far to look. The ensuing collaboration and results hold big promise for amputees.

In a Clark Hall lab one day in the fall of 2016, Luke Osborn, MS ’14, PhD ’18, attached two tiny beryllium copper probes to the left arm of his fellow graduate student György Lévay, MS ’17.

Lévay had lost most of that arm—along with his right hand and both feet—six years earlier, when a severe systemic infection turned his extremities necrotic. Now, he and Osborn were both students in the lab of Nitish Thakor, a professor of biomedical engineering and one of the world’s foremost innovators in prosthetic devices.

Osborn’s copper probes were part of an audacious project—one that he and a group of fellow graduate students had been working on for three years. The goal: to give amputees the ability to feel sensations in the fingertips of their prosthetic hands.

Under Thakor’s guidance, Osborn had meticulously developed fingertip sensors that mimic the architecture of the neurons in human skin. He had tested the sensors on robotic benchtop prosthetic hands for months, training them to respond to various stimuli.

Graduate student Luke Osborn created fingertip sensors for prosthesis wearers.

Now it was finally time for Osborn to test his sensors on a human subject. Lévay, who conducted his own separate work on the opposite side of the lab, was glad to volunteer.

In the movie version of this scene, Lévay might be overcome with emotion as he feels sensations from his fingertips for the first time in six years. The reality was far more imperfect, laborious, and hit or miss, as Lévay, a prosthetics technology scientist himself, well knew it would be.

“My job was just to sit there and tell Luke what I was feeling at any given moment,” Lévay says. “In the beginning, most of the sensations were just electrical irritations, kind of like licking a nine-volt battery.”

But after hundreds of trials and dozens of hours spread over a six-month period, Osborn’s team perfected its equipment and its methods. By the end of the study, Lévay could reliably tell, without visual cues, whether his prosthetic hand had picked up a smooth object or a sharp one. When his prosthetic hand picked up sharp objects, he felt sensations in his artificial fingertips that seemed like actual tactile pain, not just electrical jolts.

The study—which was published in Science Robotics last June—marks a breakthrough in providing sensory inputs for prosthesis users. For Osborn, it was the culmination of five years of work.

Luke Osborn, center, and Nitish Thakor, right, in the lab, where testing of the prosthesis began in 2016. (Image: Will Kirk / Homewood Photography)

“Part of the beauty of the field of biomedical engineering,” he says, “is that it’s getting stronger on both the medical and the engineering sides. This is exactly the kind of work that I hoped to do when I came to Hopkins.”

Thakor calls Osborn’s project one of the most impressive he has ever supervised. “Luke built the sensors, wrote the algorithms, and designed the experiment,” he says. “He had the experience and the competence to do all of this, and he put it all toward solving an exciting problem. It came together beautifully in the end.”


Osborn arrived in Thakor’s lab in the fall of 2012. As an undergraduate at the University of Arkansas, Osborn had primarily been interested in pure robotics. But by the time he started graduate school, he wanted to do work that had medical applications. Thakor’s lab seemed like the perfect fit. For more than 25 years, Thakor has worked on developing better methods for controlling prosthetic limbs, both at Johns Hopkins and at a companion lab in Singapore.

Osborn quickly turned his attention to the problem of sensation. There have been many improvements in control systems for prosthetic devices during the last decade, but few attempts have been made to allow amputees to feel touch signals from their prosthetic limbs.

“Touch is really interesting,” says Paul Marasco, a sensory neurophysiologist who works on bionic prosthetic devices at Cleveland Clinic. (He was not involved in Osborn’s project.) “The individual touch sensors in the skin don’t really provide you with a cohesive perception of touch. The brain has to take all that information and put it all together and make sense of it. The brain says, ‘Well, OK, if I have a little bit of this and a little bit of that, and there’s a little bit of this sprinkled in, then that probably means that this object is slippery. Or this object is made out of metal, or it feels soft.’ None of those perceptions map simply onto a single type of sensory neuron. All of them require the brain to integrate data from several different types of neurons. That’s one reason why sensation has been such a hard nut to crack and why there are so few labs doing it.”

Osborn was determined to try. He began his project in 2013 by looking for sensor materials that would be flexible enough to fit smoothly over the fingertips of a prosthesis but tough enough to withstand repeated contact with diverse objects. After several rounds of trial and error, he developed a rubber fabric that encases two layers of small silicone-based piezoresistive sensors. The team calls this fabric the “e-dermis.”

The two layers of the e-dermis mimic the two primary layers of human skin: the epidermis and the dermis. On the outer, “epidermal” layer, Osborn’s team designed certain sensors to behave like human nociceptors, which detect noxious, painful stimuli. In the deeper, “dermal” layer, the sensors mirror four types of neurons known as mechanoreceptors, which variously detect light touch and sustained pressure.

“It’s a pattern that’s biomimetic—a sensor array that matches what our nerve endings are used to,” Thakor says. “Luke’s team made a meticulous effort here to get the patterns right.”
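As a rough illustration of the layer-to-receptor mapping described above, the sketch below converts raw pressure readings from the two sensor layers into receptor-style channels. The function name, units, and thresholds are illustrative assumptions, not values from the published study.

```python
def e_dermis_outputs(epidermal_pressure: float, dermal_pressure: float) -> dict:
    """Toy model of the e-dermis layer mapping: an outer "nociceptor"
    channel that activates only for intense pressure, and deeper
    "mechanoreceptor" channels for light touch and sustained pressure.
    Pressures are in kPa; all thresholds are hypothetical."""
    NOCICEPTOR_THRESHOLD = 250.0  # assumed onset of "painful" pressure

    return {
        # Outer ("epidermal") layer: silent below threshold, like a nociceptor.
        "nociceptor": max(0.0, epidermal_pressure - NOCICEPTOR_THRESHOLD),
        # Deeper ("dermal") layer: graded responses, like mechanoreceptors.
        "light_touch": min(dermal_pressure, 50.0),  # saturates for light contact
        "sustained_pressure": dermal_pressure,      # tracks ongoing load
    }

# A forceful contact drives the nociceptor channel; gentle contact leaves it silent.
print(e_dermis_outputs(epidermal_pressure=300.0, dermal_pressure=40.0))
```

The point of the split is that "pain" and "touch" arrive on separate channels, so downstream code can treat them differently, just as the nervous system does.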

As he developed the fingertip sensors, Osborn initially performed benchtop experiments using a prosthetic hand that was not attached to a human user. In these purely robotic trials, he developed two reflex responses that mimic human spinal reflexes. First, he trained the hand to tighten its grip if the fingertip sensors told it that an object was slipping. Second, he trained the hand to automatically drop a painful object.

The challenge here was speed: Human spinal reflexes operate within 100 to 200 milliseconds—think of how fast you react to a hot stove—and Osborn’s team wanted to match that rate. To accomplish that, the prosthetic hand had to correctly determine within just 70 milliseconds that it was grasping something painful.

“We were able to achieve that quick decision by looking at a few key features of the pressure signal from the e-dermis,” Osborn says. “These features include information such as where the pressure is located, how large the pressure is, and how quickly the pressure changes.”
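The quick decision from those pressure-signal features (where the pressure is, how large it is, and how fast it changes) might be sketched as follows. The feature computations and thresholds here are illustrative assumptions, not the study's actual classifier.

```python
def is_painful(pressures: list[float], prev_pressures: list[float], dt_ms: float) -> bool:
    """Decide from one short window of fingertip readings whether the
    grasped object is 'painful', using the three cues named in the
    text: where the pressure is located, how large it is, and how
    quickly it changes. All thresholds are placeholders."""
    peak = max(pressures)
    peak_idx = pressures.index(peak)                  # where the pressure is located
    rate = (peak - prev_pressures[peak_idx]) / dt_ms  # how quickly it changes
    mean = sum(pressures) / len(pressures)
    localized = peak > 2.0 * mean                     # a sharp point, not broad contact

    return peak > 200.0 and rate > 5.0 and localized

# A sharp spike on one sensor that rose quickly reads as painful.
print(is_painful([10.0, 300.0, 10.0], [10.0, 100.0, 10.0], dt_ms=10.0))  # prints True
```

In a real controller, a check like this would run inside the sensing loop so the hand can release its grip within the roughly 70-millisecond budget described above.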


György Lévay, at Johns Hopkins on a Fulbright, was happy to serve as a test subject for Luke Osborn’s experiments.

By late 2016, with his benchtop studies complete, Osborn was ready to begin testing the e-dermis on human participants. He turned first to fellow grad student Lévay, who had arrived at Johns Hopkins in 2015 on a Fulbright scholarship. As a master’s degree student in biomedical engineering, Lévay worked on pattern recognition systems that give prosthesis users better control over their limbs’ movements. (Lévay is just one of several prosthetic limb users who have studied in Thakor’s lab over the years.)

Osborn asked Lévay if he might be willing to be a test subject for his study of painful stimuli. Lévay said he was absolutely game—particularly since Osborn wasn’t planning to implant electrodes in Lévay’s skin, an approach that some labs have used with other prosthesis users.

Lévay volunteered dozens of hours of his time—an hour here, an hour there—during the final semester of his own master’s degree program.

The first step was an extended period of sensory mapping. Osborn needed to discover exactly the right locations to place the probes on Lévay’s residual limb. At most locations, Lévay simply felt electrical irritation or stinging on the residual limb itself and didn’t perceive any sensations from his prosthetic hand. But at a few sweet spots, which Osborn discovered through many hours of trial and error, the stimulation produced sensations that Lévay perceived only in his phantom hand itself.

For Osborn, five years of work have culminated in artificial fingertips that allow prosthesis users like Lévay to discern sharp from smooth objects and to feel pain.

This is possible, Osborn explains, because the electrical signals he uses are very gentle. “The current that we’re using is small enough that it wouldn’t typically be perceived by the surface of the skin at the site of stimulation,” he says, referring to the point where the probes are attached to Lévay’s residual arm. “But some of the nerves underneath the skin do detect the signal, and they interpret it as a signal from the hands that they’re going to send upstream to the brain.”

Once the sensory mapping was complete, Osborn’s team was able to start working on the heart of the study. As Lévay’s prosthetic hand grasped smooth and pointed objects, Osborn adjusted the programming of the system, assessing how and where Lévay was perceiving pain sensations. (The desk was set up so that Lévay couldn’t see what his prosthetic hand was doing. He didn’t have any visual cues about whether he was grasping smooth objects or sharp ones.)

Osborn could adjust three primary variables: frequency, amplitude, and pulse width. The goal was to create a “neuromorphic” signal that mirrors the complexity of our perception systems.
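Those three knobs can be pictured as the parameters of a pulse train. The sketch below, using placeholder values rather than the study's actual stimulation settings, shows how they jointly define the pattern that gets delivered.

```python
from dataclasses import dataclass

@dataclass
class StimulationParams:
    frequency_hz: float    # how often pulses are delivered
    amplitude_ma: float    # current of each pulse, in milliamps
    pulse_width_us: float  # duration of each pulse, in microseconds

def pulse_train(params: StimulationParams, duration_s: float) -> list[tuple[float, float, float]]:
    """Return (start_time_s, amplitude_mA, width_us) tuples for each
    pulse in a train of the given duration. Varying these three
    parameters per trial is how the stimulation pattern is shaped."""
    period_s = 1.0 / params.frequency_hz
    n_pulses = int(duration_s * params.frequency_hz)
    return [(i * period_s, params.amplitude_ma, params.pulse_width_us)
            for i in range(n_pulses)]

# 50 Hz for 0.1 s yields 5 pulses, 20 ms apart.
train = pulse_train(StimulationParams(50.0, 1.2, 200.0), duration_s=0.1)
print(len(train))  # prints 5
```

Frequency sets the rhythm, amplitude the strength, and pulse width the duration of each jolt; tuning all three together is what lets the pattern approach the richness of natural nerve signaling.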

By the end of the study, Lévay says, he was able to perceive a wide array of touch sensations in his phantom hand. “Some of them were like someone was pressing on my hand or like a pulsating of blood. Some of them were very interesting stuff.”

Osborn and his team developed an “e-dermis” for the artificial fingertips that mimics the two primary layers of human skin.

Over the course of more than 150 trials, Osborn developed a complex algorithm that gave Lévay a reasonably accurate set of pain perceptions from the prosthetic device. The locations of the pain perceptions were never as pinpoint-specific as an intact person would have experienced—nor were they ever expected to be. But Lévay could correctly report whether the pain was occurring along the median nerve (the region of the thumb and index finger) or the ulnar nerve (the pinkie). Electroencephalogram studies confirmed that the signals were activating regions of Lévay’s brain that corresponded to the phantom hand.

Throughout the project, Osborn checked in with Thakor at weekly research meetings. The team also included Andrei Dragomir, a senior research fellow at the National University of Singapore; Whiting School doctoral students Joseph Betthauser ’14 and Christopher Hunt ’14; and Harrison Nguyen ’18, who helped design and test the final iterations of the fingertip sensors.

As the youngest member of the team, Nguyen says that he had a valuable experience working with Osborn and Thakor. “Depending on where you are in your training, Luke can be very supportive and hands-on,” he says. “And once you’re more capable, he’s glad to be more hands-off. He’s always willing to talk through problems in the lab.”


You might imagine that the dozens of hours they spent sitting together in the lab would have allowed Osborn and Lévay to talk shop and to exchange ideas about their mutual interest in improving prosthetic devices. But it wasn’t quite like that: To maintain the integrity of the experiment, it was crucial for Lévay to be blinded to many of the questions Osborn was trying to answer. When Lévay described what a stimulus felt like, Osborn wanted his description to be based purely on what he was feeling, not biased by any knowledge of how Osborn was programming the system.

“Luke went to surprisingly painstaking lengths to make sure I didn’t know what he was up to,” Lévay says. “For instance, the stimulator had a little red light on it that blinked every time a stimulation was sent. So if I’d really watched it, I could have deduced the frequency of the stimulation. Luke taped it off so that I couldn’t see it. His screens were always hidden away, and I could only look in a certain direction. So, yeah, it was hard, because I was really interested in what was happening.”

This was particularly frustrating, Lévay says, “when there were sensations that I liked a lot. I would be like, ‘What were these?’ and Luke would say, ‘I can’t tell you.’ This lasted for more than a year while I knew basically nothing about what was happening. Of course, we were working in the same lab, so it was that much more difficult. We made sure that we worked on opposite sides of the lab so that I wouldn’t overhear anything accidentally.”

The multilayered e-dermis is made up of conductive and piezoresistive textiles encased in rubber. A dermal layer of two piezoresistive sensing elements is separated from the epidermal layer, which has one piezoresistive sensing element, by a 1-mm layer of silicone rubber. The e-dermis was fabricated to fit over the fingertips of a prosthetic hand.

Since completing their work with Lévay, Osborn’s team has tested sensory perceptions with several other amputees in the Clark Hall lab. To varying degrees, they have all been able to perceive accurate sensory signals from their phantom limbs. One question going forward will be how much the nature of the initial injury affects prosthetic sensory systems. A person who loses a limb in a military conflict, for example, might have different kinds of nerve damage in the residual limb than a person who loses a limb from septicemia or from a motor vehicle accident.

“Some of the crucial factors,” Lévay says, “are the level of skin degradation that occurred. Is the skin that remains on the individual sensitive? Is it well-vascularized? Did the nerves grow back into the muscles?”

Osborn, who completed his PhD last summer, hopes to continue working on prosthetic technologies throughout his career. “Luke’s work on sensory input is absolutely the way of the future,” says Rahul Kaliki, the CEO of Infinite Biomedical Technologies, a prosthetics-centered firm that spun off from Thakor’s lab in 1997 in partnership with his former student and co-founder, Ananth Natarajan MSE ’98. “Sensory feedback is one of the crucial things that has been missing from prosthetic limbs.”

Students in Thakor’s Johns Hopkins lab are working on a wide variety of strategies for improving prosthetic devices. In partnership with Infinite Biomedical Technologies, they are developing high-density electrodes for sensing muscle signals and radio-frequency identification systems that allow prostheses to recognize tagged objects—like the user’s personal coffee cup—and to automatically prepare to grasp them. The lab was recently awarded a major grant from the National Science Foundation to develop sensors and algorithms for discrimination of texture and shape.

“Lots of robotics labs have developed sensors in the last few years,” Thakor says. “And in their proposals, they always say, ‘This could have applications for prosthetics.’ But they almost never actually do the work to make the sensors useful for amputees. That’s one reason I’m so pleased with what Luke has done.”

Osborn, for his part, says he is grateful for the many hours volunteered by Lévay and the other participants in his studies. “None of what we do would be possible without the interest, dedication, and willingness of participants to come and work with us,” he says.

Today, back in his native Hungary, Lévay works remotely as a research director for Infinite Biomedical Technologies. He is continuing to refine his pattern recognition systems for improving users’ control of prosthetic arms.

He knows from personal experience how high the stakes are. “For people who have lost limbs, the expectation is very high,” he says. “If someone gets a prosthesis, what they want is what they lost. And we’re quite a distance away from that. People are not happy at all with the products we have. But that’s what’s prompting further development and research—and results like what Luke has achieved.”


– David Glenn // Illustrations by Mark Allen Miller