Safety First: Building a Resilient Future

Spring 2019

Of all the remarkable things engineers do for humanity, none may be more important than the ways in which they improve our resiliency, keeping us safe from the many potential harms the world has in store. Among the Whiting School faculty, there is no shortage of engineers striving to make the world a safer place. In fields ranging from surgery to structural engineering, their approaches are as creative as they are promising.

 

Beating the Burn

“Fire is fascinating to me,” says Thomas Gernay, assistant professor of civil engineering, one of the Whiting School’s newest faculty members, and head of the Multi-Hazard Resilient Structures Group. “It’s been a threat to people as long as there has been human civilization. It’s just one of the fundamental challenges in engineering.”

Gernay, who hails from Belgium, specializes in designing buildings that are better able to avoid or withstand the ravages of fire, among other natural and man-made threats. He is perhaps best known for co-developing SAFIR®, a software application that helps structural engineers and architects model and predict how their designs will respond to fire.

Gernay says SAFIR is an excellent resource for structural engineers looking to use new materials and for architects who want to push the boundaries of design, allowing them, long before construction begins, to accurately model how as-yet-unproven concepts will react in a real-world crisis.

Gernay notes that while building codes are important, they often are obsolete or too basic for today’s complex world. In contrast, computer simulations can model the performance of entire buildings under extreme hazards. As an example, he points to the World Trade Center towers, in which the buildings’ steel frames were blanketed in thick fire insulation.

“They were completely up to code,” Gernay says, “but September 11 was not a normal day.” When the planes crashed into the towers and dislodged the insulation from their steel frames, the bare metal was exposed to direct heat, leading to their failure.
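
To make the physics concrete, the sketch below estimates how quickly an unprotected steel member heats up under the ISO 834 standard fire curve using a simple lumped-capacitance model. It is only an illustration of the kind of calculation fire-safety simulations build on, not SAFIR’s actual finite-element analysis, and the section factor and material constants are assumed values.

```python
# Illustrative sketch only: a lumped-capacitance estimate of how fast an
# unprotected steel member heats in a standard fire. SAFIR itself performs
# nonlinear finite-element analysis; this shows only the basic heat balance.
import math

SIGMA = 5.67e-8         # Stefan-Boltzmann constant, W/m^2K^4
H_CONV = 25.0           # convective coefficient, W/m^2K (typical code value)
EMISSIVITY = 0.7        # resultant emissivity (assumed)
RHO_STEEL = 7850.0      # steel density, kg/m^3
C_STEEL = 600.0         # simplified constant specific heat, J/kgK
SECTION_FACTOR = 200.0  # exposed surface / volume, 1/m (assumed profile)

def gas_temp_iso834(t_min):
    """ISO 834 standard fire curve, gas temperature in deg C."""
    return 20.0 + 345.0 * math.log10(8.0 * t_min + 1.0)

def steel_temperature(minutes=30, dt=1.0):
    """Step the steel temperature forward in time (dt in seconds)."""
    T_s = 20.0
    for step in range(int(minutes * 60 / dt)):
        T_g = gas_temp_iso834(step * dt / 60.0)
        # Net heat flux to the steel: convection plus radiation
        # (radiation evaluated with absolute temperatures in Kelvin).
        q = H_CONV * (T_g - T_s) + EMISSIVITY * SIGMA * (
            (T_g + 273.15) ** 4 - (T_s + 273.15) ** 4)
        T_s += SECTION_FACTOR * q * dt / (RHO_STEEL * C_STEEL)
    return T_s

print(f"Bare steel after 30 minutes of standard fire: {steel_temperature(30):.0f} C")
```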

In his research, Gernay is exploring new alternatives to traditional structural fire protection, like engineered assemblies that employ innovative materials, as well as variations on old ones, such as robust systems using concrete-embedded steel. He says there is even a movement in structural engineering to return to traditional wood framing in order to address sustainability issues.

“What’s old is often new again, but [this raises] fresh challenges for engineers,” Gernay says.

 

Calming Seizures

For the many thousands of people with epilepsy who do not respond to medication, life is a roller coaster in which any moment could bring another seizure. Recently, medical science has begun to look deeper into areas in the brain—known as epileptic foci—where seizures are believed to originate.

Sometimes, these foci are caused by structural malformations. Other times, a tumor produces the seizures. And for still other patients, there is no known physical explanation, yet seizures continue to emerge from these locations nonetheless.

“The foci are the epicenters of epilepsy—the heart of the earthquake—but they are not always easy to delineate,” says Archana Venkataraman, the John C. Malone Assistant Professor of Electrical and Computer Engineering and a member of the Malone Center for Engineering in Healthcare, who studies these elusive phenomena. “We think engineering can help in that definition.”

The good news is that epileptic foci can be surgically removed to reduce or eliminate seizures. But, as with any brain surgery, accuracy is paramount. Removing too much tissue can cause severe consequences. Removing too little tissue subjects the patient to serious surgery with little or no benefit.

According to Venkataraman, current methods to identify epileptic foci are based on the eye and the instinct of neurologists and radiologists, a process that is time consuming, requires years of training, and is prone to human error. So she’s employing machine learning, a branch of artificial intelligence, to automatically pinpoint these epileptic foci by using brain imaging and electrical monitoring technology—magnetic resonance imaging and electroencephalography, in particular.

In essence, these techniques offer moment-by-moment snapshots of the brain’s physical and electrical activity, like a stop-motion film. Using the method Venkataraman developed, computers examine and compare those snapshots and train themselves to spot patterns that are not always apparent, even to experts. With this information, she triangulates the precise location and size of epileptic foci.
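
The sketch below illustrates the general pattern, not Venkataraman’s actual model: a classifier is trained on per-region features (stand-ins for quantities derived from MRI and EEG) and then ranks held-out regions by how focus-like they appear. The data and feature names are synthetic placeholders.

```python
# Illustrative sketch only: training a classifier on per-region features to
# flag regions that look like epileptic foci. The published models are far
# more sophisticated; the data here is entirely synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic dataset: one row per brain region, columns stand in for made-up
# features such as EEG spike rate, cortical thickness, and connectivity.
n_regions = 2000
X = rng.normal(size=(n_regions, 3))
# Pretend focal regions (label 1) have an elevated spike rate (feature 0).
y = (X[:, 0] + 0.5 * rng.normal(size=n_regions) > 1.2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Rank held-out regions by predicted probability of belonging to a focus.
scores = clf.predict_proba(X_test)[:, 1]
print("AUC on synthetic data:", round(roc_auc_score(y_test, scores), 3))
```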

Best of all, her approach is noninvasive, reducing risk for the patient while promising greatly improved surgical precision. Venkataraman and graduate student Jeff Craley have developed a prototype seizure localization algorithm for EEG data acquired in the clinic; they are now working to make it more reliable for all epilepsy patients. Venkataraman is also seeking additional data and funding to take her work to the next level, where it will help real patients in real need.

“The most rewarding part of this project is that I can develop cutting-edge engineering tools to directly impact people’s lives,” she says.

 

Robots to the Rescue

While time is of the essence after an earthquake, a hurricane, or an explosion, the resulting debris fields are often too treacherous or toxic for humans to enter.

Mechanical engineering assistant professor Chen Li imagines robots that comb debris-strewn disaster sites in a race against the clock to locate survivors. For inspiration, he is looking in new directions. “We don’t get our ideas from science fiction, but from nature,” Li says of the overarching goal of his work.

He calls it terradynamics—the study of how animals move across complex terrain. After all, Li asks rhetorically, what creature can better crawl through radioactive debris—like that after the tsunami in Fukushima, Japan—than a cockroach? What creature, other than a snake, could better slip through the crevices among the concrete and steel, following an earthquake?

Li, who heads the Terradynamics Lab in WSE’s Laboratory for Computational Sensing and Robotics, is an expert in studying how such creatures move. “We are not trying to copy exactly what these animals do. We want to learn their key advantages and build those into our prototypes,” Li explains.

For instance, he has created a cockroach-inspired robot prototype that can scale complex terrain with its many legs. It even has wings, just like the real thing, that can help the robot right itself, should it topple on its back.

But of all the cockroach’s many advantages, Li says, one overlooked key to its remarkable mobility is its almond-like shape. This streamlined form is both protective and unobtrusive, allowing the cockroach to deflect falling debris and to slip past obstacles without getting snagged.

Li’s efforts to understand snake locomotion, on the other hand, have proven a bit more challenging, he says.

But he remains unbowed. His latest tack in trying to create a prototype is a snakelike robot with a series of ratcheting wheels on its underbelly that roll easily forward, but not back or sideways. Though snakes have scales instead of wheels, the effect is the same, helping the snake robot to grip and propel itself forward through even the most difficult terrain.
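
The toy model below shows, in one dimension, why such one-way contact matters: a purely back-and-forth body motion that would cancel out on its own becomes net forward travel when backward slip is blocked. It is an illustration of the principle only, not the lab’s actual robot or its controller.

```python
# Illustrative sketch only: a toy 1-D model of why ratcheting wheels (or
# snake scales) that roll freely forward but lock when pushed backward turn
# a symmetric back-and-forth motion into net forward progress.
import math

def net_displacement(cycles=5, steps_per_cycle=100, ratchet=True):
    x = 0.0
    for i in range(cycles * steps_per_cycle):
        phase = 2 * math.pi * i / steps_per_cycle
        # Commanded body velocity is a pure sine wave: zero net motion by itself.
        v_command = math.sin(phase)
        # Ratchet: backward motion is blocked, forward motion passes through.
        v_actual = max(v_command, 0.0) if ratchet else v_command
        x += v_actual / steps_per_cycle
    return x

print("Displacement with ratchet:   ", round(net_displacement(ratchet=True), 2))
print("Displacement without ratchet:", round(net_displacement(ratchet=False), 2))
```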

While Li clearly has designed his animal-inspired robots with search-and-rescue operations in mind, he thinks his robots could have implications for many other fields, including structural and environmental monitoring, and even planetary exploration.

“Since I was a kid watching David Attenborough documentaries, I was always intrigued by animals,” Li says. “I’m fortunate to get to turn that into a career.”

 

Quantifying Surgical Expertise

The last two decades have witnessed the ascent of medical procedures in which the surgeon’s hand is guided, at least in part, by robots and computer algorithms. Often, the surgeon does not operate directly on the patient but instead sits across the operating room, peering into sophisticated video monitors and manipulating remote-controlled instruments that tell the robot what to do. Such advances have made certain surgeries far less invasive and have reduced the chance of human error.

While these benefits are noteworthy in their own right, this technical evolution has yielded an unexpected upside: a profusion of data about the surgeons themselves.

“Each incision, every suture is recorded to an incredible degree of accuracy,” says Greg Hager, the Mandell Bellmore Professor of Computer Science and director of the Malone Center for Engineering in Healthcare.

About 15 years ago, Hager decided to put that data to good use to quantify the techniques of master surgeons in order to help students gain expert skills. Hager counts among his collaborators many fellow engineers, as well as surgeons and biostatisticians.

Hager has discovered that surgery is not a collection of grand and complex motions, but rather a series of discrete, definable, and—most importantly—teachable smaller submovements. What’s more, he has now cataloged all these smaller movements into a sort of dictionary of surgery. He calls these submovements dexemes, a term related to the linguists’ word “phoneme,” which describes the discrete sounds used in spoken language. Instead of a dictionary of sound, Hager created a dictionary of motion.

Those mathematical representations can be used in training devices for future surgeons to practice on without risk to patients: They can hone their skills before taking on the challenge of real surgery.
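
One simple way to picture building such a dictionary is sketched below: short windows of recorded instrument motion are clustered, and each cluster centroid stands in for one “dexeme.” Published work on surgical gestures uses richer statistical models (such as hidden Markov models), and the trajectory data here is synthetic.

```python
# Illustrative sketch only: building a crude "dictionary of motion" by
# clustering short windows of recorded tool trajectories. The data is a
# synthetic stand-in for instrument kinematics.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic stand-in for recorded instrument positions over time: (x, y, z).
trajectory = np.cumsum(rng.normal(scale=0.01, size=(3000, 3)), axis=0)

# Slice the trajectory into short overlapping windows and flatten each one.
window, stride = 30, 10
segments = np.array([trajectory[i:i + window].ravel()
                     for i in range(0, len(trajectory) - window, stride)])

# Cluster the windows: each cluster centroid plays the role of one "dexeme".
n_dexemes = 8
kmeans = KMeans(n_clusters=n_dexemes, n_init=10, random_state=0).fit(segments)

# A recorded procedure can then be transcribed as a sequence of dexeme labels.
print("Dexeme sequence (first 20 windows):", kmeans.labels_[:20].tolist())
```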

In recent years, Hager has moved beyond data-intensive robotic surgery into surgeries that are not so quantifiable, namely the scads of video images captured during endoscopic surgeries.

A decade and a half into his pursuit, Hager is eager for the challenges that lie ahead. He’s working on several ancillary projects, including surgical simulators that use machine learning to teach basic skills.

“These computers are true mentors,” he says. “I’m excited to bootstrap this and move it into the real world.”

 

Software for Hard Hats

A computer scientist and inveterate entrepreneur, Anton “Tony” Dahbura ’81, PhD ’84, took note one day when his son shared a growing concern about workplace safety.

“My son, who owns a demolition company, said, ‘Dad, sooner or later someone’s going to get smushed.’ And I got to thinking how we might solve the problem,” recalls Dahbura, executive director of the Johns Hopkins Information Security Institute.

The elder Dahbura borrowed a few off-the-shelf Bluetooth signaling devices—known as iBeacons—and mounted them atop workers’ hard hats. The beacons “ping” periodically with a radio signal unique to each hard hat. Dahbura then placed a network of Bluetooth receivers around the exterior of an excavator.

Next, the computer scientist in him took over. Dahbura recruited students in the Department of Electrical and Computer Engineering and in the Center for Leadership Education to help him write software that triangulates the position of each hard hat around heavy equipment—such as excavators, cranes, and trucks—and plots it on a monitoring screen mounted in the equipment’s cab. If a worker gets too close, an alert sounds, and visual cues flash on screen. The system, dubbed Blindside, is like air traffic control for the work site.
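
The sketch below illustrates the general approach to such a proximity alert, not the patented Blindside implementation: each receiver’s signal strength is converted to an approximate distance with a log-distance path-loss model, the hard hat’s position is estimated by least squares, and an alert fires inside an assumed danger radius. All constants and readings are made up for the example.

```python
# Illustrative sketch only: a beacon-based proximity alert. Constants and
# RSSI readings are assumptions, not values from the Blindside system.
import numpy as np

TX_POWER = -59.0     # assumed RSSI at 1 m, in dBm
PATH_LOSS_N = 2.0    # assumed path-loss exponent
DANGER_RADIUS = 3.0  # metres

def rssi_to_distance(rssi):
    """Log-distance path-loss model: d = 10 ** ((P0 - RSSI) / (10 * n))."""
    return 10 ** ((TX_POWER - rssi) / (10 * PATH_LOSS_N))

def locate(receivers, rssis):
    """Linearized least-squares trilateration from three or more receivers."""
    d = np.array([rssi_to_distance(r) for r in rssis])
    p = np.asarray(receivers, dtype=float)
    # Subtract the first receiver's circle equation from the others.
    A = 2 * (p[1:] - p[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(p[1:] ** 2 - p[0] ** 2, axis=1)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Receivers mounted around the excavator (cab at the origin, coordinates in m).
receivers = [(-2.0, -3.0), (2.0, -3.0), (2.0, 3.0), (-2.0, 3.0)]
rssis = [-69.0, -67.0, -72.0, -73.0]  # one hard hat's pings seen by each receiver

position = locate(receivers, rssis)
distance = np.linalg.norm(position)
if distance < DANGER_RADIUS:
    print(f"ALERT: worker {distance:.1f} m from the cab")
else:
    print(f"Worker at {distance:.1f} m, clear")
```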

In July 2018, Dahbura and two colleagues were awarded U.S. Patent 10,026,290 for Blindside. He has since developed kits to help heavy-equipment manufacturers retrofit existing equipment and is working with them to integrate Blindside into new vehicles. Recently, government regulators and insurance companies have taken note. Dahbura’s hope is that Blindside will become mandatory at all job sites.

“The financial consequences of a workplace accident can put a small company out of business,” Dahbura says. “But that’s nothing compared to the consequences of losing a life.”