A Question of Ethics

Fall 2008

Illustration by Michael Gibbs

As technological advances lead to new materials, methods, and opportunities, Johns Hopkins engineers find themselves grappling with limitless possibilities and unexpected challenges.

For thousands suffering from Parkinson’s disease, an advance in biomedical engineering known as deep brain stimulation appears to be a godsend. The procedure uses an electrode inserted into the brain to help calm Parkinson’s often debilitating tremors. “There are striking images of people who’ve received the procedure,” says Murray Sachs, recently retired director of Biomedical Engineering. “The tremor goes away like that,” he says, snapping his fingers.

But it’s also, potentially, something far less beneficial. Deep brain stimulation works by exciting the dopaminergic regions of the basal ganglia, where neurons are dying. “If you can stimulate the dopaminergic region, you can stimulate anything,” explains Sachs. “What if it causes psychological changes? What if it causes learning problems? At what point do you pull the stimulation?” These aren’t just hypothetical questions for Sachs. He himself suffers from Parkinson’s.

The ability to control a person’s behavior via an electrode could be both a wonderful cure and a potentially horrible crime, he points out.

“This is a problem,” Sachs says. “It’s not in the future. It’s right now.”

WE LIVE IN AN AGE when technological advances occur with lightning swiftness. But there’s a crucial element that those strides have sometimes leapfrogged. Though it’s not as obviously critical to engineers as stress tolerances or reactive properties, the role of ethics—the analysis of right and wrong and the gray area in between—is just as vital.

“Engineers are supposed to be building things to make life better,” notes Allan Bjerkaas, associate dean for Engineering and Applied Science Programs for Professionals (EPP). “In our society now, where we are building things that could have unexpected impacts on our lives, we need to think clearly about how to do that safely.”

Ethical dilemmas are hardly new to the field of engineering. One need only look back to the waning years of World War II, when a small group of engineers found themselves with an ethical question of unparalleled import: “Should we build and detonate an atomic bomb?” That fateful technological leap—in which the science predicted and developed by physicists was put into real-world practice—marked the beginning of the modern era of engineering ethics. “There is a shadow over engineers that says, ‘You don’t pay enough attention to social and moral issues,’” says Sachs. “For 16 years, my colleague Eric Young and I had Friday dinners with our wives and four children. When those children were younger, most of the dinners would be spent with them accusing us of being perpetuators of the atomic bomb.” He pauses, then adds, “Interestingly, two of those kids ended up as scientists.”

In the years since the advent of the atomic bomb, ethical oversights within engineering—whether deliberate or compounded by negligence and lack of communication—have proved fatal, often casting a pall on the profession’s public image that has taken years to lift. The Ford Pinto’s fiery design flaw of the 1970s. The 1981 Kansas City Hyatt Regency walkway failure. The 1986 space shuttle Challenger explosion. The 2007 Interstate 35W bridge collapse in Minnesota.

Disasters like these still weigh heavily on the minds of engineers today. They know that, ultimately, someone did something wrong that led to each of these failures, whether it was ignoring warnings, performing substandard work, or bowing to corporate financial pressures. “When Ford manufactured the Pinto, knowing that the gas tank could rupture and explode at a low-speed impact, the public might have asked why the engineers allowed this to happen,” says Glenn Rahmoeller, an engineer who has taught ethics for a decade (including four years at Hopkins) and who currently lectures in the EPP program. “When a bridge falls down, society loses confidence in [engineers]. People ask, ‘How could this happen?’”

What’s upped the ante today is the increased pace of technological advances. “As engineers advance into unprecedented territory, we face increasing ethical dilemmas,” says Whiting School dean Nick Jones. In fields like biomechanical engineering, nanobiotechnology, and information technology, new materials and modes of data interaction that didn’t exist even a decade ago are being created and implemented with breathtaking speed.

In this ever-shifting landscape, when there’s often no telling where innovations will lead, the need to consider ethics has never been greater for engineers, notes James G. Hodge Jr., of the Center for Law and the Public’s Health, a collaborative center at both Johns Hopkins and Georgetown universities.

“There’s always room for technology to surpass what we perceive as possible,” he says. “Letting technology speed ahead of the ability to assess impacts—that can be dangerous.”

One of the fields in which safety is an ever-present and growing concern is nanobiotechnology—a rapidly emerging discipline that unites biotechnology with nanotechnology. Researchers already use materials at the nano scale (from 100 nanometers down to the level of individual atoms) in everything from sunscreen to water filtration systems to stain-resistant slacks. Creating material at that small a scale presents both enormous opportunities and innumerable questions, notes Marc Donohue, vice dean for research.

“First, nano is important not because it is smaller but because, in the nano region, fundamental physical properties are different,” Donohue notes. “We don’t know how to predict what they are. The biological properties are different too. Technology has gone beyond our scientific understanding of the implications of this.”

“If we are developing new technologies, it’s critical that we are also looking at the societal impact at the same time they are being developed—not after they have been released.” Peter Searson

“There’s an interesting duality in this area,” says Peter Searson, the Joseph R. and Lynn C. Reynolds Professor. “We need to be cognizant of nanoscience’s potential public health issues, but the flip side is that the science and tools that come out of this endeavor have beneficial impact and can solve problems.”

Another issue that compounds the difficulty of nanobiotechnology research is its broad impact across the physical spectrum. “It’s an incredibly multidisciplinary problem,” says Searson, director of the Johns Hopkins Institute for NanoBioTechnology, which brings together some 162 faculty, staff, and researchers from Engineering, Public Health, Medicine, Arts and Sciences, and the Applied Physics Laboratory. “It’s the complexity of the problem now that distinguishes it. If you want to understand how a nanoparticle will interact with a cell, you need to understand the cell. There’s the composition of the particle, the shape of the particle, its size. If we keep going, there are the various interactions we need to consider: What does the cell see as it comes into contact with the nanoparticle? It will respond in different ways to different biochemical cues. It requires scientists and engineers with very diverse backgrounds to address these issues.”

To illustrate Searson’s point, consider the current example of sunscreens that use titanium dioxide nanoparticles to make the lotion more effective at filtering out harmful ultraviolet rays. So far, so good.

But when the person wearing that sunscreen takes a shower, those nanoparticles are washed off the skin and into the public sewer system. The civil engineers working at the water treatment plant downstream will then be handling the nanoparticles: They need to know that they are coming, understand the science and policy issues associated with them, and prepare the water system. How will the particles affect the environment? The material has jumped into a discipline that isn’t immediately obvious, and that’s part of the challenge of nanobiotechnology.

This complexity is a challenge both in the lab and in the real world because the public, too, has to be educated about it. “This isn’t something a research group can work on for six months and come up with all the answers,” Searson continues. “There’s an almost infinite number of combinations. We have a huge matrix and a clamor for one single answer. We need to make the public realize this is a very complex subject. We have to find more effective ways of conveying that.”

The exponential increase in the amount of—and access to—information that the Internet gives society has presented computer and software engineers with their own unique set of challenges.

“I’ve been quoted as saying, ‘When the Internet was created, almost everybody naively assumed that people are going to play fair,’” says Gerry Masson, director of the Johns Hopkins University Information Security Institute (JHUISI).

For Masson and his colleagues, the nemeses are not physical, like rust, water, or weight: The enemies they are fighting are other human beings, malevolently intent on bypassing security measures and stealing personal data and information. “Anything as complicated as the Internet has major flaws in it, and those flaws can be exploited,” Masson says. “A lot of really great software can be used as a weapon. The designers never thought of that possibility.


“The information security area is interesting,” he continues. “There’s a neat aspect, which is that you have to tell everybody what you’re doing. And you almost have to invite people to see if they can come up with a way to get in.”

And if they do? JHUISI is working on developing answers to that ethical question as well. “We’re looking at the ethics of discovering flaws and what you do with that information,” Masson says. “Let’s say I discover that my smart key [a purely electronic key] can be used to get into any Toyota Prius. Do I tell Toyota? Do I make this info known?”

The answer, according to JHUISI, is yes. Take two recent examples of software issues made public by JHUISI faculty. In what may be their most celebrated case, Computer Science faculty member Avi Rubin and other researchers at Independent Security Evaluators (a private company founded by Rubin) were the first to find a way to hack into Apple’s popular new iPhone, allowing outside entities to take control of the device. The researchers immediately alerted Apple to the vulnerability and even created a software patch, ready to hand over to the company, that could solve the problem. Rubin and his colleagues also gained national attention after they revealed serious security flaws and lapses in electronic voting machines.

“When you find a vulnerability, it’s kind of naïve to think that someone else won’t discover it as well. If you think you’re the only person who finds a flaw, that’s arrogant,” says Masson. “What you have an obligation to do is to identify the proper channels and let the information be known.”

Is engineering education keeping up with the need to equip a new generation of engineers with an ethics-focused approach to their work and research?

“Three or four years ago, the answer [for BME graduate students] was absolutely not,” says Murray Sachs. “We did a terrible job for many years. We used to have a graduate student retreat once a year, at a place chosen by the students.” For about a day, the students would break into groups and discuss examples of ethical challenges. And that was it.

“Then we, as a department, mandated the teaching of ethics,” he says. “The NIH also mandated it.” Now, the topic of ethics is infused into courses within each department, and plans are being developed to expand the courses available to students going into fields in which ethics will play a guiding role. More lectures are devoted to getting engineers to consider the impact of their work, and to talk about concerns and questions. Sachs himself plans to debut a course next spring for graduate students; it will take place off campus, in a relaxed environment aimed at getting students to open up, he says.

In the lab, Hopkins Engineering faculty have already started to increase interdisciplinary collaboration in hopes of improving the quality of research while minimizing unforeseen consequences (like those Searson raised in the example of nanoparticles in sunscreen).

“When engineers come up with a new technology, they ought to talk to people in other disciplines early on,” says Rahmoeller. “Talk to sociologists and experts from the scientific disciplines that will use the technology. Study the long-term effects instead of waiting for a problem to occur many years down the road. In the case of nanotechnology, for example, some 95 percent of the research budget is spent on studying the potential benefits and only about 5 percent on the potential harm to individuals and society. There should be a greater balance in the funding of this research.”

“If we are developing new technologies,” agrees Searson, “it’s critical that we are also looking at the societal impact at the same time they are being developed—not after they have been released.”

For five years, Hodge has been teaching ethics to engineering and information technology undergraduate and graduate students at Hopkins. “I find that when I get them out of their element, this is a tough class for them,” he says. “The class requires them to step away from their world of programming and design, and be placed in a world that is focused on people who are affected by their programs.

“My mentality is not to limit technology for the sake of limitation,” says Hodge. “That’s not realistic. It’s about smart use. It’s about anticipating potential impacts. Technological innovations can lead to tremendous health benefits. But let’s not create unintended, negative public health consequences along the way.”

“It’s helpful for an engineer to sleep at night if they know the overall picture, know what’s out there,” Bjerkaas adds. “And to know what they don’t know.”