Grave New World

Over the past decade, the burst of new technologies has been breathtaking—and often revolutionary. Pilotless drones track human footprints to help locate bombing targets. Tiny molecular robots made from DNA seek out and destroy cancer cells, leaving healthy cells intact. Brain implants enable humans to control prosthetics by merely thinking what they want them to do. Driverless cars are just around the corner.

But with these breakthroughs have come disturbing new ethical questions that challenge traditional ways of training conscientious citizen-engineers. No longer is it enough for students to be taught how to respond if a boss ignores safety standards. The engineers of tomorrow must grapple with technology that not only empowers humans with spectacular new tools but also threatens to break free of human control. How should they, for instance, view the use of drones that can mistake their targets and kill civilians? Who should control DNA robots — and decide how they’re used? Who is responsible when a driverless car runs over a pedestrian? Should new cars be designed to require hands-free cellphone use?

Such questions call for “more than just ethics-as-usual,” says James Moor, a Dartmouth College philosophy professor who has studied the issue closely. What is needed, Moor argues, is “better ethical thinking that is more proactive, with more interdisciplinary collaboration among scientists, engineers, and ethicists.” Past game-changers — gunpowder, the steam engine, the airplane, the atomic bomb — also posed ethical dilemmas. The big difference between those earlier breakthroughs and today’s is “the incredible pace” and sophistication of scientific development, says Brookings Institution scholar Peter Singer. Technologies produced now raise “questions about issues of right and wrong which we did not have to think about before.”

Potential Abuses

Adding to the complexity is the potential for enormous impact and a convergence of technologies, with achievements in one field paving the way for advances in others. Computerization, nanotechnology, biotechnology, and robotics have changed the way naval architects and marine engineers design and build ships, for example. Nano- and biotechnology are also altering modern medicine. While promising dramatic progress in fighting disease, they have heightened fears about potential abuses across a wide range of applications, from genetic engineering of human beings to manufacturing self-changing materials that could create new creatures or cause serious damage to humans and animals.

Many new ethical issues enter a gray area between personal responsibility and public policy. Cyber technology enables governments (and individual hackers) to send out viruses that can prowl the Internet and ultimately destroy corporate files and disable nuclear facilities, as occurred when the United States and Israel reportedly developed and unleashed the Stuxnet program against Iran. Should such actions now qualify legally as acts of war? The explosion in cyber technology has raised gnawing concerns about individual privacy that weren’t even imaginable a few years ago. These range from intrusive access to personal information to techniques for state control and manipulation that conjure dystopian societies imagined by George Orwell and Aldous Huxley.

“There now are trajectories that can lead to such things, and they are plausible,” says Ronald Arkin, a computer science professor at the Georgia Institute of Technology. Already, certain technologies, such as camera-equipped pills that explore the stomach or colon, touch human lives as never before, notes Michael Mumford, an industrial psychologist who teaches ethics to engineering students at the University of Oklahoma.

Narrowly Structured Courses

Incorporating ethics training into the nation’s established engineering curricula has never been a scientific process. Since 2000, ABET, the accrediting group, has required that engineering graduates be able to demonstrate “an understanding of professional and ethical responsibility.” Professional societies have set up codes of ethics to self-police their members. Engineering schools and societies have introduced ethics centers whose missions include promoting ethics instruction, gathering data, and serving as an information exchange.

Understandably, however, many of the courses and codes of ethics are narrowly structured, designed to deal mainly with workplace-related dilemmas that engineers may encounter. Although leading professional societies have talked about ethics in the context of emerging technologies, there’s no clear sense of where ethics education is headed.

Keith Miller, a professor of computer science at the University of Illinois, Springfield, says that while ABET has encouraged engineering departments to expand ethics education, the accrediting agency has also relaxed specific requirements, such as prescribing the minimum number of hours that schools must devote to ethics courses. “I worry a bit that these look better on paper than as they have actually been implemented,” he says.

With technology racing forward unpredictably and with engineers operating in a global environment, revamping engineering ethics courses to deal with the new world of emerging technologies won’t be easy. Professors breaking most sharply with previous curricula have been those with backgrounds in computer science and robotics. Leaders in the ethics field say faculty members who have been trained in other engineering disciplines often seem least willing to change.

Deborah Johnson, a University of Virginia professor active in the search for ways to adapt ethics training to emerging technologies, notes that many of the questions they raise are issues for society as a whole to decide, not just engineers. “Engineers have a lot to contribute,” she says, “but it’s only a small part” of the whole. She cites other, more practical challenges: Students already must master a jam-packed engineering curriculum, with little time for additional electives; teaching ethics classes holds little prestige for either engineering professors or philosophy professors; and students resist ethics classes because they’re an elective. Nevertheless, “it’s a growing field,” she says. Joseph Herkert, an Arizona State University ethics and technology professor who has been one of the leaders in expanding current ethics training, suggests that pressing engineers to become more publicly involved in ethics decisions would encourage them to learn more about the subject and interact more with communities beyond engineering.

Institutions Respond

While many educators appear slow to adapt, there are signs of change. So far, the biggest drivers have been the National Science Foundation and National Institutes of Health, which require ethics training for professors and graduate students who are seeking grants. The world’s largest technical professional association, IEEE (Institute of Electrical and Electronics Engineers), regularly sponsors conferences at which ethics education is a key topic for discussion. The National Society of Professional Engineers has established a National Institute for Engineering Ethics, while the National Academy of Engineering’s Center for Engineering, Ethics, and Society has begun a major effort to address broad ethical issues. ASEE’s ethics division played a key role in developing a new Code of Ethics for Society members. (See ASEE Today.)

Earlier this year, the European Commission launched the RoboLaw Project, which brings together specialists from engineering, philosophy, law, regulation, and human enhancement to explore whether — and how — legal and ethical standards should be revised in the face of advances in robotics, bionics, neural interfaces, and nanotechnology.

To Arizona State’s Herkert, the major question that engineers must help resolve is one of responsibility: Who should be held accountable for the impact of the emerging technologies? How far does an engineer’s responsibility extend? When you get to autonomous technology, he says, “it goes up an order of magnitude larger.” Brookings’s Singer lists questions not often included in professional societies’ listings: From whom is it ethical to take research and development money? What attributes should you design into a new technology? What organizations and individuals should be allowed to buy and use the technology? Which shouldn’t? “What kind of training or licensing should they have?” he continues. “When someone is harmed as a result of the technology, who is responsible? How is this determined? Who should own the wealth of information that the technology gathers? Who should not own it?”

‘All Too Human’

“We must own up to these challenges, face them, and overcome them. And we had better act soon,” Singer says. “For the threat that runs through all of this is how the fast-moving pace of technology and change is making it harder for our all-too-human institutions, including those of ethics and law, to keep pace.”

Dartmouth’s Moor says the engineering profession should develop a new set of ethics for the emerging technologies gradually, neither rushing to put them into place at the start nor saving the job until “after the damage is done.” At the very least, “we need to try to be both more proactive and less reactive in doing ethics,” he says. “We need to learn about the technology as it is developing and to project and assess possible consequences of its various applications. Only if we see the potential revolutions coming will we be motivated and prepared to decide how to use them.”

Donald Gotterbarn, director of the Software Engineering Ethics Research Institute at East Tennessee State University, says universities don’t need to redesign their entire ethics programs to deal with the emerging technologies; they just need to recognize the ethics questions they pose early in the game and keep up with the challenges as the technology advances. “At the bottom, the ethical questions of engineers haven’t changed,” Gotterbarn says. “The problem is, with every new technology there are surprises, and we need to worry about them early. Convergence adds another layer of complexity and makes it more difficult for us to anticipate the consequences of a particular technology.”

Gotterbarn says engineering schools need to provide students with a broad ethical framework that they can use as those consequences begin to become clear. “We need to keep bringing up the ethics framework with every new development,” he says, “. . . to ask, ‘Is this the right thing to do?’”

Cornell University:

The school offers a one-semester “icebreaker” course, Ethics in Engineering Practice, for undergraduate juniors. Issues posed by emerging technologies are highlighted. As a follow-up, Prof. Ronald Kline and Lecturer Park Doing help guide ethics discussions in various engineering departments. The pair updates examples and case studies regularly, drawing from news articles, scholarly journals, and court cases. Doing, whose own Ph.D. is in philosophy, says the emerging technologies “really have brought the old principles to the forefront.”

Georgia Institute of Technology:

Undergraduates take a one-semester ethics course focusing on the effects of robotics and related technology on society. Professor Ronald Arkin says the idea is to get students up to speed on developments in the emerging technologies, provide them with a background in traditional ethics and philosophy, guide them through the new ethical dilemmas, and teach them how to write and speak effectively so they’ll be able to communicate their ideas and concerns. Arkin formally revises the course every two years and updates it continually from research papers, scientific articles, and his own observations as a widely known researcher. “This is not a course that remains static,” he says. “New issues are constantly cropping up.”

Texas A&M University:

All engineering students must take Engineering Ethics, a one-semester, large-class course that deals primarily with traditional professional ethics and standards and has recently begun covering “aspirational” ethics — the use of engineering to help improve society through green technology, environmental sustainability, and the like. Professor Ed Harris says the school “frankly has not done that much” to cope with the emerging technologies because “we may not be totally convinced yet that they really do introduce new ethical issues” rather than just “raising them in a new form.”

University of Oklahoma:

Graduate students take a two-day class on how to think about ethical issues, a broad offering that includes non-engineers and even art students, and then are pressed to confront specific ethical questions in their regular engineering courses and projects. Professor Michael Mumford, a specialist in industrial and organizational psychology, says the major goal is to teach students how to “think downstream. If you learn to do that, you’re going to have fewer issues” to contend with.

University of Illinois at Urbana-Champaign:

Undergraduate students take a formal engineering ethics course with an added section on emerging technologies. The new section has meant condensing or replacing some issues that used to be covered. In addition to this course, some standard engineering courses include a one-week component that deals specifically with ethics. Professors Colleen Murphy and Paolo Gardoni are developing a new anthology on engineering ethics that focuses specifically on new technologies.

University of Virginia:

The school has a multiyear ethics requirement for engineering students. During the first year, everyone must take a large-class introductory course that focuses on emerging technologies. In their second or third years, students must take one elective course that touches on ethics along with other engineering-related topics. And finally, in their senior year, they must take two ethics courses and write a thesis on ethical or social policies related to their major discipline. Prof. Deborah Johnson says the first-year course has changed significantly over the past five years, but mostly in the way it’s presented rather than the strategy or the curriculum. There’s more emphasis on teamwork and hands-on learning, with simulation technology, online discussions, and social media.

 

Art Pine, a Washington, D.C. writer, covered national security affairs for several major newspapers.

ILLUSTRATION BY DENNIS P. CUMMINGS & ISTOCK/LEONTURA
