There is only one way in. The lone door is always locked. The one window is secured. The ambitious thief is advised not to try to enter the room physically. Better, instead, to hack one's way in. Because inside the locked room are 30 computers, some of which contain the secrets of a fictitious company known as the 532 Corporation. It is a fair bet that somewhere inside the company, someone left an electronic door ajar, opening the entire enterprise to attack. Bet on this, too: 532 Corporation will be attacked because every semester at Iowa State University, professors Doug Jacobson and Jim Davis unleash 50 of their students, giving them three weeks to break into the firm, gather a crucial encrypted file, and leave without being detected. "What they're doing would be illegal if they left our little world," says Jacobson.
Already they are learning valuable lessons: that people are the weakest link in any system and that fancy technological fixes and Internet firewalls are of no use if employees choose passwords off the TV show Star Trek. Which, the professors testify, people all too often do. At project's end, students are asked to explain how they would fix what they learned to break. "We're producing the info warrior," says Jacobson. For the NSA, which cares deeply about information security, the program is essential.
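The weak-password lesson is easy to demonstrate. Below is a minimal sketch of the kind of dictionary check an auditor (or an attacker) might run; the blocklist and function names are invented for illustration and are not part of the Iowa State exercise:

```python
# Illustrative dictionary check: flag passwords built from a known wordlist.
# This tiny blocklist is a stand-in; real attackers use lists of millions.
BLOCKLIST = {"picard", "enterprise", "spock", "kirk", "klingon", "password"}

def is_weak(password: str) -> bool:
    """Flag passwords that are short or based on a blocklisted word."""
    base = password.lower().rstrip("0123456789!")  # strip trivial suffixes
    return len(password) < 8 or base in BLOCKLIST

print(is_weak("picard1"))     # True: a Star Trek name plus a digit is still weak
print(is_weak("kV9#mtQz2w"))  # False: long and not derived from the wordlist
```

The point of the sketch is that appending a digit or two to a guessable word defeats no one, which is why the professors hammer on the human link rather than the firewall.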
Computer engineers at schools such as Iowa State are today's Minutemen, alerting and training a new generation of troops for a war the nation hopes never comes. The threat is unlike wars of the past, lacking any set battle lines or clear enemies. Never before has a challenge seemed so vividly real yet invisible. But at the Pentagon and throughout government, officials are imagining the unimaginable: Enemy states or terrorists not content to hack into Pentagon computers but eager instead to do real damage by attacking America's computer-dependent banking system, power stations, and transportation network. Or terrorists, allied with a rogue state, willing to unleash biological or chemical weapons against the American homeland.
These nontraditional threats have prompted a very traditional response. The federal government, just as it did in the Cold War and the Second World War—most dramatically with the Manhattan Project—wants scientists and engineers from industry and academia to find technical fixes that will blunt unconventional attacks. "We can prevail over terrorism by drawing on the very best in our free society: the skill and courage of our troops, the genius of our scientists and engineers, the strength of our factory workers, the determination and talents of our public servants, the vision of our leaders in every vital sector," President Clinton said last January, when he announced a series of new initiatives during a speech at the National Academy of Sciences.
The challenge is rich. And so is the budget to meet it. The president wants to spend $2.8 billion this year to defend against chemical, biological, and cyber threats, including $500 million for research on information security and $400 million for research to defend against chemical and biological attack. And dollars are by no means the only yardstick by which to measure the threat; one need only count the number of special commissions established to study these new dangers to realize how worried top officials are.
The degree to which the American economy depends on computers—and the degree to which different infrastructures depend on each other—was one of the key conclusions of the critical infrastructure report. This past year two of every three companies in an Information Week survey reported being victimized by a virus attack. That corporate America is a target—not just the Pentagon or government agencies—is one reason the NSA is eager to bless programs such as the one at Iowa State, whose graduates may work in the private sector, bringing expertise that can protect businesses from intrusion.
Of course, government and military computers are targets, too. Despite the secretive nature of much military business, the Defense Department hardly operates in a locked room. In 1998, about 95 percent of Pentagon "telecommunications requirements" came from private networks, according to Jeffrey Hunker, the director of the Critical Infrastructure Assurance Office. The most serious breach so far was a scheme dubbed Solar Sunrise, which spooked the Pentagon at the very moment the military was set to launch an air attack against Iraq in 1998. The perpetrators turned out to be two teenagers from California, but they routed their attack through the United Arab Emirates—a reminder that terrorists or enemy states bent on disrupting American information systems can conceal their identities. Cyber terrorists won't be spotted leaving the scene in a getaway car. "What worries me are state actors," says Michael Jacobs, a senior official at the National Security Agency. "They will be more sophisticated and they have no interest in claiming victory."
There is another challenge, related but distinct from the well-publicized threats of chemical and biological terror or information attack. "Where I see the real problem is system complexity," says Jennifer Nelson, a specialist on critical infrastructure issues at Sandia National Laboratories. "Infrastructures are becoming so interdependent that outages in one can cascade over and have effects in others."
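Nelson's cascade worry can be captured in a toy model: treat each infrastructure as a node that fails when anything it depends on fails, and propagate outages until nothing changes. The dependency graph below is invented for illustration and is not Sandia's model:

```python
# Toy cascade model: an outage propagates to every system that depends,
# directly or indirectly, on the failed one. The graph below is invented.
DEPENDS_ON = {
    "telecom": ["power"],             # switching centers need electricity
    "banking": ["telecom", "power"],  # clearing depends on both
    "water":   ["power"],             # pumps need electricity
    "rail":    ["power", "telecom"],  # signaling needs both
}

def cascade(initial_failure: str) -> set:
    """Return every system knocked out by the initial failure."""
    failed = {initial_failure}
    changed = True
    while changed:
        changed = False
        for system, deps in DEPENDS_ON.items():
            if system not in failed and any(d in failed for d in deps):
                failed.add(system)
                changed = True
    return failed

print(sorted(cascade("power")))  # a power outage takes down everything here
```

Even this crude sketch shows the asymmetry Nelson describes: a single failure in a widely depended-upon system ripples outward far beyond its own sector.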
The Clinton administration's response to these new threats was formulated in 1998 when it issued two Presidential Decision Directives, or PDDs, that outline a plan of action for addressing the nation's vulnerability to computer and terror attack. The approach is wide-ranging: it calls for everything from the creation of a "cybercorps" of computer security experts to research on vaccines against biological agents to the creation of local medical teams ready to respond in a crisis.
But some critics believe the president has put too much emphasis on exotic scientific solutions and not enough on simple, common sense. "The quest for a silver bullet is misplaced," says Richard Falkenrath, a Harvard University professor of public policy and author of a book on biological and chemical weapons terrorism. "There are a lot of simpler, lower-tech steps that are not as glamorous. For me, the important stuff is organizational." For example, he argues that local police, fire, and medical units need more and more realistic training for possible chemical or biological weapons attack.
Still, a remarkable range of promising research is underway in response to these new threats, from training computer engineers to prevent intrusion to training insects to detect chemicals. Across these fields and everything in between run some common threads: much of the engineering work being done today involves either detecting threats (be they computer intrusions or biological attacks) or minimizing the damage that a successful attack inflicts.
Sensors are the next big thing. Without a clear understanding of what exactly the threat is, heroic response is impossible. Scientists at Sandia National Laboratories are developing tiny acoustic-wave chemical sensors that they call a "chem lab on a chip." When certain chemicals are absorbed by the sensor, acoustic waves are slowed. The lab expects, initially, that the chips could be built into a handheld device that would permit soldiers or others to detect chemicals. Down the road, the system could be connected to an alarm that would warn of the presence of a chemical.
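The detection principle behind the "chem lab on a chip" can be sketched in a few lines: absorbed chemicals slow the surface wave, lowering the measured resonant frequency, and a downward shift past some threshold triggers the alarm. The numbers and thresholds below are invented for illustration, not Sandia's specifications:

```python
# Illustrative alarm logic for an acoustic-wave chemical sensor: absorbed
# chemicals slow the wave, lowering its resonant frequency. The baseline
# and threshold values here are hypothetical.
BASELINE_HZ = 100_000_000.0   # assumed clean-air resonant frequency
ALARM_SHIFT_HZ = 500.0        # assumed downward shift that warrants an alarm

def check_reading(measured_hz: float) -> bool:
    """Return True if the downward frequency shift suggests absorption."""
    return (BASELINE_HZ - measured_hz) > ALARM_SHIFT_HZ

print(check_reading(99_999_900.0))  # False: 100 Hz shift, below threshold
print(check_reading(99_999_000.0))  # True: 1,000 Hz shift, sound the alarm
```

In a handheld device, this comparison would run continuously against a calibrated baseline, which is what makes the later alarm-connected version a natural extension.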
One sensor that certainly weighs less than five pounds is the wasp. Researchers at Iowa State University, under the auspices of DARPA, are training the insects to locate chemicals. Wasps are able to detect a wide range of chemical compounds and can be trained to locate specific chemical cues. The goal is to raise them in large numbers for use as sentinels to warn about possible chemical attack. There is a second, more ambitious use of the wasps, according to DARPA's Controlled Biological Systems program. The agency wants to develop a biological sensor system "using wasp antennae interfaced with electronics or mounted on an unmanned automated vehicle" to detect production or storage sites for toxic compounds.
Foiling a biological weapons attack doesn't just mean waiting for it to occur and then trying to minimize the damage. Programs funded by the Defense Department aim to find scientific techniques for detecting and attacking weapons facilities and labs where the agents are made. Experts on seismology, electromagnetics, and spectroscopy will be called upon to study the acoustic, seismic, and other signatures of underground weapons facilities. DARPA hopes research may also help military officials perform "battle damage assessments" after an attack to determine whether a hit crippled the facility or not.
For all the high-tech solutions to the problems posed by new threats, there are simpler engineering fixes that, in many ways, may prove more useful and practical. One reason is that they are better matched to the threats of the moment. For all the scenarios conceived about biological or chemical attack, the fact remains that it is a difficult technical challenge to turn an agent produced in a lab into a deadly weapon that can be delivered to a target. It remains far simpler, and often as lethal, simply to rely on conventional explosives. The death toll in the chemical attack on the Tokyo subway was 12. Bombings at Oklahoma City and at the two American embassies in Africa killed hundreds. For a host of reasons, including ease of acquisition and of delivery, conventional explosives remain the weapon of choice for terrorists such as Osama bin Laden.
Continuing research at Sandia will focus on analyzing the effects of bomb blasts on buildings, as well as studying the ways in which glass shatters. The work on blasts will piggyback on Sandia's long-standing relationship with the Defense Department; blast simulations that will determine the effect of explosives on barrier walls and large, multistory buildings will be conducted on supercomputers in association with the government. The work on glass aims to identify ways to reduce the risk to people, including the use of film coatings, tempering, and lamination. "These efforts are designed to provide the technical basis for glass protection design standards and guidelines for anti-terrorism," according to a technical paper by the program's leaders, Rudolph Matalucci and Dennis Miyoshi.
Ironically, perhaps, the research to make safer glass depends on computer modeling, and yet those computers themselves are vulnerable to attack. Protecting information was once the province of the government. Now, because computer networks are decentralized, vulnerability anywhere can mean vulnerability everywhere. "The world has changed," says the NSA's Mike Jacobs. "We no longer own the technology. We alone cannot train enough people to do the job on the job." And that is why the NSA, despite its historic reputation for secrecy, is reaching out to businesses and universities to help them secure their information systems.
Other efforts are shorter on symbolism and longer on substance. NSA is sharing case studies—the sort of information that not so long ago would have been illegal to disclose—with universities so they can model curricula on the latest information available to the government. There is also extensive research, much of which would go to engineering departments, on information assurance. Research fields include work on operating system technologies and ultra high-speed encryption.
And then there's research that raises the fair and obvious question: Why hasn't this been done before? The NSA endorses the use of formal methods for software development. Jacobs, who is the agency's top official for information assurance, observes that the next version of Windows NT will contain more than 40 million lines of code. "How can anyone be sure of what's in there?" he asks. If the top security official at the NSA can't, there's a problem waiting for a fix.
Bruce Auster is a freelance writer in Washington, D.C.