

Engines of Exploration - Putting the BIG in Big Science

By Thomas K. Grose

Abingdon, England

Inside the control room of the Joint European Torus (JET) at Britain’s Culham Center for Fusion Energy (CCFE), a clutch of physicists and engineers gaze with rapt attention at a triptych of wall monitors. On the screens appears a ghostly, red apparition dancing in the dark like the Northern Lights. It’s plasma: a light, superheated gas created when a mixture of deuterium and tritium -- two hydrogen isotopes -- is puffed inside JET’s doughnut-shaped containment vessel (see cover), then zapped with a high-powered pulse of electricity. For a few seconds, temperatures reach tens of millions of degrees – “the hottest place in the universe,” says Nick Balshaw, a JET diagnostic engineering group leader.

JET is the proof of concept that energy can be harnessed from the fusion of nuclei. Already, its superhot plasmas have created up to 16 megawatts of power, hastening the day, it is hoped, when fusion power can provide the world with a clean, safe, and pretty much unlimited supply of energy. Yet JET still requires slightly more power to heat the gas than it produces, because of heat loss. So scientists and engineers are taking knowledge largely gleaned from JET to the next stage of the fusion experiment: the $19 billion International Thermonuclear Experimental Reactor (ITER) in the south of France, due for completion in 2018. Twice JET’s size, ITER is intended to produce many times more power – 500 megawatts’ worth – dwarfing the amount needed to produce plasma. “Size matters for fusion,” says Ken Blackler, who is in charge of construction and operations at ITER. “It’s easier with a bigger machine” because more volume reduces heat loss.
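The arithmetic behind this comes down to a single ratio, the fusion gain factor Q: fusion power produced divided by heating power put in. A minimal sketch follows, using JET's published 1997 record and ITER's stated 500-megawatt target; the heating-power figures (roughly 24 MW for JET's record shot, roughly 50 MW for ITER) are commonly published numbers assumed here, not figures from this article.

```python
# Fusion gain factor Q = fusion power out / heating power in; Q > 1
# means the plasma yields more energy than it takes to heat it.
# Figures are illustrative: JET's 1997 record (~16 MW out for ~24 MW
# of heating) and ITER's widely cited design target (500 MW for ~50 MW).

def fusion_gain(p_fusion_mw, p_heating_mw):
    """Return the gain factor Q for the given powers in megawatts."""
    return p_fusion_mw / p_heating_mw

jet_q = fusion_gain(16, 24)     # below 1: still short of break-even
iter_q = fusion_gain(500, 50)   # 10: ten times break-even

print(f"JET  Q = {jet_q:.2f}")
print(f"ITER Q = {iter_q:.0f}")
```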

Plasma physics projects like JET and ITER, along with particle physics experiments conducted at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) in Switzerland — the world’s largest particle accelerator -- as well as the breathtaking astronomical observations made with the Hubble Space Telescope, are all high-profile examples of Big Science. These are massive scientific investigations at the national or international level requiring teams of hundreds of researchers, and budgets that can easily run into billions of dollars. And none of them would be possible without strong, working alliances of scientists and engineers -- alliances so close and collaborative that the traditional boundary between the two domains blurs. “The distinctions between science and engineering are totally redundant in big technology science projects,” asserts Gerhard John Krige, a science and technology historian at Georgia Tech.

WHAT: Hubble Space Telescope - WHERE: Positioned above the atmosphere - WHEN: Launched in 1990 - TECHNOLOGY: Offers a clearer view of the universe than ground telescopes by removing the distortion caused by shifting air pockets. - RESULTS SO FAR: Fixed the age of the universe at 13-14 billion years; played a key role in the discovery of dark energy; revealed galaxies in all stages of evolution; sent back thousands of images; informed 6,000-plus scientific articles.

Historically, the dividing line between the two fields was simple and clear. As Theodore von Kármán, the late Hungarian-American engineer and physicist, once succinctly summed it up, “Scientists study the world as it is; engineers create the world that never has been.” But that dividing line began to disintegrate in earnest during World War II with the huge government-financed research and development effort that created the atomic bomb. That was the beginning of Big Science. And as experiments scaled up, physicists learned that they needed to work with and understand the engineers who would design their mansion-size machines and incredibly precise instruments. The 1984 Nobel Prize in physics consummated the collaboration, going jointly to Simon van der Meer, a Dutch engineer, and Carlo Rubbia, an Italian physicist, for their discovery at CERN of the W and Z particles. “Engineers (have) utterly transformed physics,” says Peter Galison, a Harvard University science historian, by making possible experiments and revelations once considered out of reach. Ken Blackler agrees. At ITER, he says, “physics is driving the engineering, and engineering drives the physics.” Electrical engineers are particularly important to big physics projects, but nearly all the engineering disciplines -- from mechanical to cryogenic to civil -- have roles to play.

Sometimes a single individual personifies the engineering-science partnership. One is physicist Jerry Nelson of the University of California, Santa Cruz, who came up with the idea of constructing a large telescope mirror using panels that stay in sync with one another. Even so, says Galison, also a physicist, “there’s still a big distinction between string theory and the building of a bridge.” Typically, physicists still tend to dream up the experiments, but they largely do so by working closely with engineers to know what’s doable. “Otherwise,” Krige says, “it’s just dreaming.”

The importance of engineering to Big Science is obvious in the equipment and instruments here at JET. While the core technology – called tokamak – was the invention of celebrated Soviet physicist Lev Artsimovich, the 3-meter-thick walls to contain the intense heat, the gigantic electromagnets that can shape plasma “like a big blob of jelly,” in Balshaw’s words, and the 4,000 metal tiles inside the vessel all spell engineering. In addition, Balshaw says, “the design of plasma experiments is not just left to the physicists.” JET’s chief engineer routinely “performs experiments to check the forces inside the vessel.”

Beyond showing what’s doable, engineers enable science to push the frontiers of discovery, each revelation opening new avenues for inquiry. Few areas of Big Science depend on engineering more than astrophysics. And few projects show the engineering-astrophysics collaboration more clearly than the $2 billion Hubble Space Telescope, still capable of stunning discoveries 22 years after its April 1990 launch. Everything about the Hubble involved “unprecedented” engineering, says its project scientist, Bob O’Dell, 75, an observational astrophysicist at Vanderbilt University. Its truss, which housed five scientific instruments, was made from a then-revolutionary graphite epoxy. “It was a metallurgical challenge,” because it had to withstand wild fluctuations in temperature, with one side heated by the sun, the other facing deep, cold space. “It was imperative that the engineers and scientists work closely,” O’Dell says, and “there was continuous give-and-take” between what the scientists wanted and what the engineers could deliver.

Hubble initially looked like a failure, because its main mirror was made to the wrong prescription and couldn’t focus light received from distant objects. The flaw wasn’t discovered until it was in orbit. But Hubble was designed to be reachable for upgrades and fixes via the space shuttle. So in December 1993, a team of astronauts was able to fit the telescope with two devices that acted like a pair of glasses and corrected the myopic mirror’s vision. Over the years, Hubble was given four other upgrades by astronauts, which have extended its working life to the present time. “The planning for obsolescence with updates and repairs, that was truly unique with the space program. It had never been done before,” O’Dell says.

Game Changer

Thanks to that kind of engineering, Hubble became a star in its own right, making possible the direct detection of extrasolar planets, discovery of the ubiquity of supermassive black holes, and what O’Dell calls “a game changer”: confirmation of the acceleration of the universe. In February, astronomers using Hubble found exoplanet GJ 1214b, which they say belongs in a new class of “water-world” planets. Hubble’s successor, the $6.2 billion James Webb Space Telescope, set for a 2018 launch, will have roughly six times the light-gathering capacity of Hubble, allowing it to probe the farthest reaches of the universe and every phase of its history since soon after the Big Bang.

While the Webb rides its elliptical orbit a million miles out in space, huge Earth-bound telescopes – each an engineering marvel – will compete for astronomical thrills from high, remote regions in Chile, far from light pollution and so dry as to be free of cloud cover most of the year. The $1 billion Atacama Large Millimeter/Submillimeter Array (ALMA), a project of the European Southern Observatory (ESO) and other global partners, will focus on where the first stars formed some 13 billion years ago at the dawn of the universe. Its 66 carbon-fiber radio antennas – each weighing 150 tons, yet movable to 192 different pads scattered around the site – will be linked by subterranean fiber-optic cables so that they work like one giant telescope with a timing precision of one millionth of a millionth of a second. Gathering light not visible to the eye, at the microwave level, it will “see” through the clouds of soot and dust in the universe that can obscure the vision of optical telescopes.
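That picosecond figure is not arbitrary: in an interferometer, a clock error translates directly into a path-length error of the speed of light times the error, and the path error must stay small compared with the millimeter-scale wavelengths ALMA observes. A quick back-of-envelope check:

```python
# A timing error dt between antennas becomes a path-length error of
# c * dt, which must be small relative to millimeter wavelengths.

C = 299_792_458        # speed of light, m/s
dt = 1e-12             # one millionth of a millionth of a second

path_error_mm = C * dt * 1_000   # convert meters to millimeters
print(f"path-length error: {path_error_mm:.2f} mm")   # about 0.30 mm
```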

Brian Ellison, the U.K. project manager for ALMA, says that “in many areas we have pushed the boundaries of engineering technology,” aided by advances in fiber optics and high-speed computing. The sensors it will use to detect signals from space are new. A single detector would have required too much bandwidth, so it was decided to divide the chore into an array, which made the instrument workable but more complex.

In 2020, a second window on the universe will open on the Chilean plateau. The Giant Magellan Telescope is being built by a U.S.-Australian-South Korean consortium that includes the Carnegie Institution for Science, the University of Texas at Austin, Texas A&M University, the University of Arizona, the University of Chicago, and the Harvard-Smithsonian Center for Astrophysics. Distinctive for its seven 27-foot-wide mirrors, produced using a one-of-a-kind furnace at Arizona’s Steward Observatory Mirror Lab, the GMT boasts the ability to acquire images 10 times sharper than Hubble’s.

WHAT: Giant Magellan Telescope - WHERE: Atacama Desert, Chile - WHEN: Begins operating in 2020. - TECHNOLOGY: Will collect more light and have better resolution than previous telescopes. - PURPOSE: Search for life on other planets; gather data on how galaxies formed; probe the mysteries of dark matter, dark energy, and the fate of the universe. - IMAGE CREDIT: GMTO Corporation

Chile also will be home to the $1.4 billion European Extremely Large Telescope (E-ELT), the world’s largest optical telescope. Its 128-foot mirror will be fashioned from 800 hexagonal segments that constantly align themselves. “Basically,” says Colin Cunningham, the E-ELT U.K. project manager, “it’s an achievement of metrology and computer control, and it’s also a major engineering achievement to manufacture it. A few years ago, it would take you two years to make each mirror.” Thanks, however, to recent engineering breakthroughs, 1,000 segments – including 200 backups – will be made within three to four years.

E-ELT will use adaptive optics on an unprecedented scale to correct the blurring of images by the Earth’s atmosphere. Adaptive optics measures and models the distortions, then feeds the information to smaller mirrors, which cancel them out. “It’s a bit like a pair of specs with lenses that modulate 1,000 times a second,” Cunningham says. Scheduled for completion in 10 or 11 years, E-ELT will give astronomers views of the first bursts of light in the universe and search for Earth-like exoplanets, including those that have the potential for life.
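In spirit, adaptive optics is a simple feedback loop: measure the residual distortion, command the mirror to cancel part of it, and repeat at kilohertz rates. Below is a toy sketch with invented numbers (a static distortion and a loop gain of 0.5), not a model of any real instrument:

```python
# Toy closed-loop adaptive-optics correction: a sensor measures the
# residual wavefront error, a deformable mirror is nudged to cancel
# part of it, and the loop repeats -- in a real instrument, roughly
# 1,000 times a second. All numbers here are invented.

atmosphere = 1.0   # a fixed wavefront distortion (arbitrary units)
mirror = 0.0       # the deformable mirror's current correction
gain = 0.5         # fraction of the measured error removed per cycle

for _ in range(20):
    residual = atmosphere - mirror   # what the wavefront sensor sees
    mirror += gain * residual        # cancel part of the error

# The residual error shrinks geometrically toward zero.
print(f"residual after 20 cycles: {atmosphere - mirror:.2e}")
```

Real systems face a moving target, of course: the atmosphere changes between measurements, which is exactly why the loop must run so fast.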

While the mysteries of space challenge physicists and engineers, so do its dangers. Dubbed the “front line of Earth defense,” the Panoramic Survey Telescope & Rapid Response System (Pan-STARRS) at the Haleakalā summit on Maui maps one sixth of the sky each month, scouting for asteroids that could slam into our world. Small in relation to the E-ELT, it nonetheless claims the world’s largest digital camera – 1,400 megapixels – able to capture an area of the sky as big as 36 full moons in one exposure.

Like Giant Microscopes

As technology reveals an ever greater expanse of the cosmos, it is also probing the universe in its smallest possible form. Experiments at accelerators like CERN’s $5.5 billion LHC smash subatomic particles into one another to re-create in miniature the conditions that existed a fraction of a second after the Big Bang. It’s hoped that the LHC will eventually reveal proof – perhaps by this summer – of the elusive Higgs boson, the theoretical “God particle” that imparts mass to everything else. The collider’s particle beam has a maximum energy of 7 tera-electronvolts (TeV) and races around a 17-mile underground loop. Its detectors build on the Nobel Prize-winning inventions of Georges Charpak, a Polish-born French physicist who earned his bachelor’s degree in mining engineering.

David Wark, a British physicist on CERN’s advisory committee, likens the huge superconducting detectors to “giant microscopes” that can see at the smallest resolution ever, including particles as they decay. “They are an absolute testament to engineering,” says Wark, who studies neutrinos at another accelerator, J-PARC in Japan. “If it were not for engineering advances, there would be no LHC.” Not that there weren’t setbacks. The failure of one of its superconducting magnets brought the collider to a halt in September 2008. But Wark calls it amazing that engineers brought it back on line just 14 months later and have run it since without mishap. Still operating at only half power, 3.5 TeV, it will shut down for 20 months so that all the interconnects can be replaced. Once that job’s done, it should be full speed ahead for the LHC.

High-performance computing has become an essential element of scientific discovery, and electrical, computer, and software engineers constantly stretch the limits of speed. Jack Dongarra, a professor of computer science at the University of Tennessee, Knoxville, notes that science was once done either on paper -- writing a theory -- or by building an experiment. “Today, we can augment those with simulations. We use high-performance computers to simulate physical phenomena that are too big, too expensive, or impossible to test otherwise” -- like modeling climate change, or the impact of two galaxies colliding.

Counting on Moore’s Law

The construction of CERN’s LHC was based mainly on simulated prototypes. “It’s a very complex machine, but it worked on Day 1,” Wark says, because of the efficacy of high-powered computer modeling. The Jaguar supercomputer at the Oak Ridge National Laboratory, which is expected to regain its place as the world’s fastest when an upgrade is completed next year, is helping with the design of the ITER reactor. And Cunningham says that the E-ELT is designed to use computing power that does not yet exist. “We’re counting on Moore’s Law to continue,” he says, referring to the 1965 observation by Intel cofounder Gordon Moore that the number of transistors on a chip doubles approximately every two years.
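Moore's observation is easy to turn into a back-of-envelope projection. The sketch below assumes the commonly cited two-year doubling period and is purely illustrative:

```python
# Back-of-envelope Moore's-law projection, assuming a two-year
# doubling period. Illustrative only.

def moores_law_factor(years, doubling_period=2.0):
    """Growth factor after `years` of doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Growth over the decade or so until E-ELT's planned completion:
print(f"{moores_law_factor(10):.0f}x")   # 2**5 = 32x
```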

WHAT: Jaguar Supercomputer - WHERE: Oak Ridge National Laboratory, Tenn. - WHEN: Installed in 2005 - TECHNOLOGY: Uses a multiple-processor distributed-memory system with a peak performance of 1,750 teraflops. - PURPOSE: Advances research by government, universities, and industry into climate, renewable energy, fusion, astrophysics, seismology, and materials science.

Harvey Borovetz, head of the bioengineering department at the University of Pittsburgh, says high-performance information technology is key to almost all bioengineering research, from bioinformatics, to future imaging devices that will detail the effects of pharmaceuticals at the nano level, to personalized medicines based on data from the human genome. The supercomputers used for modeling climate change can also be used to comb through medical data sets to prepare strategies for treating cancer patients, Borovetz says. “Because of computing power, medical problems that are intractable now won’t be in 50 years’ time.”

High-performance computing is also central to operations at JET. Each pulse -- and there are 20 to 25 each day JET is operating -- creates 40 gigabytes of data, requiring a massive data-storage system. Each maintenance and repair maneuver by robotic arms is first practiced using virtual reality programs.
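From the figures above, the daily storage demand is simple arithmetic:

```python
# Rough daily data volume at JET, from the figures quoted above. The
# 25-pulse figure is the upper end of the 20-25 pulses per operating day.
pulses_per_day = 25
gigabytes_per_pulse = 40

daily_gigabytes = pulses_per_day * gigabytes_per_pulse
print(f"{daily_gigabytes} GB per operating day")   # 1000 GB, about 1 TB
```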

While the dazzling prospect of abundant clean energy gives governments a rationale for backing JET and ITER even in tough fiscal times, areas of Big Science that offer discovery for its own sake are bumping up against austerity budgets in the United States and across Europe. Matthew Hourihan, who analyzes the federal research and development budget at the American Association for the Advancement of Science, notes that the White House’s 2013 budget would make big cuts to high-energy physics funding, and that’s before Congress has its say. Adds Harvard’s Galison: “It is tough sledding for science now, given budget constraints.” So the next challenge for Big Science engineers may be more prosaic than containing hotter-than-sun temperatures, but no less difficult: finding new ways to keep the money rolling in.


Thomas K. Grose is Prism’s chief correspondent, based in London.




© Copyright 2012
American Society for Engineering Education
1818 N Street, N.W., Suite 600
Washington, DC 20036-2479
Telephone: (202) 331-3500