When Dennis McLeod thought of the computing power needed
to study earthquakes, and perhaps some day to predict
their occurrence, he thought of classical music. Not in the sense
that a symphony orchestra's crescendos might replicate the rolling
power of the fault lines rubbing against each other, or even that classical
music might have some hidden link to the ebb and flow of plate tectonics.
No, McLeod was thinking in terms of data, tons of data, and how to
manage that data in real time.
McLeod, a professor of computer science at the University
of Southern California (USC), is part of a research team called QuakeSim.
A joint project between NASA and researchers at six universities, QuakeSim
is aiming to create computer models that may unlock the mysteries of
where earthquakes might occur and when. As part of the project, QuakeSim
researchers are pushing the envelope of supercomputing by connecting
many computers in complex grid networks.
McLeod needed to replicate the meta-data that would have
to be processed in the QuakeSim project, so he came up with a plan
to place a symphony conductor in Boston, an orchestra in Miami, and
an audience in Phoenix. The experiment, conducted last spring, used grid computing to transmit audio and video data among the three cities at high enough bandwidth to make it appear they were all in the same room, in real time.
"It was interesting, because we even calculated that the time to send the data across the country at the speed of light was similar to the time it takes the conductor to hear the sounds of the violin section across the room," says McLeod, who studied computer science and electrical engineering at the Massachusetts Institute of Technology. "How this relates to the earthquake research is that we need software systems and grid computing systems that can manage huge amounts of data. There will be ground sensors and satellite imaging systems that will require sophisticated models and supercomputing. One of the keys in our research is linking the supercomputing through grids."
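McLeod's comparison is easy to reproduce with rough numbers. The sketch below assumes a roughly 4,000 km cross-country fiber path, signals traveling at about two-thirds the speed of light in fiber, and a conductor standing about 15 meters from the violin section; those distances are illustrative assumptions, not figures from the experiment.

```python
# Back-of-the-envelope check of McLeod's comparison (illustrative figures only).

SPEED_OF_LIGHT_VACUUM = 3.0e8   # m/s
FIBER_FACTOR = 2 / 3            # light travels at roughly 2/3 c in optical fiber
SPEED_OF_SOUND = 343.0          # m/s in air at room temperature

network_path_m = 4_000_000      # assumed coast-to-coast fiber path, ~4,000 km
concert_hall_m = 15             # assumed conductor-to-violins distance

network_delay_ms = network_path_m / (SPEED_OF_LIGHT_VACUUM * FIBER_FACTOR) * 1000
sound_delay_ms = concert_hall_m / SPEED_OF_SOUND * 1000

print(f"one-way network propagation: ~{network_delay_ms:.0f} ms")  # ~20 ms
print(f"sound across the hall:       ~{sound_delay_ms:.0f} ms")    # ~44 ms
```

Both delays land in the tens of milliseconds, which is the point of the comparison: the cross-country hop is no worse than what a conductor already tolerates in a concert hall.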
The QuakeSim project hopes to use supercomputing and
better data management to create models to more fully understand the
movement of the Earth's plates. The model for this approach is weather prediction, where the use of radar systems, satellite technology, and computer models has allowed forecasters to achieve much better accuracy over the past decade. "It's not like the weather forecasters suddenly became so much smarter overnight," says USC's McLeod. "What happened with weather predicting is that they were able to use more data, and the processing of that data, to greatly improve their predictions."
"The problem with earthquake prediction is that we have never been able to accurately measure what is happening below the ground," says McLeod. "But technology is allowing us to mine more data, and it is becoming richer and more accurate. My guess is that we may never be able to pinpoint an earthquake happening in the next three minutes on a certain fault line, but we may be able to pinpoint a region and predict areas that may be hazardous in the next three months."
The QuakeSim team will use ground- and space-based data collection points to measure movements of millimeters on time scales ranging from minutes to thousands of years. The study of that data, the researchers hope, will yield relationships between fault lines, the influence of quiet plate motions that do not result in earthquakes, and the relationships among earthquakes over historical time. The team hopes to determine whether an earthquake on one fault line has a future effect on another.
Much of the work will be hindsight-based research, meaning the researchers will collect data on earthquakes and then go back in time to see whether any movements occurred with regularity at certain intervals before the earthquakes started. Any commonalities,
however minute, might prove to be the key to predicting earthquakes
with some accuracy.
"Our goals are to be able to create models that study faults in terms of meters and not kilometers," says Andrea Donnellan, QuakeSim principal investigator and deputy manager of the Earth science division at NASA's Jet Propulsion Laboratory in Pasadena, Calif. "Trying to understand the strain that builds up and how faults fail: all of this is very subtle data that needs sophisticated simulation models and high-performance supercomputers."
USC's McLeod estimates that the team needs to manage
data that will be coming in at a gigabit per second. In recent tests,
the QuakeSim team has been able to model earthquake faults by calculating
about 15,000 finite elements through 500 time steps in eight hours
of real time. By June of 2004, the team hopes to be able to calculate
400,000 elements through 50,000 time steps in 40 hours.
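Taken together, those two benchmarks imply a very large jump in throughput. The rough calculation below is a sketch only; it assumes the cost simply scales with elements multiplied by time steps, which real finite element solvers only approximate, but it shows the order of magnitude involved.

```python
# Rough comparison of the QuakeSim benchmark figures quoted in the article.
# Assumes work ~ elements x time steps, an illustrative simplification.

current_elements, current_steps, current_hours = 15_000, 500, 8
target_elements, target_steps, target_hours = 400_000, 50_000, 40

current_work = current_elements * current_steps   # 7.5 million element-steps
target_work = target_elements * target_steps      # 20 billion element-steps

work_ratio = target_work / current_work           # ~2,667x more work
time_ratio = target_hours / current_hours         # only 5x more wall-clock time
throughput_gain = work_ratio / time_ratio         # ~533x higher throughput needed

print(f"work ratio:       {work_ratio:,.0f}x")
print(f"time ratio:       {time_ratio:.0f}x")
print(f"required speedup: {throughput_gain:,.0f}x")
```

Under that simplified accounting, the June 2004 goal would require roughly a 500-fold improvement in effective throughput, which is why the team is leaning on grid computing rather than any single machine.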
One of the keys to the computing research being done by the QuakeSim team is the unified code being developed to run on everything from a supercomputer to a Macintosh. "That is important for researchers and educators, because if you want to work out a simple model just to build up your intuition about something, you want to do that on a local machine; a supercomputer would be overkill," says Donnellan. The QuakeSim software will be available on the Web by the fall of 2004. Users will be able to access it at http://quakesim.jpl.nasa.gov/.
"The availability of the software to researchers around the country is going to have a great educational component," says John Rundle, an interdisciplinary professor of physics, civil and environmental engineering, and geology, and director of the Center for Computational Science and Engineering at the University of California-Davis.
"There will be some models created that will be major advances in the state of the art, and we will understand, to a much better degree, new ways of Web computing," Rundle says. "There will be numerous Web-based educational opportunities for studying the interconnectivity of the grid of thousands of computers. My view is that we are going to eventually be able to use this technology to really advance the prospects of predicting earthquakes."
Another computer scientist working on the project, Indiana University professor Geoffrey Fox, who heads the school's Community Grids Lab, agrees that the QuakeSim research is going to yield great educational dividends: "We are building the software infrastructure that will allow us new ways to manage meta-data," he says.
But whether predicting earthquakes, even aided by gigabits of data and the latest supercomputing power, is possible at all is up for debate. "So far, prediction has been a complete failure," says Les Youd, a Brigham Young University professor of civil engineering who studies ways to safeguard buildings against earthquake aftereffects. "We just haven't found the key yet. Can they find it? It is going to be very difficult. Sometimes there is seismic activity before a quake, and sometimes there isn't."
There is also the problem of what to do with such predictive information. "If you are able to say that an earthquake might occur over a 10-mile area during the next three months, what should the people who live there do?" Youd asks. "People can't move out. And they can't do major construction projects to adequately reinforce a building against a quake. My guess is that people might put grandma's china away and maybe bolt the antique grandfather clock down."
Youd also points out that earthquake prediction had better become a very exact science before any group tries to make its predictions public. He says that some people may have trouble selling their houses if they are in a zone designated as high risk, and insurance rates may become prohibitively expensive for those property owners. "There are economic effects that have to be studied," Youd says.
One economic effect that is undisputed is the amount of damage earthquakes cause. The Federal Emergency Management Agency (FEMA) estimates earthquake damage to property at more than $4 billion a year in the United States alone. Earthquake experts agree that a reliable prediction process would help lower that property damage.
John LeBrecque, manager of the Solid Earth and Natural Hazards Section in NASA's Office of Earth Science, thinks the time is right to invest in earthquake prediction research. LeBrecque says NASA is currently deciding whether to launch a dedicated satellite to study earthquake activity in conjunction with the QuakeSim research, and that the $500 million project is "my program's highest priority." But LeBrecque admits that the funding for such a satellite is in limbo as Congress decides how to rework NASA funding in the wake of the Space Shuttle Columbia disaster.
"I think we have to come to the realization that plate tectonics is the weather of the solid Earth," says LeBrecque. "We can currently measure continental motion to within 100 microns a year. But I know if we invested an equivalent amount of money in earthquake research as we have in weather prediction, we would be much further down the road than we are today."
And for USC's McLeod, the QuakeSim project can be seen on different levels. On the one hand, he says, "In 10 years' time, we will probably be able to tell people they will be hit by a major earthquake within a certain time frame." On the other hand, even if the predicting aspect fails to pan out, the research into computing will have lasting effects for computer researchers and educators.
"We are going to push the limits of what can be done with grid networks and supercomputing abilities," McLeod says. "We are going to be able to deal with multiple sources of information and make some sense out of it, and we are going to progress in everything from processing sensor information to using Web pages. It will take human intelligence to make sense of this data, but we hope we can use this research to help crisis management as they deal with the prospect of a major earthquake."
Dan McGraw is a freelance writer based in Fort Worth, Texas.
He can be reached at email@example.com.