SCEC’s ‘M8’ earthquake simulation breaks computational records, promises better quake models

A multi-disciplinary team of researchers has presented the world’s most advanced earthquake shaking simulation at the Supercomputing 2010 (SC10) conference held this week in New Orleans. The research was selected as a finalist for the Gordon Bell prize, awarded at the annual conference for outstanding achievement in high-performance computing applications.

The “M8” simulation models how a magnitude 8.0 earthquake on the southern San Andreas Fault would shake the region, covering a larger area in greater detail than was previously possible. Perhaps most importantly, the development of the M8 simulation advances the state of the art in the speed and efficiency at which such calculations can be performed.

The Southern California Earthquake Center (SCEC) at the University of Southern California (USC) was the lead coordinator of the project. San Diego Supercomputer Center (SDSC) researchers provided the high-performance computing and scientific visualization expertise for the simulation, while the scientific details of the earthquake were developed by scientists at San Diego State University (SDSU). Ohio State University (OSU) researchers were also part of the collaborative effort, working to improve the efficiency of the software involved.
While this specific earthquake has a low probability of occurrence, the improvements in technology required to produce this simulation will now allow scientists to simulate other, more likely earthquake scenarios in much less time than previously required. Because such simulations are the most important and widespread applications of high-performance computing for seismic hazard estimation currently in use, the SCEC team has focused on optimizing the technologies and codes needed to create them.
The M8 simulation was funded through a number of National Science Foundation (NSF) grants and was performed using supercomputer resources including NSF’s Kraken supercomputer at the National Institute for Computational Sciences (NICS) and the Department of Energy (DOE) Jaguar supercomputer at the National Center for Computational Sciences. The SCEC M8 simulation represents the latest in earthquake science and in computation at the petascale level, which refers to supercomputers capable of more than one quadrillion floating-point operations (calculations) per second.
“Petascale simulations such as this one are needed to understand the rupture and wave dynamics of the largest earthquakes, at shaking frequencies required to engineer safe structures,” said Thomas Jordan, director of SCEC and Principal Investigator for the project. Previous simulations were useful only for modeling how tall structures would behave in earthquakes, but the new simulation can be used to understand how a broader range of buildings will respond.
“The scientific results of this massive simulation are very interesting, and its level of detail has allowed us to observe things that we were not able to see in the past,” said Kim Olsen, professor of geological sciences at SDSU and lead seismologist of the study.
However, given the massive number of calculations required, only the most advanced supercomputers are capable of producing such simulations in a reasonable time period. “This M8 simulation represents a milestone calculation, a breakthrough in seismology both in terms of computational size and scalability,” said Yifeng Cui, a computational scientist at SDSC. “It’s also the largest and most detailed simulation of a major earthquake ever performed in terms of floating point operations, and opens up new territory for earthquake science and engineering with the goal of reducing the potential for loss of life and property.”
Specifically, the M8 simulation is the largest ever in terms of the duration of the shaking modeled (six minutes) and the geographical area covered – a rectangular volume approximately 500 miles (810 km) long by 250 miles (405 km) wide by 50 miles (85 km) deep. The team’s latest research also set a new record for the number of computer processor cores used, with 223,074 cores sustaining a performance of 220 trillion calculations per second for 24 hours on the Jaguar Cray XT5 supercomputer at Oak Ridge National Laboratory (ORNL) in Tennessee.
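To give a rough sense of what those figures imply, the back-of-the-envelope sketch below (illustrative only, not numbers reported by the SCEC team; the 40-meter grid spacing is an assumption chosen for illustration) shows that sustaining 220 trillion calculations per second for 24 hours amounts to roughly 1.9 × 10^19 floating-point operations, and that a uniform mesh over the stated volume would contain hundreds of billions of grid points:

    # Back-of-the-envelope scale of the M8 run (illustrative only; the 40 m
    # grid spacing below is an assumed value, not a figure from this article).
    flops_sustained = 220e12            # 220 trillion calculations per second
    seconds = 24 * 3600                 # 24-hour run
    print(f"total operations: {flops_sustained * seconds:.1e}")       # ~1.9e+19

    dx = 40.0                           # assumed grid spacing in meters
    nx, ny, nz = 810e3 / dx, 405e3 / dx, 85e3 / dx   # 810 x 405 x 85 km domain
    print(f"grid points at {dx:.0f} m spacing: {nx * ny * nz:.1e}")   # ~4.4e+11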
“We have come a long way in just six years, doubling the seismic frequencies modeled by our simulations every two to three years, from 0.5 Hertz (or cycles per second) in the TeraShake simulations, to 1.0 Hertz in the ShakeOut simulations, and now to 2.0 Hertz in this latest project,” said Phil Maechling, SCEC’s associate director for Information Technology.
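Each doubling comes at a steep computational price: for a fixed region, resolving waves of twice the frequency requires roughly halving the grid spacing (eight times as many grid points) and halving the time step, so the cost grows approximately as the fourth power of the frequency. The sketch below illustrates that standard scaling argument; it is a generic estimate, not SCEC’s published cost model:

    # Generic cost scaling for 3-D seismic wave-propagation simulations:
    # doubling the maximum frequency halves the grid spacing (2**3 more points)
    # and roughly halves the time step (2x more steps), so cost ~ f**4.
    def relative_cost(f_hz, f_ref_hz=0.5):
        """Cost relative to a 0.5 Hz reference run (illustrative estimate)."""
        return (f_hz / f_ref_hz) ** 4

    for f in (0.5, 1.0, 2.0):           # TeraShake -> ShakeOut -> M8 frequencies
        print(f"{f:.1f} Hz: ~{relative_cost(f):.0f}x the 0.5 Hz cost")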
In terms of earthquake science, these simulations can be used to study how earthquake waves travel through structures in the Earth’s crust and to improve three-dimensional models of those structures.
“Based on our calculations, we are finding that deep sedimentary basins, such as those in the Los Angeles area, are getting larger shaking than is predicted by the standard methods,” Jordan said. “By improving the predictions, making them more realistic, we can help engineers make new buildings safer.” The simulations are also useful for developing better seismic hazard policies and for improving the scenarios used in emergency planning.
Note: This story has been adapted from a news release issued by the University of Southern California.