Supercomputer creates over 700,000 years of simulated earthquakes

Source(s): Temblor

By Lauren Milideo

Researchers cannot predict exactly when an earthquake will strike, but new research that harnesses the power of supercomputers accounts for the specific characteristics of Southern California’s faults, helping seismologists better understand what hazards the region might face.

Rare events are hard to forecast

Large earthquakes are infrequent, and we simply haven’t seen such quakes on most California faults, says Kevin Milner, a computer scientist at the Southern California Earthquake Center and lead author on the new study. The fact that most faults in California have not hosted a large, damaging earthquake since modern records have been kept, says Milner, leaves researchers “to infer what types of earthquakes we think are possible on those faults.” This uncertainty creates challenges for hazard assessment and planning.
 

[Figure: Map of California with red lines depicting fault locations. Caption: California’s faults have been extensively mapped.]

Traditional hazard assessment is empirically based, Milner says. This means that what scientists know about earthquakes comes from what can be observed and extrapolated from data from past events. But, Milner says, empirical models rely on data from seismically active zones around the world. They aren’t location specific and may therefore overestimate or underestimate an area’s hazard due to variables specific to its faults and geology. The researchers note some past studies used combinations of empirical and physics-based models — those that instead rely on an understanding of physical processes — and consider both region-specific information and general data. Milner and colleagues took a new approach, he says: they used solely physics-based methods throughout their model.
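To make the distinction concrete: an empirical ground-motion model is, at heart, a regression fit to recordings of past earthquakes from around the world, taking a magnitude and a distance and returning an estimate of shaking. The Python sketch below is purely illustrative, with invented coefficients and a hypothetical function name; it is not the model used in the study or any published relation.

```python
import math

def empirical_ground_motion(magnitude, distance_km):
    """Toy empirical ground-motion relation (illustrative only).

    Real empirical models are regressions fit to thousands of recordings from
    seismically active regions worldwide; the coefficients below are invented
    simply to show the general form: shaking increases with magnitude and
    decays with distance from the rupture.
    """
    a, b, c = -3.5, 0.9, 1.3                      # made-up regression coefficients
    ln_pga = a + b * magnitude - c * math.log(distance_km + 10.0)
    return math.exp(ln_pga)                       # toy peak ground acceleration, in g

# Example: estimated shaking from a magnitude 7.0 quake at 20 km
print(f"Estimated PGA: {empirical_ground_motion(7.0, 20.0):.3f} g")
```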

Supercomputers

These calculations required tremendous computing power, and the team turned to two of the world’s largest supercomputers to get them done. The first step — creating 714,516 years of simulated earthquakes — took about four days to run on over 3,500 processors within Frontera, at the Texas Advanced Computing Center, says Milner. The second step — simulating the ground motions resulting from all those earthquakes — ran on Summit, located at the Department of Energy’s Oak Ridge National Laboratory, and took a similar amount of time, Milner says.
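The sketch below mimics only the shape of that two-step workflow, a long synthetic catalog followed by a per-event shaking estimate, under loudly simplified assumptions: magnitudes drawn from a Gutenberg-Richter distribution and a one-line stand-in for the ground-motion step. All function names, rates, and coefficients are invented and bear no relation to the study’s fault simulator or wave-propagation codes.

```python
import math
import random

def build_synthetic_catalog(years, annual_rate=0.2, b_value=1.0, m_min=6.0):
    """Step 1 (toy stand-in): generate a long synthetic earthquake catalog.

    The study's first step produced 714,516 years of simulated earthquakes with
    a physics-based fault simulator running on Frontera. Here, magnitudes are
    simply drawn from a Gutenberg-Richter distribution, which gives the catalog
    a realistic mix of many moderate quakes and rare large ones.
    """
    catalog = []
    for year in range(years):
        if random.random() < annual_rate:                 # at most one quake per year (toy)
            magnitude = m_min - math.log10(1.0 - random.random()) / b_value
            distance_km = random.uniform(5.0, 100.0)      # invented source-to-site distance
            catalog.append((year, magnitude, distance_km))
    return catalog

def simulate_ground_motions(catalog):
    """Step 2 (toy stand-in): estimate shaking at one site for every event.

    The study's second step ran full physics-based wave-propagation simulations
    on Summit; this sketch substitutes a one-line attenuation formula with
    made-up coefficients.
    """
    return [math.exp(-3.5 + 0.9 * m - 1.3 * math.log(r + 10.0))
            for _, m, r in catalog]

motions = simulate_ground_motions(build_synthetic_catalog(years=10_000))
print(f"{len(motions)} events; strongest simulated shaking: {max(motions):.3f} g (toy units)")
```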

The researchers did not reach specific conclusions regarding changes to hazard plans, Milner says, citing the need for further research. The study does show that, using a physics-based approach, not only can researchers create simulated quakes, but they can use these quakes to model the associated ground motions that inform hazard planning. Their results are consistent with empirical methods, suggesting that the new model is yielding valid results, Milner says.
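To give a sense of what checking consistency with empirical methods involves, hazard work typically reduces a long catalog of simulated ground motions to exceedance rates: how often per year, on average, shaking at a site tops a given level. The toy function below illustrates only that bookkeeping step; it is a sketch under stated assumptions, not the study’s actual comparison.

```python
def exceedance_rate(motions, catalog_years, threshold):
    """Annual rate at which simulated shaking exceeds `threshold` (toy version).

    Comparing rates like these, computed from a physics-based catalog, against
    the rates an empirical model implies is the kind of consistency check the
    study describes; this sketch just counts exceedances in a list of values.
    """
    return sum(1 for pga in motions if pga > threshold) / catalog_years

# Made-up shaking values (in g) standing in for a long simulated catalog
toy_motions = [0.05, 0.12, 0.31, 0.08, 0.44, 0.02, 0.19]
for level in (0.1, 0.2, 0.4):
    rate = exceedance_rate(toy_motions, 10_000, level)
    print(f"Shaking above {level} g: {rate:.6f} times per year (toy numbers)")
```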

“The fact that we can actually even be speaking to the ground motions now is a whole new terrain to be playing on, and it’s pretty exciting,” says study coauthor Bruce Shaw, an earthquake scientist at Columbia University.

The study’s novelty lies in part in bringing together “two methods that were previously not combined,” says Alice Gabriel, professor of earthquake physics and geophysics at Munich University, who was not involved with the research.

The team is “doing something really on the furthest edge that not just they, but we, can go to, as a computational seismology community,” says postdoc Marta Pienkowska of ETH Zurich’s Department of Earth Sciences, who was not involved in the research.

An important step

The research team acknowledges that far more work is needed before this research can begin informing or changing hazard assessment. “This was an important step, a proof of concept showing that this type of model can work [and that it] can produce ground motions that are consistent with our best empirical models,” says Milner, “and now it’s time to really dig in and vet it and build in more of our uncertainties.” These uncertainties include fault geometries, which are not well-defined far below the surface, says Milner.

Comparing the ground-motion results from physics-based and empirical models allows scientists to see where hazard estimates might need to change, to accommodate either more or less potential hazard at various locations, says Shaw. “It’s a tool to start exploring these questions in a way that can help us be more efficient in how we use our finite precious resources,” he says.

The research shows “that such large-scale modelling could contribute to seismic hazard assessment,” says Pienkowska.

Shaw says this research may be useful in other places like New Zealand, where a shallow subduction zone affects surrounding faults – a situation not reflected in the current array of empirically based ground motion models, and therefore perhaps not accurately predicted by them. Well-studied earthquake-prone regions such as Italy and Iceland might also benefit from this type of physics-based seismic modeling, as would developing countries and other locations where data are lacking and current empirical models may not apply very well, says Gabriel.

“It’s really cool to see geoscientists … use these big machines to advance earthquake preparedness,” says Gabriel.


