Study Uses Supercomputers to Advance Dynamic Earthquake Rupture Models

SDSC’s Comet Supports UC Riverside Study of San Andreas Fault System


Newswise — Multi-fault earthquakes can span fault systems of tens to hundreds of kilometers, with ruptures propagating from one segment to another. During the last decade, seismologists have observed several cases of this complicated type of earthquake rupture, and they are now relying on supercomputers to provide detailed models that reveal the fundamental physical processes at work during these events, which can have far-reaching effects.

“The main findings of our work concern the dynamic interactions of a postulated network of faults in the Brawley seismic zone in Southern California,” said Christodoulos Kyriakopoulos, a research geophysicist at the University of California Riverside, and lead author of a study published in the Journal of Geophysical Research: Solid Earth.

The study provides seismologists and geologists with a new understanding of a complex set of faults in a region whose earthquakes could affect the lives of millions of people in the U.S. and Mexico. Some of the findings point to the possibility of a multi-fault earthquake in Southern California, which could have dire consequences.

“Under the current parametrization and the current model assumptions, we found that a rupture on the Southern San Andreas Fault could propagate south of Bombay Beach, which is considered to be the southern end of the southern San Andreas Fault,” said Kyriakopoulos. “In this case, if a rupture actually propagates south of Bombay Beach, it could conceivably sever Interstate 8, which is considered to be a lifeline between eastern and western California, in the case of a large event.”

Kyriakopoulos said the study also found that a medium-sized earthquake nucleating on one of the cross faults (smaller faults intersecting the San Andreas Fault in the Brawley Seismic Zone) could actually trigger a major event on the San Andreas Fault, noting that this will be the topic of ongoing work.

Seismic-sized Supercomputer Simulations

A dynamic rupture model allows scientists to study the physics that occur during an earthquake. Supercomputers are able to simulate the interactions between different earthquake faults. For example, the models let researchers study how seismic waves travel from one fault to another, and influence the stability of other faults. Such models are useful to investigate large-scale earthquakes of the past, and perhaps more importantly, possible earthquake scenarios of the future, according to Kyriakopoulos.

Kyriakopoulos and his collaborators used physics-based dynamic rupture models that allowed them to simulate complex earthquake ruptures using supercomputers. “We were able to run dozens of numerical simulations, and documented a large number of interactions that we analyzed using advanced visualization software,” explained Kyriakopoulos.

The numerical model developed by Kyriakopoulos and his colleagues consists of two main components. The first is a finite element mesh that represents the complex network of faults in the Brawley seismic zone. “We can think of that as a discretized domain, or a discretized numerical world that becomes the base for our simulations,” he said.

“The second is a finite element dynamic rupture code known as FaultMod (Barall et al., 2009) that allows us to simulate the evolution of earthquake ruptures, seismic waves, and ground motion with time,” Kyriakopoulos said. “We can study their properties by varying the parameters of the simulated earthquakes. Basically, we generate a virtual world where we create different types of earthquakes. That helps us understand how earthquakes in the real world are happening.”
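FaultMod itself is a full finite element code, but the core behavior such dynamic rupture models capture can be sketched in a toy one-dimensional model: fault patches fail when their stress exceeds a static friction threshold, and each failure transfers part of the stress drop to neighboring patches, so a rupture either propagates along the fault or arrests. All names and parameter values below are illustrative placeholders, not taken from the study.

```python
# Toy 1-D dynamic rupture sketch: a line of fault patches with static and
# dynamic friction strengths. A patch ruptures when its stress reaches its
# static strength; the resulting stress drop is partly redistributed to its
# neighbors, which can trigger a propagating (or arresting) rupture front.
# All values are illustrative placeholders, not parameters from the study.

def run_rupture(initial_stress, static_strength, dynamic_strength, transfer=0.6):
    """Return the set of patch indices that ruptured."""
    n = len(initial_stress)
    stress = list(initial_stress)
    ruptured = set()
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if i not in ruptured and stress[i] >= static_strength[i]:
                ruptured.add(i)  # in this sketch, patches cannot re-fail
                changed = True
                drop = stress[i] - dynamic_strength[i]  # stress released
                stress[i] = dynamic_strength[i]
                # Redistribute part of the stress drop to adjacent patches.
                for j in (i - 1, i + 1):
                    if 0 <= j < n:
                        stress[j] += transfer * drop / 2.0
    return ruptured

# A strong "barrier" patch (index 5) decides whether a rupture nucleating
# at patch 0 crosses to the far half of the fault.
n = 10
initial = [6.0] + [4.0] * (n - 1)               # nucleation patch near failure
uniform = [5.0] * n                              # uniform strength
strong_barrier = [5.0] * 5 + [9.0] + [5.0] * 4   # one tough patch in the middle
dynamic = [1.0] * n

print(sorted(run_rupture(initial, uniform, dynamic)))         # all 10 patches fail
print(sorted(run_rupture(initial, strong_barrier, dynamic)))  # rupture arrests at patch 5
```

In this sketch, strengthening a single patch creates a barrier that stops the rupture front, a crude analogue of the fault-interaction question the study examines with full 3-D physics.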

Kyriakopoulos and his collaborators faced three main challenges in generating supercomputer simulations of realistic earthquakes. “The first challenge was the implementation of these faults in the finite element domain, in the numerical model. In particular, this system of faults consists of an interconnected network of larger and smaller segments that intersect each other at different angles. It’s a very complicated problem,” he said.

The second challenge was to run dozens of large computational simulations and explore as much of the parameter space as possible. The third challenge was to properly visualize the 3-D simulation results, which in their raw form are simply huge arrays of numbers. The team did that by generating photorealistic rupture visualizations using the freely available software ParaView.
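One common way to get such raw arrays into ParaView is to write them out in the plain-text legacy VTK format, which ParaView opens directly. A minimal sketch is below; the "slip" field here is synthetic, purely to illustrate the format, and is not data from the study.

```python
# Write a small 2-D scalar field (e.g., final slip on a fault plane) as a
# legacy-format ASCII VTK file that ParaView can open directly.
# The data are synthetic, generated only to illustrate the file format.

def write_vtk_scalar(path, field, name="slip"):
    """field: list of rows (all the same length) of float values."""
    ny, nx = len(field), len(field[0])
    lines = [
        "# vtk DataFile Version 3.0",
        "synthetic fault slip field",
        "ASCII",
        "DATASET STRUCTURED_POINTS",
        f"DIMENSIONS {nx} {ny} 1",
        "ORIGIN 0 0 0",
        "SPACING 1 1 1",
        f"POINT_DATA {nx * ny}",
        f"SCALARS {name} float 1",
        "LOOKUP_TABLE default",
    ]
    for row in field:
        lines.append(" ".join(f"{v:.4f}" for v in row))
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# Synthetic slip distribution peaking at the center of a 20 x 10 fault plane.
field = [[max(0.0, 1.0 - ((x - 10) ** 2 + (y - 5) ** 2) / 50.0)
          for x in range(20)] for y in range(10)]
write_vtk_scalar("slip.vtk", field)
```

Opening `slip.vtk` in ParaView shows the field as a colored plane, which can then be rendered, contoured, or animated over a sequence of time steps.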

To overcome these challenges, Kyriakopoulos and his team used SDSC’s petascale Comet supercomputer, as well as the Stampede system at the Texas Advanced Computing Center at the University of Texas at Austin. Those systems are funded by the National Science Foundation.

In looking at the larger scientific context, Kyriakopoulos said this research has contributed to a better understanding of multi-fault ruptures, which could lead to better assessments of earthquake hazards. “If we know how faults interact during earthquake ruptures, we can be better prepared for future large earthquakes – in particular how several fault segments could interact during an earthquake to enhance or interrupt major ruptures,” he said.

Funding for this research was provided by the U.S. Geological Survey (G12AC20038), the NSF (EAR‐1033462, EAR‐1114446, EAR‐1135455), and the Southern California Earthquake Center (8889). Study co-authors are Christodoulos Kyriakopoulos and David Oglesby of UC Riverside; Thomas Rockwell of San Diego State University; Aron Meltzner of Nanyang Technological University, Singapore; Michael Barall of Invisible Software, Inc.; John Fletcher of the Centro de Investigacion Cientifica y de Educacion Superior de Ensenada (Mexico); and Drew Tulanowski of Rutgers University.

About SDSC

As an Organized Research Unit of UC San Diego, SDSC is considered a leader in data-intensive computing and cyberinfrastructure, providing resources, services, and expertise to the national research community, including industry and academia. Cyberinfrastructure refers to an accessible, integrated network of computer-based resources and expertise, focused on accelerating scientific inquiry and discovery. SDSC supports hundreds of multidisciplinary programs spanning a wide variety of domains, from earth sciences and biology to astrophysics, bioinformatics, and health IT. SDSC’s petascale Comet supercomputer is a key resource within the National Science Foundation’s XSEDE (eXtreme Science and Engineering Discovery Environment) program.
