-- By Jessica Scully

Researchers at Berkeley Lab are using high-performance computing systems to better predict how structures will respond to an earthquake along one of the Bay Area’s most dangerous faults.

David McCallen is the principal investigator of a team of Berkeley Lab and Lawrence Livermore National Laboratory researchers working with these systems to model a 7.0-magnitude earthquake along the Hayward Fault under different rupture scenarios. A major focus of the project is preparing to take full advantage of the Department of Energy’s emerging exascale computing systems, expected to be available in 2022 or 2023. These systems will be capable of on the order of 50 times more scientific throughput than current high-performance computing systems, allowing higher-fidelity simulations and making it dramatically quicker to model different scenarios. The team’s goal is to learn how these different earthquake scenarios would affect structures throughout the San Francisco Bay Area.

What is this new capability? 

Historically, the only way that scientists and engineers could try to predict future ground motions was to look to past earthquake records and then extrapolate those to the conditions at a structure. This approach doesn’t always work well because ground motions are very site-specific. This is the first time that we can use what we refer to as physics-based models to predict regional-scale ground motions and the variability of those ground motions. It’s really a three-step process where we model all the steps: rupture, propagation through the Earth, and interaction between the waves and the structure at the site.
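To make that three-step idea concrete, here is a deliberately toy sketch in Python. The function names, the slip and attenuation formulas, and the site coordinates are all hypothetical placeholders for illustration only; they are not the team’s actual simulation codes, which run on supercomputers at vastly larger scale.

```python
# Illustrative sketch of the rupture -> wave propagation -> structural response
# pipeline. Every model here is a crude placeholder, not real seismology.
import numpy as np

def simulate_rupture(magnitude, epicenter, rng):
    """Stage 1: generate a kinematic rupture (slip over a coarse fault grid)."""
    slip = rng.lognormal(mean=0.0, sigma=0.5, size=(10, 20))  # toy slip field (m)
    return {"magnitude": magnitude, "epicenter": epicenter, "slip": slip}

def propagate_waves(rupture, site_coords_km):
    """Stage 2: carry the motion through the Earth to each site (toy attenuation)."""
    dist = np.linalg.norm(site_coords_km - np.asarray(rupture["epicenter"]), axis=1)
    return rupture["slip"].mean() / (1.0 + dist)  # stand-in for peak ground motion

def structural_response(ground_motion, building_period_s=0.5):
    """Stage 3: estimate the demand the incoming waves place on a structure."""
    return ground_motion * (1.0 + 1.0 / building_period_s)  # crude period scaling

rng = np.random.default_rng(0)
sites = np.array([[5.0, 2.0], [20.0, 8.0]])  # hypothetical site locations (km)
rupture = simulate_rupture(7.0, epicenter=(0.0, 0.0), rng=rng)
motions = propagate_waves(rupture, sites)
print(structural_response(motions))
```

The point of the sketch is the chaining: the output of each stage is the input to the next, which is why the team models all three steps end to end rather than extrapolating from past records.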

How could the technology be used practically?

There’s a lot of uncertainty in predicting future earthquake motions and the shaking that particular facilities would be subjected to. And you really need to understand those motions, because if you understand the input to a structure, you can then model the structural response and understand the potential for damage.

What have you achieved so far?

Because of computer limitations, we could previously resolve ground motions to only about one or two hertz: ground motions that vibrate back and forth about one or two times per second. To do accurate engineering evaluations, we need to get all the way up to eight to 10 hertz. We’ve been able to do five- to 10-hertz simulations with the highest-speed computers now, but those take a long time, on the order of 20 to 30 hours.
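A back-of-the-envelope calculation, not from the interview, helps show why pushing the frequency up is so expensive. For explicit 3D wave solvers of the kind commonly used in regional simulations, the grid spacing must shrink in proportion to the shortest wavelength being resolved, so the number of grid points grows roughly as the cube of the maximum frequency and the number of time steps grows roughly linearly with it:

```python
# Rough rule of thumb (an assumption for illustration, not a statement about the
# team's specific code): cost of an explicit 3D wave simulation ~ f_max**4,
# from f**3 more grid points and f more time steps.

def relative_cost(f_new_hz, f_old_hz):
    return (f_new_hz / f_old_hz) ** 4

print(relative_cost(10, 2))  # ~625x more work going from ~2 Hz to ~10 Hz resolution
```

Under that scaling, moving from the one-to-two-hertz simulations of the past to the eight-to-10-hertz resolution engineers need implies hundreds of times more computation, which is why exascale-class machines matter.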

What will exascale computing enable?

We’re looking forward to getting a lot of speedup with these new advanced machines so that we can resolve very high frequencies but do it in maybe three to five hours. We need that speed because we have to run a lot of simulations to account for the uncertainty and variability in earthquake parameters. An earthquake on the Hayward Fault is overdue. We don’t know precisely how the fault will rupture, so we have to look at different rupture scenarios to fully understand the potential risk.
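The need for many simulations can be illustrated with a small Monte Carlo sketch. The parameter choices (hypocenter position along the fault, average slip) and the one-line simulate() stand-in below are hypothetical; each sample is a stand-in for one full physics-based run of the kind described above.

```python
# Hypothetical sketch: sample uncertain rupture parameters and look at the
# spread in a site's predicted ground motion. simulate() is a placeholder for a
# complete rupture-to-structure simulation.
import random
import statistics

def simulate(hypocenter_km, avg_slip_m, site_km=30.0):
    distance = abs(site_km - hypocenter_km)
    return avg_slip_m / (1.0 + 0.1 * distance)  # toy attenuation with distance

random.seed(0)
results = []
for _ in range(1000):                       # one rupture scenario per iteration
    hypocenter = random.uniform(0.0, 70.0)  # rupture start along the fault (km)
    slip = random.lognormvariate(0.0, 0.4)  # average slip (m)
    results.append(simulate(hypocenter, slip))

print(statistics.mean(results), statistics.stdev(results))
```

The spread across scenarios is the quantity of interest: only by running the full simulation many times, with each run taking hours rather than days, can the team characterize the range of shaking the Bay Area might face.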

# # # 

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.