Apply by March 1 for an opportunity to learn the tools and techniques needed to carry out research on the world’s most powerful supercomputers.

The architecture and software environments of today’s most powerful supercomputers are complex, posing significant challenges to researchers interested in using them to advance scientific discoveries. To meet these challenges and facilitate breakthrough science and engineering on these amazing resources, the annual Argonne Training Program on Extreme-Scale Computing (ATPESC) — hosted by the U.S. Department of Energy’s (DOE) Argonne National Laboratory — provides specialized, in-depth training to doctoral students, postdocs and computational scientists.

Applications are now being accepted for ATPESC 2021 — an intensive two-week training program that teaches participants the key skills, approaches and tools they will need to design, implement and execute computational science and engineering (CSE) applications on current leadership-class supercomputers and next-generation exascale machines. Launched in 2013, ATPESC has hosted more than 500 participants.

The program, which will be held this year from Aug. 1-13, is designed to address gaps in the training that computational scientists typically receive through formal education or other shorter courses.

“Many researchers start out with fairly ad hoc exposure to solving problems computationally. The ATPESC curriculum covers a broad set of topics, filling in gaps and providing a solid basis to support their future work,” said Ray Loy, ATPESC program director and lead for training, debugging and math libraries at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility.

Added ATPESC lecturer William Gropp, director and chief scientist at the University of Illinois at Urbana-Champaign’s National Center for Supercomputing Applications, “ATPESC provides an intense, broad and deep introduction to many aspects of high-performance computing (HPC), not just one or two. Having direct access to some of the leaders in the field is great for asking questions, getting opinions and talking about careers. The participants also benefit from being part of the ATPESC community with their fellow students.”

More than 70 participants will benefit from lectures and hands-on training sessions provided by renowned computer scientists and HPC experts from U.S. national laboratories, universities and industry. The core curriculum will address:

  • Computer architectures and predicted evolution.
  • Numerical algorithms and mathematical software.
  • Approaches to building community codes for HPC systems.
  • Data analysis, visualization, I/O, and methodologies and tools for big data applications.
  • Performance measurement and debugging tools.
  • Machine learning and data science.

Previous attendees have benefited immensely from the program. Aleksandra Pachalieva, a graduate research assistant at Los Alamos National Laboratory who attended ATPESC in 2020, remarked, “Aside from technical aspects of exascale computing, I learned a lot about software productivity, quality and sustainability. The systems and tools that we are using are changing constantly, but having good software practices is a key concept that will ensure the success of the software we develop as scientists.”

Another 2020 participant, Kevin Green, a research scientist in the department of computer science at the University of Saskatchewan in Canada, said, “I’ll be using a lot of the ideas I’ve seen here to update material in our high-performance computing courses. I’ve also gotten a good feel for how we can migrate our current code designs to designs that will perform well across different supercomputing architectures.”

How to Apply

The call for applications is now open, and interested doctoral students, postdocs and computational scientists are encouraged to submit an application by March 1.

Qualified applicants will have:

  • Substantial experience in MPI and/or OpenMP programming.
  • Experience in applying at least one HPC system to address a complex problem.
  • Plans to conduct CSE research on large-scale computers.

There are no fees to participate, and domestic airfare, meals and lodging are provided to selected applicants. The event is currently planned to be held in person, as usual; if meeting restrictions prevent that, ATPESC will be held virtually.

ATPESC is organized by the ALCF and funded by the Exascale Computing Project (ECP), a collaborative effort of the DOE Office of Science’s Advanced Scientific Computing Research Program and the National Nuclear Security Administration. The training program is structured to align with the ECP’s mission to develop a capable computing ecosystem for future exascale supercomputers, including Aurora at Argonne and Frontier at Oak Ridge National Laboratory.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.