Newswise — This year’s week-long “Summer Institute” workshop held by the San Diego Supercomputer Center (SDSC) at the University of California San Diego will focus on a wide range of introductory-to-intermediate topics in high-performance computing (HPC) and data science for researchers in academia and industry, especially those in domains that have not traditionally used HPC resources.
This year’s workshop continues SDSC’s strategy of bringing high-performance computing to what is known as the ‘long tail’ of science, i.e. providing resources to a larger, more diverse set of modest-sized computational research projects that, in aggregate, represent a tremendous amount of scientific research and discovery. SDSC has developed and hosted Summer Institute workshops for well over a decade.
“This program is also intended to assist researchers who have science challenges that cannot typically be solved using local computing resources,” said Andrea Zonca, an SDSC senior computational scientist and director of the Summer Institute, as well as an instructor in several courses on computing frameworks used to process large amounts of data distributed across thousands of computer nodes.
Added Zonca: “The goal of this year’s program is to give attendees an overview of topics in high-performance computing and data science, while accelerating their learning process through interactive classes with hands-on tutorials on SDSC’s Comet supercomputer.”
Highlights of SDSC Summer Institute 2017 include:
• A user’s guide to Comet, including interacting with its job scheduler, understanding the strengths and weaknesses of the available file systems, and using Singularity containers to run another operating system such as Ubuntu.
• Traditional supercomputing topics such as parallel programming with MPI/OpenMP, performance optimization, code profiling, GPU programming with CUDA, and scientific visualization.
• Current data science topics such as machine learning, ‘big data’ processing with Spark, and parallel programming with Python.
• Topics related to reproducibility, such as basic and advanced version control with Git/GitHub and workflow management with Kepler.
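To give a flavor of the ‘parallel programming with Python’ topic above, the sketch below distributes work across processes using only Python’s standard-library multiprocessing module. It is illustrative only; the actual course material may use other tools (such as MPI bindings or Spark), and the function and worker count here are arbitrary choices.

```python
# Minimal parallel-Python sketch: farm a CPU-bound function out to
# a pool of worker processes using the standard library.
from multiprocessing import Pool

def square(x):
    """Example CPU-bound work unit run inside each worker process."""
    return x * x

if __name__ == "__main__":
    # Create 4 worker processes and map the inputs across them in parallel.
    with Pool(processes=4) as pool:
        results = pool.map(square, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

On a supercomputer such as Comet, the same map-over-workers pattern scales up by requesting more cores from the job scheduler rather than changing the program’s structure.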
Full details of this year’s program, including an agenda and instructor profiles, can be found here. Attendees will have several opportunities throughout the week to meet individually with SDSC HPC and data science experts to discuss the best techniques to solve their specific scientific challenges.
The application period for SDSC Summer Institute 2017 is now open. To benefit from the classes, attendees are required to be familiar with the UNIX/Linux shell; basic programming skills (in any programming language) are strongly recommended. Applications will be accepted through Friday, May 5, 2017, and applicants will be notified of their status by Friday, May 19. Accepted applicants are asked to confirm attendance by registering no later than Friday, June 9.
About SDSC
As an Organized Research Unit of UC San Diego, SDSC is considered a leader in data-intensive computing and cyberinfrastructure, providing resources, services, and expertise to the national research community, including industry and academia. Cyberinfrastructure refers to an accessible, integrated network of computer-based resources and expertise, focused on accelerating scientific inquiry and discovery. SDSC supports hundreds of multidisciplinary programs spanning a wide variety of domains, from earth sciences and biology to astrophysics, bioinformatics, and health IT. SDSC’s petascale Comet supercomputer continues to be a key resource within the National Science Foundation’s XSEDE (Extreme Science and Engineering Discovery Environment) program.