For two decades, physicists have been trying to reconcile a gap between theoretical and experimental data on a particle called the muon. A new study, powered by Argonne's supercomputer Mira, sharpens one piece of the puzzle.
Two decades ago, an experiment at Brookhaven National Laboratory pinpointed a mysterious mismatch between established particle physics theory and actual lab measurements. A multi-institutional research team (including Brookhaven, Columbia University, RIKEN, and the universities of Connecticut, Nagoya and Regensburg) has used Argonne National Laboratory’s Mira supercomputer to help narrow down the possible explanations for the discrepancy, delivering a newly precise theoretical calculation that refines one piece of this very complex puzzle.
University of Alabama in Huntsville (UAH) professor of biological science Dr. Jerome Baudry is collaborating with Hewlett Packard Enterprise (HPE) to use HPE’s Cray Sentinel supercomputer to search for natural products that are effective against the COVID-19 virus.
Quantum machine learning, an emerging field that combines machine learning and quantum physics, is the focus of research to discover possible treatments for COVID-19, according to Penn State researchers led by Swaroop Ghosh, the Joseph R. and Janice M. Monkowski Career Development Assistant Professor of Electrical Engineering and Computer Science and Engineering. The researchers believe that this method could be faster and more economical than the current methods used for drug discovery.
For an experiment that will generate big data at unprecedented rates, physicists led the design, development, mass production and delivery of an upgrade featuring novel particle detectors and state-of-the-art electronics.
Scientists and engineers at Fermilab and Brookhaven are uniting with other organizations in the Open Science Grid to help fight COVID-19 by dedicating considerable computational power to researchers studying how they can help combat the virus-borne disease.
An ORNL team developed the XACC software framework to help researchers harness the potential power of quantum processing units, or QPUs. XACC offloads portions of quantum-classical computing workloads from the host CPU to an attached quantum accelerator, which calculates results and sends them back to the original system.
This is a continuing profile series on the directors of the Department of Energy (DOE) Office of Science User Facilities. Michael E. Papka is the director of the Argonne Leadership Computing Facility.
To assist in the COVID-19 research effort, Lawrence Livermore National Laboratory, Penguin Computing and AMD have reached an agreement to upgrade the Lab’s unclassified, Penguin Computing-built Corona high performance computing (HPC) cluster with an in-kind contribution of cutting-edge AMD Instinct™ accelerators, expected to nearly double the peak performance of the machine.
Profiled is Mitch Allmond of Oak Ridge National Laboratory, who conducts experiments and uses theoretical models to advance our understanding of the structure of atomic nuclei.
University of California San Diego researchers have ported the popular UniFrac microbiome tool to graphics processing units (GPUs) in a bid to increase the speed and accuracy of scientific discovery, including urgently needed COVID-19 research.
The INCITE program is now seeking proposals for high-impact, computationally intensive research projects that require the power and scale of DOE’s leadership-class supercomputers.
In the race to identify solutions to the COVID-19 pandemic, researchers at the Department of Energy’s Oak Ridge National Laboratory are joining the fight by applying expertise in computational science, advanced manufacturing, data science and neutron science.
Researchers at UC San Diego recently created a pharmacophore model and conducted data mining of the conformational database of FDA-approved drugs, identifying 64 compounds as potential inhibitors of the COVID-19 protease. Among the selected compounds are two HIV protease inhibitors, two hepatitis C protease inhibitors, and three drugs that have already shown positive results in testing with COVID-19.
A new data-driven mathematical model of the coronavirus pandemic predicts that the United States will peak in the number of “active” COVID-19 cases on or around April 20, marking a critical milestone on the demand for medical resources.
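The story above does not describe the authors' model in detail, but the general idea behind such forecasts can be sketched with a minimal SIR-style compartmental model, in which active cases rise, peak, and decline as the susceptible pool is depleted. The parameters below are illustrative assumptions, not values fitted to U.S. data.

```python
# Minimal SIR sketch (illustrative only, not the study's model):
# track susceptible (s), infected/"active" (i), recovered (r) fractions
# and step the dynamics forward one day at a time.
def sir(beta=0.3, gamma=0.1, days=200, n=1.0, i0=1e-4):
    s, i, r = n - i0, i0, 0.0
    active = []
    for _ in range(days):
        new_inf = beta * s * i / n   # new infections this day
        new_rec = gamma * i          # recoveries this day
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        active.append(i)
    return active

curve = sir()
peak_day = max(range(len(curve)), key=curve.__getitem__)
print(f"active cases peak on day {peak_day}")
```

Fitting such a model to reported case counts is what makes it "data-driven": the transmission and recovery rates are estimated from observed data, and the fitted curve is then extrapolated to locate the peak.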
Los Alamos National Laboratory’s Efficient Mission Centric Computing Consortium (EMC3) recently welcomed its first international partner, the South African National Integrated Cyberinfrastructure System (NICIS).
Research is underway at The University of Texas at El Paso’s School of Pharmacy to develop vaccines and antiviral drugs to combat the novel coronavirus within 15 months to two years.
The San Diego Supercomputer Center at the University of California San Diego is providing priority access to its high-performance computer systems and other resources to researchers advancing our understanding of the virus and efforts to develop an effective vaccine in as short a time as possible.
Researchers at San Diego State University and the Polytechnic University of Turin in Italy used supercomputer simulations to study how ocean wave energy converters can harness energy and turn it into electricity, offering the potential to reduce our reliance on fossil fuels.
A team of Stony Brook University (SBU) researchers is working on computer models that could help speed the discovery of drugs to combat the novel coronavirus responsible for COVID-19. They are doing this work in collaboration with scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory and Argonne National Laboratory, and will be leveraging those laboratories’ computational resources and expertise.
Argonne National Laboratory’s Illinois Mathematics and Science Academy (IMSA) High School Internship Program has this year’s exceptionally bright high school students working on the Deep Underground Neutrino Experiment (DUNE)’s world-changing research.
With the world’s most powerful path-to-exascale supercomputing resources at their disposal, William Tang and colleagues are combining computer muscle and AI to eliminate disruption of fusion reactions in the production of sustainable clean energy.
With an estimated 1.7 million new cases and 600,000 deaths during 2017 in the U.S. alone, cancer remains a critical healthcare challenge. Researchers used the Comet supercomputer at the San Diego Supercomputer Center (SDSC) to evaluate their new molecular docking tool which aims to improve immunotherapy outcomes by identifying more effective personalized treatments.
Lawrence Livermore National Laboratory scientists are contributing to the global fight against COVID-19 by combining artificial intelligence/machine learning, bioinformatics and supercomputing to help discover candidates for new antibodies and pharmaceutical drugs to combat the disease.
A professor in the Department of Biological Sciences at The University of Alabama in Huntsville (UAH) is part of an effort led by Oak Ridge National Laboratory (ORNL) in Tennessee that applies the power of supercomputers to screen compounds for effectiveness against the pandemic COVID-19 virus.
The U.S. Department of Energy (DOE) announced a plan to provide $60 million to establish multidisciplinary teams to develop new tools and techniques to harness supercomputers for scientific discovery.
The Department of Energy has a vital role to play in the national response to COVID-19. Researchers have already used tools at national laboratories to make major inroads to analyzing the virus and its spread.
Researchers at the Department of Energy’s Oak Ridge National Laboratory have used Summit, the world’s most powerful and smartest supercomputer, to identify 77 small-molecule drug compounds that might warrant further study in the fight against the SARS-CoV-2 coronavirus, which is responsible for the COVID-19 disease outbreak.
Lawrence Livermore National Laboratory (LLNL), Hewlett Packard Enterprise (HPE) and Advanced Micro Devices, Inc. (AMD) today announced the selection of AMD as the node supplier for El Capitan, projected to be the world’s most powerful supercomputer when it is fully deployed in 2023.
Researchers at the University of Rhode Island (URI) used San Diego Supercomputer Center’s (SDSC) Comet supercomputer to show that high-performance computer modeling can accurately simulate tsunamis from volcanic events. Such models could lead to early-warning systems that could save lives and help minimize catastrophic property damage.
Valentino Cooper of Oak Ridge National Laboratory uses theory, modeling and computation to improve fundamental understanding of advanced materials for next-generation energy and information technologies.
Lawrence Berkeley National Laboratory's decades of leadership in designing & enhancing energy-efficient data centers is being applied to NERSC supercomputing resources through a collaboration that's using operational data analytics to optimize cooling systems & save electricity.
To better leverage cancer data for research, scientists at ORNL are developing an artificial intelligence (AI)-based natural language processing tool to improve information extraction from textual pathology reports. In a first for cancer pathology reports, the team developed a multitask convolutional neural network (CNN)—a deep learning model that learns to perform tasks, such as identifying key words in a body of text, by processing language as a two-dimensional numerical dataset.
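The key representational idea mentioned above, treating a passage of text as a two-dimensional numerical dataset that a convolutional filter can slide over, can be sketched in a few lines. The vocabulary, embeddings, and filter weights below are toy values for illustration, not ORNL's model.

```python
import numpy as np

# Toy setup (illustrative, not the ORNL CNN): map each word to a small
# embedding vector, stack the vectors into a (words x dims) matrix, then
# slide a bigram filter over word windows, as a text CNN does.
rng = np.random.default_rng(0)
vocab = {"tumor": 0, "grade": 1, "three": 2, "margin": 3, "clear": 4}
embed = rng.normal(size=(len(vocab), 4))      # 4-dim word embeddings

def text_to_matrix(words):
    """Stack word embeddings into a 2D (num_words x dim) matrix."""
    return np.stack([embed[vocab[w]] for w in words])

def conv_feature(mat, filt):
    """Slide a (window x dim) filter over word windows; ReLU + max-pool."""
    window = filt.shape[0]
    scores = [np.sum(mat[i:i + window] * filt)
              for i in range(mat.shape[0] - window + 1)]
    return max(0.0, max(scores))

mat = text_to_matrix(["tumor", "grade", "three"])
filt = rng.normal(size=(2, 4))                # one bigram filter
print(mat.shape, conv_feature(mat, filt))
```

In a multitask CNN, many such filters feed shared layers whose outputs branch into several task-specific heads (for example, tumor site and grade), so one pass over a report serves all extraction tasks at once.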
Researchers at the San Diego Supercomputer Center (SDSC) and the Wisconsin IceCube Particle Astrophysics Center (WIPAC) have conducted a second experiment that marshalled globally available-for-sale GPUs (graphics processing units), proving it is possible to elastically burst to very large scales of GPUs using the cloud, even in this pre-exascale era of computing.
The Protein Data Bank archive, which contains more than 160,000 3D structures for proteins, DNA, and RNA, this month released a new coronavirus protease structure following the recent coronavirus outbreak, an ongoing viral epidemic primarily affecting mainland China that now threatens to spread to populations in other parts of the world.
Rice University engineers have created a deep learning computer system that taught itself to accurately predict extreme weather events, like heat waves, up to five days in advance using minimal information about current weather conditions.
Researchers at Oregon State University have been using the Comet supercomputer at the San Diego Supercomputer Center to test an algorithm that they believe will reduce errors in the widely used three-day forecasts for water temperature, salinity levels, sea heights, and currents off the coasts of Oregon and Washington.
An international team of researchers has discovered the hydrogen atoms in a metal hydride material are much more tightly spaced than had been predicted for decades—a feature that could possibly facilitate superconductivity at or near room temperature and pressure. The scientists conducted neutron scattering experiments at the Department of Energy’s Oak Ridge National Laboratory on samples of zirconium vanadium hydride.
Using supercomputer simulations and a large dataset of materials, scientists found a connection between distortions in the material’s atomic structure and the amount of energy required to separate a proton from the material.
Just over a year after Los Alamos National Laboratory launched the Efficient Mission Centric Computing Consortium (EMC3), 15 companies, universities and federal organizations are now working together to explore new ways to make extreme-scale computers more efficient.
Candace Culhane, a program/project director in Los Alamos National Laboratory’s Directorate for Simulation and Computation, has been selected as the general chair for the 2022 SC Conference (SC22).
A new study published late last year in the Monthly Notices of the Royal Astronomical Society explored the molecular gas within and surrounding the intracluster medium, which fills the space between galaxies in a galaxy cluster.
A team of quantum researchers from ORNL have conducted a series of experiments to gain a better understanding of quantum mechanics and pursue advances in quantum networking and quantum computing, which could lead to practical applications in cybersecurity and other areas.
San Diego Supercomputer Center (SDSC) Research Scientist Igor Tsigelny of UC San Diego, in collaboration with colleagues from Sweden’s Karolinska Institute and the Pasteur Institute in France, released a study focused on improving the prognosis for glioblastoma patients.
A comprehensive analysis of 10,575 genomes as part of a multi-national study led by researchers at UC San Diego has revealed close evolutionary proximity between the microbial domains at the base of the tree of life, the branching pattern of evolution described by Charles Darwin more than 160 years ago in his book, On the Origin of Species.
PPPL will use INCITE-award time on Summit and Theta supercomputers to develop predictions for the performance of ITER, the international experiment under construction to demonstrate the feasibility of fusion energy.
The American Association for Thoracic Surgery (AATS) has adopted an open-source, cloud-based platform developed at the San Diego Supercomputer Center (SDSC) that addresses widely recognized challenges with historical platforms throughout the cardiothoracic surgical community.