The U.S. Department of Energy announced $10 million in funding for 13 projects that will enhance sophisticated computer models for understanding weather and climate patterns.
IT experts at Monash University have devised the world’s leading post-quantum secure privacy-preserving algorithm – so powerful it can thwart attacks from supercomputers of the future.
The Department of Energy has developed a new computer simulation capability: the Energy Exascale Earth System Model. Scientists designed the model to focus on areas most relevant to energy production as well as take full advantage of DOE’s supercomputing systems.
An international team of scientists has found the first evidence of a source of high-energy cosmic neutrinos, subatomic particles that can emerge from their sources and, like cosmological ghosts, pass through the universe unscathed, traveling for billions of light years from the most extreme environments in the universe to Earth.
Through a new multi-year project involving the Department of Energy’s (DOE) Lawrence Livermore (LLNL), Lawrence Berkeley (LBNL) and Argonne (ANL) national laboratories, in collaboration with the Transforming Research and Clinical Knowledge in Traumatic Brain Injury (TRACK-TBI) consortium led by the University of California, San Francisco (UCSF), scientists and engineers plan to simultaneously challenge DOE’s supercomputing resources, advance artificial intelligence capabilities and enable a precision medicine approach for traumatic brain injury (TBI).
PET plastic, short for polyethylene terephthalate, is the fourth most-produced plastic, used to make things such as beverage bottles and carpets, most of which are not being recycled. Some scientists are hoping to change that, using supercomputers to engineer an enzyme that breaks down PET. They say it's a step on a long road toward recycling PET and other plastics into commercially valuable materials at industrial scale.
A team from Lawrence Berkeley National Laboratory (Berkeley Lab) and Lawrence Livermore National Laboratory, both U.S. Department of Energy (DOE) national labs, is leveraging powerful supercomputers to model the impact of high-frequency ground motion on thousands of representative buildings of different sizes spread across California.
Supercomputers have the power to unlock the secrets of subatomic particles that are hidden deep inside everyday matter. But they can’t do it on their own: They require experts to use their knowledge of subatomic theory to set up the problems to be calculated and provide insight into the results.
Raul Briceño has been awarded a DOE Early Career Award to do just that, as he develops and implements a first-of-its-kind universal framework for these studies.
The recent completion of the 12 GeV Upgrade of the Continuous Electron Beam Accelerator Facility has opened up a new realm for exploration of the particles and forces that give rise to our universe. Making the most of this opportunity takes collaborations of the best and brightest minds in nuclear physics applying a bit of intellectual elbow grease.
The US Department of Energy’s Oak Ridge National Laboratory is once again officially home to the fastest supercomputer in the world, according to the TOP500 List, a semiannual ranking of the world’s fastest computing systems.
Lawrence Livermore National Laboratory’s next-generation supercomputer Sierra is the third-fastest computing system in the world, according to the TOP500 list announced today at the International Supercomputing Conference in Frankfurt, Germany.
Three young scientists affiliated with Jefferson Lab win grants to support research for building better accelerators and for using Jefferson Lab’s recently upgraded accelerator and supercomputers to suss out new information about subatomic particles.
Let’s talk! Scientists demonstrate coherent coupling between a quantum dot and a donor atom in silicon, vital for moving information inside quantum computers.
The West Big Data Innovation Hub (WBDIH) at the San Diego Supercomputer Center (SDSC) at UC San Diego is one of four regional big data hub partner sites awarded a $1.8 million grant from the National Science Foundation (NSF) for the initial development of a data storage network during the next two years.
In an effort to reduce errors in the analyses of diagnostic images by health professionals, a team of researchers from Oak Ridge National Laboratory has improved understanding of the cognitive processes involved in image interpretation, work that has enormous potential to improve health outcomes for the hundreds of thousands of American women affected by breast cancer each year. The ORNL-led team found that analyses of mammograms by radiologists were significantly influenced by context bias, or the radiologist’s previous diagnostic experiences.
Argonne materials scientists have discovered a reaction that helps explain the behavior of a key electrolyte additive used to boost battery performance.
The U.S. Department of Energy’s Oak Ridge National Laboratory today unveiled Summit as the world’s most powerful and smartest scientific supercomputer.
In a study published in the May 21, 2018 issue of the Proceedings of the National Academy of Sciences, a team of researchers – aided with supercomputing resources from the San Diego Supercomputer Center (SDSC) based at UC San Diego – created a dynamic computer simulation to delineate a key biological process that allows the body to repair damaged DNA.
Forward-thinking scientists in the 1970s suggested that circuits could be built using molecules instead of wires, and over the past decades that technology has become reality. The trouble is, some molecules have particularly complex interactions that make it hard to predict which of them might be good at serving as miniature circuits. But a new paper by two University of Chicago chemists presents an innovative method that cuts computational costs and improves accuracy by calculating interactions between pairs of electrons and extrapolating those to the rest of the molecule.
A team led by Berkeley Lab researchers has enlisted powerful supercomputers to calculate a quantity, known as the “nucleon axial coupling” or gA, that is central to our understanding of a neutron’s lifetime.
Argonne joins its sister national laboratories in powering a new Earth modeling system with supercomputers. The system features weather-scale resolution and can help researchers anticipate decadal-scale changes that could influence the U.S. energy sector in years to come.
Tom Jordan and a team from the Southern California Earthquake Center (SCEC) are using the supercomputing resources of the Argonne Leadership Computing Facility (ALCF), a U.S. Department of Energy Office of Science User Facility, to advance modeling for the study of earthquake risk and how to reduce it.
By stretching the amount of time proteins can be simulated in their natural state of wiggling and gyrating, a team of researchers at Colorado State University has identified a critical protein structure that could serve as a molecular Achilles heel able to inhibit the replication of dengue virus and potentially other flaviviruses such as West Nile and Zika virus.
Jen Marie Phifer and Forest Good of Los Lunas High School won top honors on Tuesday at the 28th Annual New Mexico Supercomputing Challenge held at Los Alamos National Laboratory.
The U.S. Department of Energy will fund research into a novel approach to improving efficiency of next-generation supercomputer simulations with an award to Rensselaer Polytechnic Institute doctoral candidate Caitlin Joann Ross.
U.S. Secretary of Energy Rick Perry today announced the release of a Request for Proposals (RFP) for the development of at least two new exascale supercomputers, including Lawrence Livermore National Laboratory’s next-generation system code-named “El Capitan.”
The mirror-like physics of the superconductor-insulator transition operates exactly as expected. Scientists know this to be true following the observation of a remarkable phenomenon, the existence of which was predicted three decades ago but had eluded experimental detection until now. The observation confirms that two fundamental quantum states, superconductivity and superinsulation, arise as mirror images of each other.
Berkeley Lab and Joint Genome Institute researchers took one of the most popular clustering approaches in modern biology—Markov Clustering algorithm—and modified it to run efficiently and at scale on supercomputers. Their algorithm achieved a previously impossible feat: clustering a 70 million node and 68 billion edge biological network in hours.
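Markov Clustering itself is compact: treat the network as a random walk, then alternate expansion (matrix powers, which spread flow along edges) with inflation (elementwise powers, which concentrate flow onto strong paths) until the flow matrix converges into clusters. A minimal dense NumPy sketch of the basic algorithm (the Berkeley Lab work ran a distributed, sparse variant to reach billions of edges):

```python
import numpy as np

def mcl(adjacency, expansion=2, inflation=2, max_iters=100, tol=1e-6):
    """Basic Markov Clustering on a dense adjacency matrix."""
    # Add self-loops and column-normalize to get a stochastic flow matrix.
    M = adjacency.astype(float) + np.eye(adjacency.shape[0])
    M /= M.sum(axis=0)
    for _ in range(max_iters):
        prev = M.copy()
        M = np.linalg.matrix_power(M, expansion)  # expansion: spread flow
        M = M ** inflation                        # inflation: sharpen flow
        M /= M.sum(axis=0)                        # re-normalize columns
        if np.allclose(M, prev, atol=tol):
            break
    # Nodes receiving flow from the same rows form one cluster.
    clusters = {tuple(np.nonzero(row > 1e-8)[0]) for row in M if row.max() > 1e-8}
    return sorted(clusters)
```

On a graph of two disconnected triangles, for example, the flow matrix converges per component and the routine returns the two triangles as separate clusters. The inflation exponent controls granularity: higher values yield more, smaller clusters.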
ORNL model could better predict tiny methylmercury pockets lurking in creek algae; engines work smarter with new fuel innovation; making narrow metallic structures to advance tiny electronics, drug delivery; certain enzymes that try to break down antibiotics may inform better drug designs for fighting resistant bacteria; current software simulations for small modular reactors upscaled to run on future supercomputers.
Tapping into the tremendous power of the Cherry Creek II supercomputer at UNLV just got easier for faculty researchers and community partners alike, thanks to a new MOU between the university and Altair Engineering.
Computers have helped researchers develop a new phosphor that can make LEDs cheaper and render colors more accurately. An international team led by engineers at UC San Diego first predicted the new phosphor using supercomputers and data mining algorithms, then developed a simple recipe to make it in the lab. Unlike many phosphors, this one is made of inexpensive, earth-abundant elements and can easily be made using industrial methods. As computers predicted, the new phosphor performed well in tests and in LED prototypes.
In a recent demonstration project, physicists from Brookhaven National Laboratory and Berkeley Lab used the Cori supercomputer at the National Energy Research Scientific Computing Center to reconstruct data collected from a nuclear physics experiment, an advance that could dramatically reduce the time it takes to make detailed data available for scientific discoveries.
Grover and GM colleagues Jian Gao, Venkatesh Gopalakrishnan, and Ramachandra Diwakar are using the Titan supercomputer at the Oak Ridge Leadership Computing Facility to improve combustion models for diesel passenger car engines with an ultimate goal of accelerating innovative engine designs while meeting strict emissions standards.
A team of networking experts from the Department of Energy’s Energy Sciences Network (ESnet), with the Globus team from the University of Chicago and Argonne National Laboratory, has designed a new approach that makes data sharing faster, more reliable and more secure.
For deep learning to be effective, existing neural networks must be modified, or novel networks designed and then "trained" so that they know precisely what to look for and can produce valid results. This is a time-consuming and difficult task, but one that a team of ORNL researchers recently demonstrated can be dramatically expedited with a capable computing system.
“We’re geeks, and we’re motivated.” That’s how Amin Amooie, a doctoral student in earth sciences at The Ohio State University, explained his team’s efforts to build the supercomputer they’ve dubbed “Buckeye Pi.”
With a top-story list populated by breakthroughs in supercomputing, accelerator science, space missions, materials science, life science, and more, Los Alamos National Laboratory put its Big Science capabilities to wide, productive use in 2017.
An international team of researchers ran multi-scale, multi-physics 2D and 3D simulations at NERSC to illustrate how heavy metals expelled by the exploding supernovae of the first stars in the universe regulated subsequent star formation and influenced the appearance of galaxies in the process.
New supercomputer simulations have revealed the role of transport proteins called efflux pumps in creating drug resistance in bacteria, research that could lead to improving the drugs’ effectiveness against life-threatening diseases and restoring the efficacy of defunct antibiotics.
A unique collaboration between a music professor and an engineering professor at Virginia Tech will result in the creation of a new platform for data analysis that will make it possible to understand the significance of data by turning it into sound.
Using the Titan supercomputer, a research team at Oak Ridge National Laboratory has developed an evolutionary algorithm capable of generating custom neural networks that match or exceed the performance of handcrafted artificial intelligence systems.
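The blurb doesn't detail ORNL's implementation, but the general technique is a genetic-style loop: keep the fittest candidate network configurations, refill the population by mutating them, and repeat. A minimal, hypothetical sketch, with a toy fitness function standing in for actual training accuracy (in practice each fitness evaluation is a full network training run, which is why a supercomputer is needed):

```python
import random

def evolve(fitness, init, mutate, pop_size=20, generations=30, elite=4, seed=0):
    """Generic evolutionary search: keep the fittest genomes, refill by mutation."""
    rng = random.Random(seed)
    pop = [init(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)   # rank candidates by fitness
        parents = pop[:elite]                 # elitism: best survive unchanged
        pop = parents + [mutate(rng.choice(parents), rng)
                         for _ in range(pop_size - elite)]
    return max(pop, key=fitness)

# Toy stand-in: a "genome" is a list of hidden-layer widths; the hypothetical
# fitness rewards a target total capacity reached with few layers.
def init(rng):
    return [rng.choice([16, 32, 64, 128]) for _ in range(rng.randint(1, 4))]

def mutate(genome, rng):
    g = list(genome)
    g[rng.randrange(len(g))] = rng.choice([16, 32, 64, 128])
    if rng.random() < 0.3 and len(g) < 6:     # occasionally grow the network
        g.append(rng.choice([16, 32, 64, 128]))
    return g

def fitness(genome):
    return -abs(sum(genome) - 192) - 5 * len(genome)

best = evolve(fitness, init, mutate)
```

Because the candidate evaluations in each generation are independent, the loop parallelizes naturally across many nodes, which is what makes the approach a good fit for a machine like Titan.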
For the first time, scientists have used high-performance computing (HPC) to reconstruct the data collected by a nuclear physics experiment—an advance that could dramatically reduce the time it takes to make detailed data available for scientific discoveries. The demonstration project used the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC), a high-performance computing center at Lawrence Berkeley National Laboratory in California, to reconstruct multiple datasets collected by the STAR detector during particle collisions at the Relativistic Heavy Ion Collider (RHIC), a nuclear physics research facility at Brookhaven National Laboratory in New York.
Researchers are grappling with increasingly large quantities of image-based data. Machine learning and deep learning offer researchers new ways to analyze images quickly and more efficiently than ever before. Scientists at multiple national laboratories are working together to harness the potential of these tools.