Computer calculations by the Center for Solar Fuels, an Energy Frontier Research Center, shed light on nebulous interactions in semiconductors relevant to dye-sensitized solar cells.
A catalytic reaction may follow thousands of possible paths, and it can take years to identify which one it actually takes so scientists can tweak it and make it more efficient. Now researchers at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University have taken a big step toward cutting through this thicket of possibilities.
In a newly published Science paper, Argonne and Temple University researchers reveal new knowledge about the behavior of metal nanoparticles when they undergo oxidation, by integrating X-ray imaging with computer modeling and simulation. This knowledge adds to our understanding of fundamental processes like oxidation and corrosion.
High performance computing researcher Shuaiwen Leon Song asked if hardware called 3D stacked memory could do something it was never designed to do—help render 3D graphics.
Working with a consortium of leading software and hardware vendors and research organizations, the Lab will help evolve the standard for OpenMP, the most widely used shared-memory parallel programming model.
San Diego-area high school students interested in pursuing a career in scientific research are invited to apply to UC San Diego’s Mentor Assistance Program (MAP), a campus-wide initiative designed to engage students in a mentoring relationship with an expert from a vast array of disciplines.
A multi-institutional team used resources at the Oak Ridge Leadership Computing Facility to catalog how desert plants’ photosynthetic processes vary. The study could help scientists engineer drought-resistant crops for food and fuel.
The Department of Energy’s INCITE program will be accepting proposals for high-impact, computationally intensive research campaigns in a broad array of science, engineering, and computer science domains.
Wu, a theoretical chemist at Brookhaven Lab’s Center for Functional Nanomaterials (CFN), performs calculations and simulations and constructs models that provide a fundamental understanding of the structures, dynamics, and properties of chemical systems.
A Stony Brook University-led research team at the Laufer Center for Physical and Quantitative Biology has created a user-friendly automated computer server that performs the complex computations of protein-interaction modeling with a handful of clicks from a home computer.
Two biophysicists from Case Western Reserve University School of Medicine have used supercomputers to show how cell membranes control the shape, and consequently the function, of a major cancer-causing protein.
ORNL-led team joins quantum, high-performance and neuromorphic computing architectures that could yield more flexible, efficient intelligent computing; ORNL uses electron beam precision to instantly adhere coatings for lithium-ion batteries; ORNL’s high-res tools look closely at plant makeup for more efficient, less costly biomass breakdown.
Reader preferences for liberal or conservative political books also attract them to different types of science books, according to a new study. The result supports observations that the divisiveness of politics in the United States has spread to scientific communication as well, endangering the role of science as politically neutral ground.
Brookhaven Lab computer scientist Wei Xu develops visual analytics tools, which provide a bridge between advanced computational capabilities and human knowledge and judgment.
This year’s week-long “Summer Institute” workshop held by the San Diego Supercomputer Center (SDSC) at UC San Diego will focus on a wide range of introductory-to-intermediate topics in high-performance computing (HPC) and data science for researchers in academia and industry, especially those in domains that have not traditionally used HPC resources.
Much like two friendly neighbors getting together to chat over a cup of coffee, the minuscule particles in our sub-atomic world also come together to engage in a kind of conversation. Now, nuclear scientists are developing tools to allow them to listen in on the particles’ gab fests and learn more about how they stick together to build our visible universe. The first complex calculations of a particle called the sigma have been carried out and published in Physical Review Letters.
University of Virginia professor Leonid Zhigilei led a team that used the Titan supercomputer to gain deeper insights into laser interactions with metal surfaces.
Berkeley Lab researchers have successfully added thread-level parallelism on top of MPI-level parallelism in the planewave density functional theory method within the popular software suite NWChem. This is an important step toward ensuring that computational chemists are prepared to compute efficiently on next-generation exascale machines.
When Globus Genomics launched five years ago, biologists were just getting used to the idea of being a “big data” science. At that time, the rapidly falling costs of next-generation sequencing suddenly made large-scale genetics more accessible to life scientists. However, these new methods also brought new challenges, as researchers used to working with small datasets on their desktop computers were faced for the first time with the kind of hard-drive-flooding data streams more commonly seen by physicists and astronomers.
A new computer simulation helps explain the existence of puzzling supermassive black holes observed in the early universe. The simulation is based on a computer code used to understand the coupling of radiation and certain materials.
Researchers from Ames Laboratory used supercomputers at NERSC to evaluate a novel approach for creating more energy-efficient ultra-thin crystalline silicon solar cells by optimizing nanophotonic light trapping.
The San Diego Supercomputer Center (SDSC) at the University of California San Diego and the Simons Foundation’s Flatiron Institute in New York have reached an agreement under which the majority of SDSC’s data-intensive Gordon supercomputer will be used by Simons for ongoing research following completion of the system’s tenure as a National Science Foundation (NSF) resource on March 31.
Argonne researchers have developed a new theoretical approach, ideally suited for high-performance computing systems, capable of making predictive calculations about particle interactions that conform almost exactly to experimental data. This new approach could give scientists a valuable tool for describing new physics and particles beyond those currently identified.
Is it better to treat aneurysms with two overlapping flow diverters, or one compressed diverter? A computational study published in the American Journal of Neuroradiology points to the single, compressed diverter provided that it produces a mesh denser than the two overlapped diverters, and that it covers at least half of the aneurysm opening. The ongoing research could eventually help doctors determine the best way to treat patients suffering from aneurysms.
Researchers working with magnetic nanoparticles approached computational scientists at DOE’s Oak Ridge National Laboratory to help solve a unique problem: to model magnetism at the atomic level using experimental data from a real nanoparticle.
Comet, the petascale supercomputer at the San Diego Supercomputer Center (SDSC), an Organized Research Unit of UC San Diego, has easily surpassed its target of serving at least 10,000 researchers across a diverse range of science disciplines, from astrophysics to redrawing the “tree of life”.
To better understand the physical conditions that create superluminous supernovae, astrophysicists are running 2D simulations of these events using supercomputers at the National Energy Research Scientific Computing Center (NERSC) and the CASTRO code developed at Lawrence Berkeley National Laboratory (Berkeley Lab).
A team led by the California Institute of Technology’s (Caltech’s) Thomas Miller used the Cray XK7 Titan supercomputer at the US Department of Energy’s (DOE’s) Oak Ridge National Laboratory (ORNL) to identify potential electrolyte materials and predict which ones could enhance the performance of lithium-ion batteries. Using Titan, the researchers ran hundreds of simulations—each consisting of thousands of atoms—on possible new electrolytes. The work led them to the identification of new electrolytes with promising properties for lithium-ion conduction.
As part of her team’s research into matter’s tendency to self-organize, Sharon Glotzer of the University of Michigan ran a series of hard particle simulations to study melting in two-dimensional (2-D) systems. Specifically, the team explored how particle shape affects the physics of a 2-D solid-to-fluid melting transition.
With the help of the Argonne Leadership Computing Facility’s Mira supercomputer, scientists have successfully designed and verified stable versions of synthetic peptides, components that join together to form proteins.
Information scientist Line Pouchard just joined Brookhaven Lab’s Center for Data-Driven Discovery, where she will help scientists discover, integrate, and re-use data.
In the January 20, 2017 issue of Science, a University of Washington-led team, in collaboration with researchers at the DOE Joint Genome Institute, reports that structural models have been generated for 12 percent of the protein families that previously had no structural information available.
Join a video conference to learn everything you need to know to get your research project up and running on Argonne Leadership Computing Facility (ALCF) systems.
Researchers identify patterns that could be a valuable resource for superconductivity research; ORNL researchers developing approaches to preserve forests, wildlife; ORNL supercomputer helping scientists push boundaries; New measurement technique opens pathway to new graphene-based energy, electronic applications; ORNL cryogenic memory cell circuit could advance pathway to quantum computing.
KC Claffy, principal investigator and founding director of the Center for Applied Internet Data Analysis (CAIDA) at the San Diego Supercomputer Center (SDSC), has been named to the second annual “10 Women in Networking/Communications That You Should Know” list. The list is compiled and coordinated by N2 Women (Networking/Networking Women), a discipline-specific community for researchers in the communications and networking research fields.
A paper released December 15 during the American Geophysical Union fall meeting points to new evidence of human influence on extreme weather events. After examining observational and simulated temperature and heat indexes, the research team—which included three scientists from Lawrence Berkeley National Laboratory—concluded that two separate deadly heat waves that occurred in India and Pakistan in the summer of 2015 “were exacerbated by anthropogenic climate change.”
An international collaboration between physicists at the University of Chicago, Argonne National Laboratory, McGill University, and the University of Konstanz recently demonstrated a new framework for faster control of a quantum bit. First published online Nov. 28, 2016, in Nature Physics, their experiments on a single electron in a diamond chip could lead to quantum devices that are less prone to errors when operated at high speeds.
A research team led by Jefferson Lab’s Robert Edwards has been using computation to inform GlueX experiments at Jefferson Lab as well as corroborate experimental findings.
Water and atmospheric processes are inseparable. Now, there is a supercomputer model that couples climate and hydrodynamic factors for the Great Lakes region. The new model will be useful for climate predictions, habitat modeling for invasive species, oil spill mitigation and other environmental research.
Brookhaven Lab purchased a new institutional cluster, is building a new computing architecture test bed, and is joining computing standardization groups. These efforts, part of Brookhaven's Computational Science Initiative, will support data-driven scientific discoveries.
For the third time in its history, Thomas Jefferson National Accelerator Facility is home to one of the world’s 500 fastest supercomputers. The SciPhi-XVI supercomputer was added to the TOP500 list on November 14, placing 397th on the 48th edition of the list of the world’s top supercomputers.
At DOE's computing centers, researchers work with user support teams to get the best performance from supercomputers. The members of the support team are curious, driven scientists who have taken on the challenge of some of the world's most complex computers.