Newswise — Designing materials with specific properties is the first step toward making computer chips that can store more information, superconductors that could help solve the world’s energy problems, and drugs that work more efficiently in the human body. The transition metals in the periodic table of elements are crucial to these efforts because they are often the key ingredient in the materials that such advanced technologies rely on.

Transition metals are the elements that lie in the middle of the periodic table. The negatively charged electrons in their outermost shells are loosely bound and therefore easily shared with other elements to form molecules. Transition metal systems consist of atoms of these metals bound to other elements. When electrical current runs through them, or when their temperature or other aspects of their environment change, the physical and chemical reactions that follow are complicated. Scientists study these systems to explain a variety of phenomena; one goal is to determine how much energy it takes to pull the bound molecules away from the transition metals. That knowledge would help them design more predictable systems, leading to better materials much faster.

A team led by Shiwei Zhang, senior research scientist and group leader at the Center for Computational Quantum Physics at the Flatiron Institute and Chancellor Professor of Physics at the College of William & Mary, is employing the IBM AC922 Summit supercomputer at the Oak Ridge Leadership Computing Facility (OLCF) to simulate transition metal systems, such as copper bound to molecules of nitrogen, dihydrogen, or water. The OLCF is a US Department of Energy (DOE) Office of Science User Facility at DOE’s Oak Ridge National Laboratory.

In a recent study published in the Journal of Chemical Theory and Computation, the team used Summit and a method created by Zhang and collaborators to correctly predict the amount of energy required to break apart dozens of molecular systems, paving the way for a greater understanding of these materials.

“Being able to accurately predict the physics and chemistry of these materials is important for both fundamental understanding and for applications,” Zhang said. “For example, we want to know what will happen if we put one of these transition metals in a certain environment—what would happen to the metal and the molecules surrounding it.”

The researchers compared the results with other computational methods and experiments and demonstrated that their method provided the most accurate results for the systems studied. The researchers hope that other scientists will use their data set as a reference for experimental studies and further validate or improve other computational methods.

An electron boogie

Transition metals such as copper are special because some of their electrons are squeezed into a tightly packed region. This makes the transition metals more reactive—and often magnetic. 

“Transition metal complexes are ubiquitous in biology and chemical catalysis,” Zhang said. “The volatility of these metals causes different reactions to happen. When you put them in different environments, they will have many different interactions with other atoms or molecules.”

The electrons in the transition metals operate much like dancers on a packed dance floor—they have to navigate around other electrons, meaning their actions are strongly correlated with one another. It is difficult to isolate and study just one electron because each one’s actions are affected by neighboring electrons.

Predicting what will happen within entire systems of transition metals is even harder. Some methods can predict what might happen in small molecules, but using those same methods to study transition metal complexes is impossible because the computational cost grows exponentially with the number of correlated electrons: the dance floor becomes very crowded.

To get around this problem, Zhang and coworkers developed a method called phaseless auxiliary-field quantum Monte Carlo (ph-AFQMC). The method recasts the problem in a space made up of many fictitious fields that serve as force carriers for the electrons. By performing random walks through this space over many iterations, scientists can obtain approximate but highly accurate solutions to problems in quantum physics.
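
To make the random-walk idea concrete, here is a minimal sketch in Python (using only NumPy) for a deliberately tiny toy model: a two-site lattice with an attractive on-site interaction, chosen so that the auxiliary fields stay real and the phaseless constraint that gives ph-AFQMC its name is not needed. The model, parameter values, and variable names are illustrative assumptions for this article, not the team's production code, which treats realistic molecules and runs on GPUs.

import numpy as np

# A minimal auxiliary-field quantum Monte Carlo random walk for a toy two-site
# Hubbard dimer with an attractive on-site interaction (U < 0). The attractive
# sign keeps the auxiliary fields real, so the phaseless constraint of ph-AFQMC
# is not needed; what remains is the core machinery: Hubbard-Stratonovich
# fields, importance-sampled random walkers, and a mixed energy estimator.

rng = np.random.default_rng(0)

t, U = 1.0, -4.0             # hopping and on-site interaction (illustrative values)
dt, n_steps = 0.01, 400      # imaginary-time step and number of projection steps
n_walkers = 1000

K = np.array([[0.0, -t], [-t, 0.0]])      # one-body hopping matrix
# Rewrite U*n_up*n_dn = (U/2)*n^2 - (U/2)*n with n = n_up + n_dn: the linear
# piece joins the one-body term; the quadratic piece is decoupled by Gaussian
# auxiliary fields via exp((a/2)*n^2) = E_x[exp(sqrt(a)*x*n)], with a = -dt*U.
h = K - 0.5 * U * np.eye(2)
gamma = np.sqrt(-dt * U)

w_h, v_h = np.linalg.eigh(h)
expK_half = v_h @ np.diag(np.exp(-0.5 * dt * w_h)) @ v_h.T   # exp(-dt*h/2)

psi_T = v_h[:, 0]            # trial wavefunction: ground state of the one-body part

def local_energy(phi, ovlp):
    """Mixed estimator <psi_T|H|phi>/<psi_T|phi> for one up and one down
    electron occupying the same orbital phi."""
    e_kin = 2.0 * (psi_T @ K @ phi) / ovlp        # one-body energy, both spins
    g_diag = phi * psi_T / ovlp                   # diagonal of the mixed Green's function
    return e_kin + U * np.sum(g_diag * g_diag)    # <n_up n_dn> = G_up[i,i]*G_dn[i,i]

# Walkers: one occupied orbital shared by the up and down electron, plus a weight.
phis = [psi_T.copy() for _ in range(n_walkers)]
weights = np.ones(n_walkers)

for step in range(n_steps):
    for k in range(n_walkers):
        phi = phis[k]
        ovlp = psi_T @ phi
        dens = 2.0 * phi * psi_T / ovlp           # mixed density <n_i>, both spins
        shift = gamma * dens                      # "force bias" that tames weight noise
        x = rng.standard_normal(2)                # one Gaussian field per site
        b_field = np.exp(gamma * (x + shift))     # diagonal field propagator
        phi_new = expK_half @ (b_field * (expK_half @ phi))
        ovlp_new = psi_T @ phi_new
        # Exact reweighting for the shifted Gaussian, times the overlap ratio
        # (squared because the up and down spins share the same orbital).
        weights[k] *= np.exp(-x @ shift - 0.5 * shift @ shift) * (ovlp_new / ovlp) ** 2
        phis[k] = phi_new

e_locals = np.array([local_energy(phi, psi_T @ phi) for phi in phis])
e_qmc = np.sum(weights * e_locals) / np.sum(weights)

# Exact ground state of the dimer (one up and one down electron) for comparison.
h_exact = np.kron(K, np.eye(2)) + np.kron(np.eye(2), K) + np.diag([U, 0.0, 0.0, U])
e_exact = np.linalg.eigvalsh(h_exact)[0]
print(f"AFQMC estimate: {e_qmc:.3f}   exact: {e_exact:.3f}")

Run as-is, the weighted average should land close to the exact dimer energy printed in the last line. The production method builds on the same loop (sample fields, propagate walkers, update weights) while adding the phaseless constraint needed for realistic interactions, along with the GPU acceleration described below.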

The original AFQMC codes were developed at the College of William & Mary in the early 2000s. In 2018, Zhang’s team developed a version of the ph-AFQMC code that makes efficient use of GPUs. The researchers then applied it to 40 two-molecule systems on Summit under the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program at the OLCF last year. In the current project, the team has accurately simulated the properties of 34 transition metal complexes: compounds consisting of transition metals bound to two-, four-, and even six-atom molecules. In their bound state, these molecules are known as ligands.

“To replicate the realistic situations that you see in something like drug design is very computationally expensive,” Zhang said. “This is one step closer to being able to fully model the kinds of systems we’d like to study.” 

The demand for leadership-class computing resources on par with, or beyond, Summit’s capabilities is insatiable.

“We really need to be able to model systems of dozens or even hundreds of atoms to fully understand what would happen when these transition metal complexes are put in realistic environments,” Zhang said.

A segue to superconductors

The team hopes that its results will inspire others to develop cheaper methods to predict the properties of even more complex materials in the future.

“We are providing a reference, so that people can trust that these are the reliable answers for this set of molecules,” Zhang said. “We wanted to have this data set so that people can test additional methods—cheaper methods—that might achieve a good enough accuracy to be able to then treat bigger problems.”

One of the bigger problems the team is eager to tackle is high-temperature superconductors: ceramic materials that, when cooled, conduct electrical current without loss. About 5 percent of the electricity that flows to homes and businesses in the United States is lost along the way because ordinary wires dissipate energy as heat.

“The work we did with these transition metal systems is a motif—a key ingredient to working toward superconductors,” Zhang said.

This work also used resources of the San Diego Supercomputer Center, including the Comet supercomputer, a dedicated Extreme Science and Engineering Discovery Environment (XSEDE) cluster.

Related Publication: Benjamin Rudshteyn, Dilek Coskun, John L. Weber, Evan J. Arthur, Shiwei Zhang, David R. Reichman, Richard A. Friesner, and James Shee, “Predicting Ligand-Dissociation Energies of 3d Coordination Complexes with Auxiliary-Field Quantum Monte Carlo,” Journal of Chemical Theory and Computation 16, no. 5 (2020): 3041–3054, doi:10.1021/acs.jctc.0c00070.
