Newswise — Everyone knows that the Computer—an artificial intelligence (AI)-like entity—on a Star Trek spaceship does everything from brewing tea to compiling complex analyses of flux data. But how is AI actually used at real research facilities? How can AI agents—computer programs that can act based on a perceived environment—help scientists discover next-generation batteries or quantum materials? Three staff members at the National Synchrotron Light Source II (NSLS-II) described how AI agents support scientists using the facility’s research tools. As a U.S. Department of Energy (DOE) Office of Science user facility located at DOE’s Brookhaven National Laboratory, NSLS-II offers its experimental capabilities to scientists from all over the world, who use it to reveal the mysteries of materials for tomorrow’s technology.
From improving experimental conditions to enhancing data quality, Andi Barbour, Dan Olds, Maksim Rakitin, and their colleagues are working on various AI projects at NSLS-II. A recent overview publication in Digital Discovery outlines several—but not all—ongoing AI projects at the facility.
First contact with AI
While movies often show AI agents as sentient supercomputers that can perform various tasks, real-world AI agents differ greatly from this portrayal.
“What we mean when we say AI is that we come up with an algorithm or a method—basically some mathematical process—that is going to do a ‘thing’ for us, such as classifying, analyzing, or making decisions, but we're not going to hardcode the logic,” explained Olds, a physicist who works at one of NSLS-II’s scientific instruments that enables a wide range of research projects. The instruments at NSLS-II are called beamlines because they are a combination of an x-ray beam delivery system and an experimental station.
Rakitin, a physicist specialized in developing software to collect or analyze data at NSLS-II, added, “Instead of giving the program—the AI agent—a model, it builds its own model through training. If we want it to recognize a cat, we show it a cat instead of explaining that it is a furry animal with four legs, pointy ears, a tail, and so on. The program has to figure out how to identify a cat by itself.”
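Rakitin’s distinction—learned behavior instead of hardcoded rules—can be illustrated with a deliberately tiny sketch. Everything here is a hypothetical toy, not NSLS-II code: a nearest-neighbor rule labels a new example by comparing it to labeled training examples rather than by following explicit if-then logic.

```python
# Hypothetical toy example (not NSLS-II code): a nearest-neighbor rule
# "learns" the label of a new example from labeled training data instead
# of following hardcoded if-then logic.

def nearest_neighbor_label(examples, query):
    """Return the label of the training example closest to `query`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(examples, key=lambda ex: dist(ex[0], query))[1]

# toy "images" summarized as (ear_pointiness, tail_length) features
training = [
    ((0.9, 0.8), "cat"),
    ((0.8, 0.9), "cat"),
    ((0.2, 0.1), "not-cat"),
    ((0.1, 0.3), "not-cat"),
]

print(nearest_neighbor_label(training, (0.85, 0.7)))  # → cat
```

Real models replace the handpicked features and the nearest-neighbor rule with representations learned during training, but the principle is the same: the behavior comes from examples, not from explicitly written logic.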
Researchers at facilities such as NSLS-II have two main reasons for adapting AI agents to their needs: the sheer volume of data and its complexity. Twenty years ago, it took several minutes to snap a data image—such as a diffraction pattern—of a battery. Now, at the beamline Olds works at, they can take the same shot in a fraction of a second. While this allows more research to happen at the beamline, it outpaces the traditional strategies used to analyze the data.
Barbour, a chemical physicist, faces the second challenge, complex data, in her work studying dynamics in quantum materials. Together with her collaborators, she investigates how the atomic and electronic order in these materials evolves under variable conditions.
“When we do experiments at the beamline, we are looking for correlations and patterns in the data over time. So, if we needed to write one long program that captured all the possibilities of our experiments, it would be incredibly complicated, hard to read, terrible to maintain, and a nightmare to automate. But an AI tool can learn how to handle our complex data without the need to explain every detail to the agent,” Barbour said.
Engage AI agent for optimization
But before any experiment can start, the x-ray beam needs to be prepared by adjusting the various optical components in a beamline. Small but precise motors allow the researchers to move each individual component as needed. There are motors that rotate mirrors to guide the x-rays, more motors that move lenses to focus the light, and even more motors that control slits to shape the beam. Together, all these parts provide the perfect x-ray beam for the experiment. The better the beam fits the experiment, the better the data quality for the researchers. However, finding this perfect beam isn’t easy. In fact, researchers—such as Rakitin—call it a multidimensional optimization problem.
“Instead of tweaking every motor for every data set, our project is to develop an AI agent that can do the tweaking for us automatically. The goal is to give the AI program the shape and/or intensity of the beam we need, and it will figure out how to change the position of each motor to achieve it. This significantly cuts down the time to get the experiment started,” said Rakitin about a project presented at the 14th International Conference on Synchrotron Radiation Instrumentation (link to proceeding expected in October 2022).
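The multidimensional optimization Rakitin describes can be sketched in miniature. This is an illustrative stand-in, not the facility’s software, which works with real beam measurements and more sophisticated optimization methods: treat each motor position as a variable, treat the mismatch between the measured beam and the desired one as the objective, and search for the motor settings that minimize it.

```python
# Hypothetical sketch: beamline tuning as minimizing the mismatch between
# the measured beam and a target, over several motor positions. The real
# system uses more sophisticated optimizers; this greedy coordinate
# search only shows the shape of the problem.

def beam_mismatch(motors, target=(1.0, -0.5, 2.0)):
    """Stand-in for a beam-quality measurement: 0 means a perfect beam."""
    return sum((m - t) ** 2 for m, t in zip(motors, target))

def coordinate_search(objective, start, step=0.5, iters=200):
    """Greedy search: nudge one motor at a time, keep improvements."""
    motors = list(start)
    for _ in range(iters):
        for i in range(len(motors)):
            for delta in (step, -step):
                trial = motors.copy()
                trial[i] += delta
                if objective(trial) < objective(motors):
                    motors = trial
        step *= 0.9  # shrink the steps as the search converges
    return motors

best = coordinate_search(beam_mismatch, start=[0.0, 0.0, 0.0])
print([round(m, 2) for m in best])  # approaches the target positions
```

In practice the objective is not a known formula but an actual measurement of the beam, each evaluation is expensive, and the motors interact, which is why the project Rakitin describes pairs the optimizer with a physics simulation of the beamline.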
Rakitin and his team members are actually striving to create a virtual beamline that allows users to figure out the best beam conditions for their experiment prior to arriving at the facility. To achieve that, he maps each motor’s behavior to specific parameters that represent physical properties—such as mirror radii—in a simulation of the beamline. The simulation is developed in a software called Sirepo. A first study on this idea was published in 2020 in the SPIE conference proceedings.
“While the users can use these beamline simulations to learn how to run a beamline, we can also use it to plan new ones. We can prepare the simulation based on the designs for the beamline even before the physical pieces are put together. Once the beamline is ready, we can begin the mapping process of the motors to the specific parameters in the simulation,” said Rakitin.
Currently, NSLS-II has 28 beamlines; however, the facility can support roughly 30 additional beamlines. Rakitin expects a number of new beamlines to use the tool during the development process.
Set AI to stun
One of those 28 beamlines is an x-ray diffraction beamline called the Pair Distribution Function (PDF) beamline, where Olds works. It serves many users for high-throughput total scattering structural studies aimed at understanding the structure-property relationships in materials from new batteries to “green” cement. The ever-changing nature of research questions at PDF challenges Olds in the search for the best measurement strategy for each experiment. To enhance the measurements, Olds is developing various AI agents that monitor data, measure it, and analyze it—like a digital lab assistant.
“The main question that drives our AI work is how we can make the best use of any experiment because time at a beamline is a precious, limited resource. Once the experiment is over, you have all the time in the world to analyze the data. But during the experiment, it’s crucial not to miss an important change in your material that could affect the discovery you are trying to make. You want tools that can help you make better decisions like when to slow down a heating ramp because you are approaching an interesting data point, or even alert you that a measurement has completed sooner than anticipated. This is where our ‘federation’ of AI lab assistants comes into play. They monitor the data. They do some real-time analysis. They watch the trends. And then when something happens, they call out. They focus our—the human researchers’—attention on the right detail so that we don’t miss it. The AI agents help to make sure we are doing the best science we can,” explained Olds.
When asked for an example, Olds recounted the events of an experiment. The researchers came to NSLS-II to understand the breakdown of a gas filtration material. Together with Olds, they set up the materials in a stream of gas, while snapping an x-ray photo every second. Each snap created a pattern of bright and dark rings (a diffraction pattern). Encoded in these changing rings lies information about how the atoms are arranged in the material at that moment in time. While the measurement was running, one of the AI agents perked up, indicating something had started to change.
“So, we checked but didn’t see anything. We were still new at this. So, we wondered, ‘can we trust the AI agent?’ But within the hour it became clear that the process we were looking for had started. The beautiful white powder we placed in the beamline was breaking down. All we found after the experiment was this ugly black crisp. Once the experiment was complete, we ran a traditional analysis of the data and found that the process had started when the AI agent chirped up. That just blew me away, because the changes at the beginning are tiny. Our AI was more sensitive than we all expected,” Olds said. He pointed to two publications (a conference proceeding and an Applied Physics Review paper) about the team’s recent AI work.
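The kind of call-out Olds describes—an agent watching a stream of measurements and flagging the moment something drifts away from the baseline—can be sketched with a simple running-statistics monitor. This is a hypothetical toy, far simpler than the team’s published agents:

```python
# Hypothetical sketch of a "call-out" agent: watch a stream of per-image
# summary values (e.g., total scattered intensity) and flag the first
# frame that falls well outside the running baseline.

from statistics import mean, stdev

def first_anomaly(stream, warmup=20, threshold=4.0):
    """Index of the first value > `threshold` sigma from the baseline."""
    baseline = []
    for i, value in enumerate(stream):
        if len(baseline) >= warmup:
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                return i  # call out to the human researchers here
        baseline.append(value)
    return None

# a quiet signal, then a subtle but persistent shift at frame 30
signal = [1.0 + 0.01 * (i % 3) for i in range(30)] + [1.2] * 10
print(first_anomaly(signal))  # → 30
```

The real agents do richer real-time analysis of full diffraction patterns, but the principle matches the anecdote: because the baseline statistics are tight, even a change that looks invisible to the eye can stand out clearly.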
Computer, can you clean up my data?
While Rakitin’s tool will help prior to an experiment and Olds specializes in enhancing experiments with AI, Barbour uses her AI project to improve the quality of her data after the experiment.
“The aim is to design a first pass for the analysis. The scientific problems we are looking at are all dynamic. Whenever you are looking for changes in your data, you need to be careful because your sample is not the only thing changing. There is detector noise, fluctuations in your x-ray beam and more. All of these make it harder to extract dynamics,” Barbour said.
To see these changes within materials, Barbour works with her colleagues at two instruments, the Coherent Soft X-ray Scattering (CSX) and Coherent Hard X-ray Scattering (CHX) beamlines. In both cases, the x-ray beam hits the sample and scatters across the detector in a pattern that depends on the sample’s inner structure. However, Barbour is interested in a specific portion of the scattered beam—the coherent one—because only that portion creates the specific pattern, called a speckle pattern, that she needs to calculate the correlations. This technique, known as x-ray photon correlation spectroscopy (XPCS), allows Barbour to compare the different patterns within a whole series of shots. Each shot can hold similarities to the following ones, and it’s these correlations Barbour is looking for. They reveal how the material evolves over time.
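At the heart of XPCS is an intensity autocorrelation, often written g2(τ): intensities separated by a lag time τ are compared and normalized by the mean intensity. A minimal single-pixel version—a toy illustration, since real XPCS averages over many speckle pixels and uses multi-tau lag schemes—looks like this:

```python
# Toy single-pixel version of the XPCS intensity autocorrelation.

def g2(intensities, lag):
    """Normalized intensity autocorrelation <I(t) I(t+lag)> / <I>^2."""
    pairs = list(zip(intensities, intensities[lag:]))
    num = sum(a * b for a, b in pairs) / len(pairs)
    mean_i = sum(intensities) / len(intensities)
    return num / mean_i ** 2

# a slowly varying toy signal: nearby shots resemble each other more
# than distant ones, so the correlation decays with lag
series = [1.0, 1.1, 1.2, 1.1, 1.0, 0.9, 0.8, 0.9, 1.0, 1.1]
print(g2(series, 1) > g2(series, 5))  # → True
```

How quickly the correlation decays with lag time is what encodes the dynamics: fast-moving structure in the material decorrelates the speckle pattern after only a few shots.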
“To make a good correlation, you need a series of consecutive images with no noise, no instability, and lots of x-rays. But to accomplish this with real-world data, you would need to look at every single image to remove all the ‘bad stuff.’ It’s time consuming. This is why we developed an AI agent that does two things for us: it removes the noise, and it targets the specific dynamic we are looking for. Once we have removed the noise, we can do the traditional analysis faster,” Barbour explained. In her recent publications, the team shows the difference between the raw, pixelated data images and the de-noised images.
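Barbour’s de-noising agent is a trained model; as a much simpler stand-in to show what removing the “bad stuff” means, a median filter suppresses isolated bad pixels in a row of detector values without disturbing the surrounding signal:

```python
# Simple stand-in for de-noising (the published work uses a trained AI
# model): a median filter removes isolated "hot" pixels from a 1D row
# of detector values.

def median_filter_1d(row, half=1):
    """Replace each pixel with the median of its local neighborhood."""
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - half), min(len(row), i + half + 1)
        window = sorted(row[lo:hi])
        out.append(window[len(window) // 2])
    return out

noisy = [5, 5, 6, 99, 6, 5, 5]  # 99 is a hot pixel
print(median_filter_1d(noisy))  # → [5, 5, 6, 6, 6, 5, 5]
```

A fixed filter like this treats all deviations the same; the advantage of the trained agent Barbour describes is that it can learn to separate detector artifacts from the genuine speckle fluctuations the analysis depends on.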
She continued, “After we have de-noised the data, we use an AI method on the correlations we computed to pull out the information we’re seeking. They are called the dynamic time constants. This time, we did it for all of them. Nobody does that! Why? Because without the AI agent, it would take a complex algorithm producing fits with high uncertainties, while needing a lot of computing power. However, by analyzing the correlations with the finest time resolution, we created insights that we couldn’t access before. Thanks to this process, we could provide our findings to the theorists in a form that is more easily compared to theoretical models.” More about this can be found in the team’s most recent publication.
I’m an AI agent, not a human scientist
If AI agents can align beamlines, monitor data streams, recognize chemical changes in materials, and de-noise data, will they replace humans as researchers some day? The three researchers all agreed that the answer to this question was “no.”
“I’d like to say that using AI agents—treating them as black boxes to get answers—is the ultimate goal. But just like when you start chemistry class, you need to work out the entire problem. You don’t write down an answer. You think about the numbers you’ve got. You ask, ‘does this make sense?’ And this also needs to happen with AI agents. We—the scientists—need to check if what the AI program produced makes sense,” explained Barbour.
“There are always false positives or similar things when you work with AI. The model might think it has predicted something, but it actually didn’t. So, you need an expert to look over its shoulder,” Rakitin continued.
Olds nodded as he added, “I think what makes AI special is that we ask the computer to sort out the math for us. That’s pretty profound, but ultimately it is a new tool for our repertoire in the same way that computers were. Humanity did science before computers. But with them we do it more efficiently and quicker. The same is true for many other technologies. It opens the door to things that you couldn’t do before, but it doesn’t mean that we’re doing away with scientists. It just lets scientists do their work more efficiently.”
Looking forward, all three scientists agreed that the future of science will have researchers using AI agents to enhance their work in many aspects. Not just one AI like the ship computer in Star Trek, but many specialized agents taking care of time-consuming, complex tasks. They are a new tool in the researchers’ toolbox—just like screwdrivers, test tubes, and computers—improving researchers’ ability to do science.
This work was supported by the DOE Office of Science.
Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov