AI ignites innovation in fusion
As organizations around the world race to harness the power of nuclear fusion, advancements in computer technology may help them over the finish line.
Computational tools like artificial intelligence (AI), augmented and virtual reality (AR/VR), and digital twins are already showing their immense potential in reshaping how scientists and engineers tackle the four challenges common to nuclear fusion projects: finding materials that can resist high neutron flux, achieving steady-state operation*, managing extreme heat flux, and optimizing the fusion fuel cycle.
No longer merely hypothetical, these tools are being actively explored by both the private sector and public research programs—each sparking new insights that accelerate progress.
Speeding development in the private sector
At ITER’s 2025 Private Sector Fusion Workshop in April, one message came through loud and clear: new computing technologies may be a game changer for fusion research and development. These advances are poised for use cases across the board—from materials science and plasma control to software engineering and hardware design. They are even helping with mundane office tasks and IT support.
"AI is fundamentally changing our world in ways which are unimaginable," said Kenji Takeda, Director of Research Incubations at Microsoft Research during his talk. Takeda highlighted how generative AI models, like Microsoft's MatterGen, help discover new materials in other scientific domains by generating molecular structures that meet requirements specified through prompts, much like ChatGPT can be asked to write a poem. Then AI emulators, such as Microsoft’s MatterSim, can be used to simulate how those materials behave under different test conditions thousands of times faster than existing computational tools.
This approach could save years of experimental trial and error, and Microsoft demonstrated its usefulness by designing new materials with a bulk modulus exceeding 400 gigapascals—beyond the reach of most traditional screening methods. "This is how we begin to speak the language of nature," said Takeda. "From electrons to cells to the entire planet—AI can help us model and understand these complex systems."
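As a rough illustration of that generate-then-screen workflow, here is a minimal Python sketch. Both model functions are entirely hypothetical stand-ins (the real MatterGen and MatterSim interfaces are far richer); only the pattern—propose many candidates cheaply, then keep those predicted to clear the 400-gigapascal bar—is the point:

```python
import random

# Hypothetical stand-ins for a generative materials model and a fast AI
# emulator; MatterGen and MatterSim expose different, far richer interfaces.
def generate_candidates(n):
    """Pretend generative step: propose n candidate materials."""
    return [{"id": i, "composition": f"X{i}"} for i in range(n)]

def predict_bulk_modulus(material):
    """Pretend emulator: a fast surrogate standing in for full simulation."""
    rng = random.Random(material["id"])   # deterministic toy prediction
    return rng.uniform(100.0, 500.0)      # bulk modulus in gigapascals

# Generate candidates, then keep only those the emulator predicts to
# exceed the 400 GPa bulk-modulus target.
candidates = generate_candidates(1000)
hits = [m for m in candidates if predict_bulk_modulus(m) > 400.0]
print(f"{len(hits)} of {len(candidates)} candidates pass the 400 GPa screen")
```

Because the emulator is thousands of times faster than conventional simulation, this screening loop can run over far larger candidate pools than traditional methods allow.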
Microsoft’s contribution underscores a broader trend: industry players are applying AI not just to theoretical research but to real engineering workflows. Moreover, many are finding that techniques proven in one domain transfer readily to others.
At the ITER workshop, Thea Energy’s CTO David Gates described how digital twins and parametric CAD simplify the design of fusion reactors by generating vessel structures from plasma geometries with precise tolerances. Gates emphasized the value of machine learning—particularly physics-informed models that use real-world laws to predict behaviour and guide system control. "It’s about using actuators and sensors in harmony to stabilize plasma in real time," he said.
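The physics-informed idea Gates described can be sketched in a few lines: alongside the usual data-fitting term, the training loss penalizes any violation of a known governing equation. The toy problem below (exponential decay, dy/dt = −k·y) is an assumed example for illustration, not Thea Energy's models:

```python
import math

# Toy physics-informed loss: the model is penalized both for mismatching
# observations and for violating a known law -- here dy/dt = -k*y.
k, dt = 0.5, 0.1
ts = [i * dt for i in range(41)]
data = [math.exp(-k * t) for t in ts]  # "measurements" obeying the law

def loss(a):
    y = [math.exp(-a * t) for t in ts]  # candidate model y(t) = exp(-a*t)
    # Data term: squared mismatch with the observations.
    data_term = sum((yi - di) ** 2 for yi, di in zip(y, data)) / len(ts)
    # Physics term: residual of dy/dt = -k*y, via central differences.
    resid = [(y[i + 1] - y[i - 1]) / (2 * dt) + k * y[i]
             for i in range(1, len(ts) - 1)]
    physics_term = sum(r * r for r in resid) / len(resid)
    return data_term + physics_term

# A simple grid search recovers the true decay rate k = 0.5, because only
# that value makes both the data term and the physics term small at once.
best_a = min((0.1 + 0.01 * i for i in range(91)), key=loss)
print(f"recovered decay rate: {best_a:.2f}")
```

Real physics-informed models apply the same principle to far richer equations, which is what lets them extrapolate sensibly when sensor data alone is sparse or noisy.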
Complementing these efforts is the emergence of AI-native platforms that combine multiple modalities of reasoning. Arena, a New York-based AI startup, has developed Atlas, an AI-native platform that combines physics-based reasoning with large language models. Already used to streamline engineering processes at AMD, Atlas helps design, debug, and optimize complex systems.
Mike Frei, General Manager and Hardware Product Lead at Arena, made the case that the same platform can speed up development in fusion projects. Two things, he said, help AI models power faster hardware innovation. The first is complex, human-like reasoning: models like those that power Atlas follow step-by-step reasoning processes to solve intricate problems. “Some of them do quite well on PhD level questions in subjects like math and physics, and some are already expert coders that can be compared with the best in the world,” Frei noted.
The second ingredient is multimodality. “A lot of context needs to be captured that isn't covered in just a simulation and isn’t simple electrical engineering,” said Frei. Atlas combines sensor data, schematics, and software instructions into a single AI interface. It integrates with lab instruments, generates firmware, and learns from every interaction to assist with future tasks.
Streamlining work in the public sector
The impact of AI is not limited to startups and private-sector experiments. At ITER, the world’s largest fusion project, these technologies are being applied across domains—from administrative to operational, and from engineering to scientific.
ITER recently digitized its extensive knowledge base—over a million documents—using a fine-tuned OpenAI model. Engineers now access this wealth of expertise via a chatbot trained on internal content, giving them a more natural and intuitive way to query the knowledge base.
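A retrieval step of the kind that commonly sits behind such document chatbots can be sketched as follows. ITER's actual system is built on a fine-tuned OpenAI model, so the documents and the simple bag-of-words scoring here are purely illustrative:

```python
import math
from collections import Counter

# Illustrative document store; real systems index millions of documents
# with learned embeddings rather than word counts.
documents = {
    "doc-001": "tokamak vacuum vessel welding procedure and schedule",
    "doc-002": "cryostat thermal shield installation schedule",
    "doc-003": "weld inspection acceptance criteria for the vacuum vessel",
}

def vectorize(text):
    """Bag-of-words term counts as a stand-in for an embedding."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_match(query):
    """Return the id of the document most similar to the query."""
    q = vectorize(query)
    return max(documents, key=lambda d: cosine(q, vectorize(documents[d])))

print(top_match("weld inspection criteria"))
```

The retrieved passages are then handed to the language model as context, which is what lets a chatbot answer in natural language while staying grounded in internal documentation.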
Microsoft's Copilot and GitHub Copilot have already been deployed at ITER, accelerating software development and troubleshooting. Computer vision has been used to inspect welds on ITER’s massive tokamak vessel, helping to verify structural integrity. Even the IT help desk has been overhauled with AI to resolve support tickets, allowing engineers to focus on more impactful work.
The effects of AI are propagating like a chain reaction across the surrounding ecosystem. Just down the road in Cadarache, the CEA WEST fusion research centre has also adopted AI-powered strategies. According to Xavier Litaudon, research director at CEA, researchers at CEA WEST have trained AI systems to automatically detect hot spots on infrared images of the first wall during operation, reducing the risk of damage to the plasma-facing components and helping to achieve longer plasma pulses in safer, more stable conditions.
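Conceptually, the hot-spot flagging step reduces to marking pixels in an infrared frame whose temperature exceeds a safety limit. The sketch below uses plain thresholding with an assumed limit and made-up temperatures; CEA's deployed system relies on trained AI models rather than a fixed threshold:

```python
# Minimal sketch of hot-spot flagging on an infrared frame. The 900 C
# limit and the temperatures below are assumed values for illustration.
TEMP_LIMIT_C = 900.0

frame = [  # toy 4x5 IR frame: first-wall surface temperatures in Celsius
    [450.0, 460.0, 455.0, 470.0, 452.0],
    [448.0, 910.0, 925.0, 468.0, 455.0],
    [451.0, 905.0, 462.0, 466.0, 458.0],
    [449.0, 457.0, 460.0, 463.0, 454.0],
]

def hot_spots(frame, limit):
    """Return the (row, col) coordinates of pixels exceeding the limit."""
    return [(r, c)
            for r, row in enumerate(frame)
            for c, temp in enumerate(row)
            if temp > limit]

spots = hot_spots(frame, TEMP_LIMIT_C)
print(f"{len(spots)} hot pixels flagged: {spots}")
```

In operation, a flag like this would feed back into plasma control fast enough to protect the plasma-facing components before damage occurs.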
“We applied this same approach to ITER simulations,” he said during his presentation about his organization’s recent series of groundbreaking fusion experiments, culminating in a 22-minute-long pulse. AI improves operational safety, reproducibility, reliability and performance by learning from historical pulse behaviour.
AI will also play an important role once experimental data is produced by ITER plasmas. According to Alberto Loarte, head of the ITER Science Division, AI will be used to run consistency checks on acquired data—and, as the data is produced, it can provide a higher-fidelity interpretation of the measurements than traditional schemes can achieve. As the database of information from real experiments at ITER grows, AI will be used to explore different ways of improving plasma performance.
“But we do not need to wait until ITER produces experimental results,” says Loarte. “AI is being used today to generate fast and accurate models of physics processes that allow a high-fidelity description of ITER plasma behaviour with a quick turnaround time. These AI-based models can be used to develop a full digital twin of ITER both for the engineering systems as well as for the plasma scenarios and the associated physics processes, which will be used to simulate ITER operation as a complete and realistic system.”
ITER recently developed a live digital twin of its plant, integrating drone imagery, 3D scans, and engineering documentation. This immersive model is accessible via tablets and VR headsets, allowing engineers to compare real-time construction with design and flag deviations instantly.
Taken together, these examples illustrate how computing tools—whether machine learning, generative AI, or immersive modelling—are streamlining work at every level. The consensus is clear: computing tools are making fusion research more agile, collaborative, and precise. Fusion may still be a long way from lighting homes, but with AI and other digital technologies, it might get there faster.
*Steady-state operation: producing fusion power for an unlimited amount of time with high fusion gain.