ITER Scientific Data Centre

Partnerships help prepare for operation

ITER has been nurturing partnerships with big tech companies and the larger scientific community to help store, distribute and analyze all the data that will be produced from experiments.

ITER's future Scientific Data and Computing Centre will store, secure, process and distribute the vast amounts of scientific data produced during experiments. (This is a stock image used for illustrative purposes.) To meet the target of 99.99% availability—including to Members offsite—the ITER Organization will rely on outside partners for some of its storage and data accessibility needs.
"In the end, what the Member states are paying for is the data," says Peter Kroul, Computing Center Officer. The job at the ITER Scientific Data and Computer Centre (SDCC) is to deliver on that commitment—storing, securing, processing and distributing the vast amounts of scientific data produced throughout the lifetime of the project. But the Data Centre will not do it alone; it will be assisted by partners with deep experience in managing and sharing very high volumes of information.

ITER's data management challenges are comparable to those of CERN, synchrotrons, telescopes and other large scientific installations. One requirement is that at least one copy of all data generated during experiments will be stored onsite. And because researchers will need to analyze data across different experimental phases, it must be possible to quickly compare the results of the latest pulses with earlier ones. That means very fast access, at any time, to any scientific data ever produced over the lifetime of the project.

"We have worked with IBM and partner B4Restore since 2020 to run a proof of concept of the long-term data storage and high-performance storage," says Kroul. "We get access to their technological road maps to foresee, for instance, how storage technology is advancing, so we can better forecast the systems and space needed for ITER's Scientific Data and Computing Centre. As an important partner, we have also benefitted from testing some of their latest systems before they are released to the market."

"Knowing where IBM and other companies will be in the next few years helps us predict how we'll be able to squeeze a growing amount of data into our limited facility. A discipline like capacity management at some point will become very essential to the daily operation of the Data Centre. We need to get used to frequently adding capacity, removing outdated storage systems and replacing them with the latest technology—and we have to manage this while in operation and without downtime."

The current ITER high-performance computer, with more than 300 physical servers and 9,000 compute cores.


Offsite storage and distribution

ITER's Scientific Data and Computing Centre must guarantee 99.99% availability, which means downtime must be under one hour per year. To support this stringent requirement, at least one extra copy of all data will be stored offsite at a fast-retrieval distribution centre to ensure that each Member state gets immediate access to the data they request. That infrastructure is being constructed in a data centre in Marseille and is expected to be fully operational by mid-2024. Two geographically separated fibre-optic links will connect the distribution centre to the ITER site, with one set of cables serving as a hot standby. Another redundant pair of cabling systems will connect it to the research network backbone funded by the European Union.
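
As a quick sanity check on that figure (a back-of-the-envelope calculation, not an official ITER specification), the downtime budget implied by a 99.99% availability target can be worked out directly:

```python
# Downtime allowed by a 99.99% ("four nines") availability target.
minutes_per_year = 365.25 * 24 * 60            # ~525,960 minutes in a year
allowed_downtime_min = (1 - 0.9999) * minutes_per_year

print(f"Allowed downtime: {allowed_downtime_min:.1f} minutes per year")
# -> ~52.6 minutes per year, i.e. just under the one hour cited above
```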

"We have coordination meetings with other organizations from the ITER Members because we're using the same research networks that constitute the backbone of the scientific internet," says David Fernandez, Leader of the IT System & Operation Section.

The distribution centre will be a hub not only for all continental and intercontinental data traffic, but also for the cloud providers that will host some applications and possibly provide extra computational power as needed.

"A year ago, we finalized the first test of integrating our on-site computing clusters with both Google Cloud and with Microsoft Azure," says Kroul. "And that was a successful test, meaning we managed to seamlessly integrate our on-site facility directly into these cloud operators so that we could offload some of the computational jobs to services off site—and do so in a manner that is transparent to the scientists. We did this with both Google and Microsoft, and it was very impressive. The speed was almost the same as if the service were on site—and sometimes faster—even though we had to send the job to Google or Microsoft in the cloud, spin up the resources and then get the call back. With Google we ran several important large computations using over 5,000 cloud-based cores, which saved us months of onsite resources and work."

While the cloud comes at an incremental cost, it is convenient and easy to use on an as-needed basis to provide hybrid burst capacity for onsite computation jobs. If the load is too high and researchers don't have time to wait for high-performance computing resources to become available on site, the job can be offloaded to the cloud.
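
The decision logic behind that kind of hybrid bursting can be illustrated with a short sketch. Everything below is hypothetical: the thresholds, the Job fields and the dispatch rule are illustrative stand-ins, not ITER's actual scheduler or cloud integration.

```python
# Illustrative sketch of "burst to the cloud only when waiting onsite is not acceptable".
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    cores: int
    max_wait_hours: float   # how long the researcher is willing to queue

def dispatch(job: Job, free_local_cores: int, est_queue_hours: float) -> str:
    """Run onsite when capacity is available soon enough; otherwise burst to the cloud."""
    if job.cores <= free_local_cores:
        return f"submit {job.name} to the onsite HPC cluster"
    if est_queue_hours <= job.max_wait_hours:
        return f"queue {job.name} onsite (~{est_queue_hours:.0f} h wait)"
    # Cloud capacity costs extra, so it is used only when the wait is too long.
    return f"burst {job.name} to a cloud provider ({job.cores} cores)"

print(dispatch(Job("plasma-analysis", cores=5000, max_wait_hours=2.0),
               free_local_cores=1200, est_queue_hours=18.0))
```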

Outside of ITER Headquarters, cooling and electrical equipment has been installed for the ITER Scientific Data and Computing Centre, including this extra fuel tank for extended generator operation.


Quick retrieval and deep analysis

A data rate of at least 50 gigabytes per second is expected during full deuterium-tritium operation at ITER. That rate may grow even higher: as sensors and cameras become more advanced, they will generate much more data than was predicted in the initial phases of the project. On the retrieval side, the data rate must be at least as high as the rate at which data is stored.
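
To put that rate in perspective, a rough calculation shows the volumes involved. The 400-second pulse length below is an assumption used for illustration; only the 50 GB/s figure comes from the article.

```python
# Rough scale of the storage pipeline implied by a sustained 50 GB/s write rate.
write_rate_gb_s = 50
pulse_length_s = 400                          # assumed long-pulse duration (illustrative)

per_pulse_tb = write_rate_gb_s * pulse_length_s / 1000
per_hour_tb = write_rate_gb_s * 3600 / 1000

print(f"One {pulse_length_s}-second pulse: ~{per_pulse_tb:.0f} TB")   # ~20 TB
print(f"One hour of sustained writes: ~{per_hour_tb:.0f} TB")         # ~180 TB
# Retrieval must keep pace: reading data back for analysis needs at least the same 50 GB/s.
```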

"When we get the connectivity to Marseille, we can start performing data challenges," says Fernandez. "These will be tests to demonstrate the feasibility of data replication to the offsite data centre within the timeframe requirements. Similarly, when we have connectivity into the international research networks at a high speed, transatlantic data challenges will also be attempted. These tests will be run with several partners. As of today, this includes ESnet [the Energy Sciences Network] and the United States Domestic Agency US ITER."

Depending on the queries scientists want to make, it might be necessary to retrieve data from different sources. To enable that kind of operation, the right software needs to be deployed and the data needs to be appropriately structured so that, for example, a query does not require opening a thousand different files simultaneously. The infrastructure has to perform well enough to support these dispersed retrievals without creating bottlenecks.
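
One simple way to picture the file-count problem is to bundle every signal recorded during a pulse into a single container, so that comparing a signal across pulses opens one file per pulse rather than one file per signal. The sketch below uses NumPy's .npz format purely for illustration; it is not ITER's actual data model or file layout.

```python
# Illustrative sketch: one container file per pulse instead of one file per signal.
import numpy as np

def write_pulse(pulse_id: int, signals: dict[str, np.ndarray]) -> str:
    path = f"pulse_{pulse_id:06d}.npz"
    np.savez(path, **signals)                 # one file holds every signal for the pulse
    return path

def compare_signal(signal: str, pulse_ids: list[int]) -> dict[int, np.ndarray]:
    # One file open per pulse, regardless of how many signals each pulse recorded.
    return {p: np.load(f"pulse_{p:06d}.npz")[signal] for p in pulse_ids}

# Hypothetical signal names, for demonstration only.
for p in (1, 2, 3):
    write_pulse(p, {"ne_core": np.random.rand(1000),
                    "te_core": np.random.rand(1000)})

latest_vs_earlier = compare_signal("ne_core", [1, 2, 3])
```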

Finally, ITER is keeping an eye on how artificial intelligence (AI) can be used for data analysis. AI is still relatively new and the need for intensive analysis is a few years off, so no commitments have been made yet. However, the group in charge of ITER's Scientific Data and Computing Centre has already begun discussions with big tech companies to see how AI software and hardware might be used.

"To give you an example, we have been talking with Google and NVIDIA about how AI and Machine Learning could help us manage and analyze data," says Kroul. "It looks very promising."


