How to handle the Petabytes

Sabina Griffith

When it was announced in 1985 that the American "Cray-2" supercomputer had achieved a capacity of one Gigaflop, even some scientists had to consult the dictionary. The prefix Giga is derived from the Greek gigas, meaning giant, and stands for one billion. A one-Gigaflop computer can perform one billion floating-point operations (Flops) per second.

In 1985, this was a thousandfold the capacity of a home computer. Today, every mobile phone contains a Gigaflop processor. And while the "big bang" hunters at CERN are dealing with Petaflops (10^15 calculations per second), the new kid on the large-science block, the Square Kilometer Array (SKA) to be built in South Africa and Australia, will require supercomputers that can digest data on the Exa scale: a 1 followed by 18 zeros.

The steep increase in computing capacity known as Moore's Law is mirrored by the performance of magnetic fusion devices ... and by the data they generate. Since the first plasma pulse on JET in 1983, the raw data collected during each discharge has roughly doubled every two years. Today, about 10 Gigabytes of data are collected during each 40-second pulse; the data collected over 70,000 JET pulses amounts to roughly 35 Terabytes.
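The doubling described above can be written as a simple exponential growth model. The 1983 starting value below is an illustrative assumption, not a published JET figure:

```python
def per_pulse_gb(year: int, start_year: int = 1983, start_gb: float = 0.001) -> float:
    """Raw data collected per JET pulse, assuming it doubles every two years.

    start_gb (about 1 MB in 1983) is an illustrative assumption.
    """
    return start_gb * 2 ** ((year - start_year) / 2)

# Doubling every two years compounds quickly: over 30 years the
# per-pulse volume grows by a factor of 2**15 = 32768.
ratio = per_pulse_gb(2013) / per_pulse_gb(1983)
```

Whatever the starting value, the ratio between any two years depends only on the doubling period, which is why such growth curves are usually compared on a logarithmic scale.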

When ITER starts operation, the data generated will again reach new dimensions. Each plasma discharge, lasting 300 to 3,000 seconds, will generate an estimated tens of Gigabytes per second, adding up to a few hundred Petabytes per year. And it is not only the storage and archiving of this huge amount of data that poses a challenge, but also its accessibility in real time.
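A back-of-the-envelope check shows how such rates add up to the quoted yearly total. The sustained rate, pulse length, and annual pulse count below are assumptions chosen for illustration:

```python
def yearly_volume_pb(gb_per_second: float, pulse_seconds: float,
                     pulses_per_year: int) -> float:
    """Estimate the yearly raw-data volume in Petabytes (decimal units)."""
    total_gb = gb_per_second * pulse_seconds * pulses_per_year
    return total_gb / 1_000_000  # 1 PB = 1,000,000 GB

# 50 GB/s sustained over 1000-second pulses, ~5000 pulses a year:
print(yearly_volume_pb(50, 1000, 5000))  # 250.0 Petabytes
```

With plausible values in the ranges the article gives, the result indeed lands in the few-hundred-Petabyte range.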

In a recent workshop organized by Lana Abadie, responsible for the scientific archiving system within the CODAC team, the challenge of storing and accessing the flood of scientific data was addressed by experts from many different institutes and backgrounds.

"We need to store this data in almost real-time to allow physicists to start their analysis codes in time for calculations for the next pulses," explains Lana. "This data is what we call raw data, i.e., data coming from the ITER machine unfiltered. The main producers will be the various diagnostic systems. Then we need to store processed and simulated data. Different physics applications will use raw data and process it. This output needs to be stored too, and made accessible."

In other words, raw, processed, and simulated data will be accessed in the same way. But accessing the data efficiently is not an easy task. "Imagine you have a pile of 20,000,000 16 GB iPods, equivalent to the yearly production of all types of ITER data. Let's say you are looking for a song that was produced last February, but you don't even know the exact title. You remember that it was something like 'I follow' and that it was a remix of an earlier song by the same artist. Of course, you could spend quite a few hours finding the song. The challenge for CODAC is to provide data access within a few seconds. It is very important to understand the different archiving techniques and to stay abreast of upcoming technologies in that area."
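The iPod analogy comes down to indexing: locating data by its metadata in seconds rather than scanning every archive. A toy in-memory inverted index makes the idea concrete; the field names are invented for illustration and do not reflect the actual CODAC metadata schema:

```python
from collections import defaultdict


class PulseIndex:
    """Toy inverted index: maps (field, value) pairs to matching pulse ids."""

    def __init__(self):
        self._by_tag = defaultdict(set)

    def add(self, pulse_id, **metadata):
        for key, value in metadata.items():
            self._by_tag[(key, value)].add(pulse_id)

    def find(self, **criteria):
        """Return pulse ids matching all key=value criteria (AND semantics)."""
        sets = [self._by_tag[item] for item in criteria.items()]
        return set.intersection(*sets) if sets else set()


idx = PulseIndex()
idx.add(101, diagnostic="bolometer", month="2018-02")
idx.add(102, diagnostic="bolometer", month="2018-03")
print(idx.find(diagnostic="bolometer", month="2018-02"))  # {101}
```

Real archiving systems push the same idea into databases and file-format metadata, so that a lookup touches only the index, never the bulk data.
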

The CODAC archiving system has to be ready for First Plasma with well-proven scalability. The data will be stored first in the CODAC server room and then streamed to the IT computing centre. CODAC will develop a first prototype within the next two years. The team is currently studying a system based on HDF5, a well-known scientific data format used by many institutions, including NASA. HDF5 allows the storage of all types of data together with their corresponding metadata.
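The appeal of HDF5 can be sketched with the third-party h5py package: datasets hold the signals, while attributes carry the metadata alongside them in the same file. The group layout, attribute names, and diagnostic name below are assumptions for illustration, not the actual CODAC schema:

```python
import numpy as np
import h5py

# Write one diagnostic signal plus its metadata into a single HDF5 file.
with h5py.File("pulse_000123.h5", "w") as f:
    grp = f.create_group("diagnostics/bolometer")   # hypothetical layout
    signal = np.random.random(1000)                 # stand-in for a raw signal
    ds = grp.create_dataset("signal", data=signal, compression="gzip")
    ds.attrs["units"] = "W/m^2"                     # metadata lives next to data
    ds.attrs["sample_rate_hz"] = 10000
    f.attrs["pulse_number"] = 123

# Read it back: the hierarchy is addressed like a file-system path.
with h5py.File("pulse_000123.h5", "r") as f:
    sig = f["diagnostics/bolometer/signal"][:]
    pulse = int(f.attrs["pulse_number"])
```

Because metadata travels inside the file, a reader can discover units, sampling rates, and pulse numbers without consulting a separate catalogue, which is one reason the format is popular for long-lived scientific archives.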
