
Latest ITER Newsline

  • A world in itself

    From a height of some 50 metres, you have the entire ITER worksite at your feet. The long rectangle of the Diagnostics Building stands out in the centre, with [...]


  • US completes toroidal field deliveries for ITER

    The US Domestic Agency achieved a major milestone in February by completing the delivery of all US-supplied toroidal field conductor to the European toroidal fi [...]


  • Thin diagnostic coils to be fitted into giant magnets

    Last week was marked by the first delivery of diagnostic components—Continuous External Rogowski (CER) coils—from the European Domestic Agency to the ITER Organ [...]


  • Addressing the challenge of plasma disruptions

    Plasma disruptions are fast events in tokamak plasmas that lead to the complete loss of the thermal and magnetic energy stored in the plasma. The plasma control [...]


  • Blending (almost) seamlessly into the landscape

    Located in the foothills of the French Pre-Alps, the ITER installation blends almost seamlessly into the landscape. The architects' choice of mirror-like steel c [...]


Of Interest


How to handle the Petabytes

-Sabina Griffith

When it was announced in 1985 that the American "Cray-2" supercomputer had achieved a capacity of one Gigaflops, even some scientists had to consult the dictionary. The prefix Giga is derived from the Greek word for giant and stands for one billion. A one-Gigaflops computer can perform one billion floating-point operations (Flop) per second.

In 1985, this was a thousandfold the capacity achievable with your home computer. Today, every mobile phone contains a Gigaflops processor. And while the "big bang" hunters at CERN are dealing with Petaflops (10^15 calculations per second), the new kid on the large science block, the Square Kilometre Array (SKA) which will be built in South Africa and Australia, will require supercomputers that can digest data on the Exa scale: a 1 followed by 18 zeros (10^18).

The steep increase in computing power known as Moore's Law is comparable to the performance of magnetic fusion devices ... and to their generation of data. Since the first plasma pulse on JET in 1983, the raw data collected during each discharge has roughly doubled every two years. Today, about 10 Gigabytes of data are collected per 40-second pulse; the data collected over 70,000 JET pulses amounts to roughly 35 Terabytes.
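A quick back-of-the-envelope calculation shows what "doubling every two years" implies. The numbers below are illustrative assumptions (an article date of roughly 2012 and the ~10 GB per pulse quoted above), not figures from JET:

```python
# Sketch: per-pulse data size doubling every two years since 1983,
# reaching roughly 10 GB per pulse by ~2012 (assumed article date).
doublings = (2012 - 1983) / 2        # "roughly doubled every two years"
growth = 2 ** doublings              # ~23,000-fold increase
first_pulse_mb = 10_000 / growth     # 10 GB today -> well under 1 MB in 1983
print(round(first_pulse_mb, 2))      # a fraction of a Megabyte per pulse
```

The result, under half a Megabyte per early pulse, illustrates why most of the 35 Terabytes accumulated at JET comes from the most recent campaigns.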

When ITER starts operation, the data generated will again reach new dimensions. Each plasma discharge, lasting 300 to 3,000 seconds, will generate an estimated tens of Gigabytes per second, leading to a total of a few hundred Petabytes per year. And it is not only the storage and archiving of this huge amount of data that poses a challenge, but also its accessibility in real time.
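These two estimates are consistent with each other, as a rough calculation shows. The per-pulse rate, pulse length, and annual pulse count below are illustrative assumptions chosen within the ranges the article gives, not ITER specifications:

```python
GB, PB = 1e9, 1e15

# Illustrative assumptions: 30 GB/s sustained during a 500-second pulse,
# and ~20,000 pulses per operating year.
per_pulse_bytes = 30 * GB * 500            # 15 Terabytes per discharge
yearly_pb = per_pulse_bytes * 20_000 / PB
print(yearly_pb)                           # 300.0 -> "a few hundred PB/year"
```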

In a recent workshop organized by Lana Abadie, who is responsible for the scientific archiving system within the CODAC team, the challenge of storing and accessing this flood of scientific data was addressed by experts from many different institutes and backgrounds.

"We need to store this data almost real-time to allow physicists to start their analysis code in order to allow calculations for the next pulses," explains Lana. "This data is what we call raw data, i.e., data coming from the ITER machine unfiltered. The main producers will be the various diagnostics systems. Then we need to store processed and simulated data. Different physics applications will use raw data and process them. This output needs to be stored too—and made accessible."

In other words, raw, processed, and simulated data will be accessed in the same way. But accessing the data in an efficient way is not an easy task. "Imagine you have a pile of 20,000,000 iPods of 16 GB—equivalent to the yearly production of all types of ITER data. Let's say you are looking for a song that was produced last February, but you don't even know the exact title. You remember that it was something like 'I follow' and that it was a remix of an earlier song by the same artist. Of course, you could spend quite a few hours finding the song. The challenge for CODAC is to provide data access within a few seconds. It is very important to understand the different archiving techniques and to stay abreast of upcoming technologies in that area."
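The way out of the "pile of iPods" problem is a searchable metadata catalogue kept separate from the bulk data. The sketch below is a toy illustration of that idea, not the CODAC design; all table and column names, paths, and dates are invented for the example:

```python
import sqlite3

# Toy catalogue: index pulse/diagnostic metadata so a query like the
# "song from last February" search returns in milliseconds, not hours.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE datasets (
    pulse INTEGER, diagnostic TEXT, recorded TEXT, path TEXT)""")
db.executemany("INSERT INTO datasets VALUES (?, ?, ?, ?)", [
    (1, "rogowski",  "2012-02-03", "/archive/p1/rogowski.h5"),
    (2, "bolometer", "2012-02-10", "/archive/p2/bolometer.h5"),
    (3, "rogowski",  "2012-03-01", "/archive/p3/rogowski.h5"),
])
db.execute("CREATE INDEX idx ON datasets (diagnostic, recorded)")

# Fuzzy query: "something like 'rogo...', recorded last February".
rows = db.execute("""SELECT path FROM datasets
    WHERE diagnostic LIKE 'rogo%' AND recorded LIKE '2012-02%'""").fetchall()
print(rows)   # only the matching February dataset is returned
```

Only the small catalogue is scanned; the Petabytes of bulk data are touched only once the matching file paths are known.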
 
The CODAC archiving system has to be ready for First Plasma with well-proven scalability. The data will be stored first in the CODAC server room and will then be streamed to the IT computing centre. CODAC will develop a first prototype within the next two years. The team is currently studying a system based on HDF5, a well-known scientific data format used by many institutions such as NASA. HDF5 allows the storage of all types of data and corresponding metadata.
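HDF5's ability to keep data and metadata together can be sketched with the `h5py` library. The file layout, group names, and diagnostic signal below are invented for illustration and do not represent the actual CODAC schema:

```python
import numpy as np
import h5py

# Hypothetical layout: one file per pulse, one group per diagnostic,
# with descriptive metadata stored as HDF5 attributes next to the data.
with h5py.File("pulse_000001.h5", "w") as f:
    f.attrs["pulse_id"] = 1
    grp = f.create_group("diagnostics/magnetics")
    sig = grp.create_dataset(
        "rogowski_current",
        data=np.random.default_rng(0).normal(size=4000),
        compression="gzip")                 # transparent compression
    sig.attrs["units"] = "A"
    sig.attrs["sampling_rate_hz"] = 10_000

# A reader can locate the signal by path and recover its metadata.
with h5py.File("pulse_000001.h5", "r") as f:
    d = f["diagnostics/magnetics/rogowski_current"]
    print(d.shape, d.attrs["units"])
```

Because attributes travel inside the file, an analysis code picking up a dataset years later still knows its units and sampling rate without consulting an external source.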
