Remote handling is never easy, but in the ITER machine it will be particularly difficult. Narrow entry ports, space constraints, poor visual contrast between the different components, limited options for camera placement ... all will combine in ITER to create an exceptionally demanding environment.
"In remote handling, lighting and viewing are vital ingredients," explains David Hamilton, the engineer in charge of the remote handling control systems at ITER. "And yet during ITER machine maintenance, camera placement will be very limited and visual obstacles will be everywhere. As for lighting, we will have to bring in our own sources, which will also be quite limiting."
Although ITER will not be the first tokamak to rely on remote handling, the machine's characteristics generate some unique challenges. "In JET for instance, a 300-kilo antenna is considered an exceptional load to handle. In ITER, we go up to 40 tonnes ..."
3D models and virtual reality can help solve some of the difficulties—both are quite useful for having an overview of the environment. "But there's always an error margin," says David. "You can't trust them for the last 20 or 50 millimetres because, after a certain period of operation, the machine's components will have moved and shifted slightly from the position recorded in the model."
A new technique called "synthetic viewing," although not yet fully mature, looks like a promising alternative. "Synthetic viewing is based on the combination of data from the model with data acquired and updated in real time by sensors, such as cameras. It allows you to generate your own version of the view from an optimal angle and with optimal contrast and lighting. Based on the data stored in the model, you can 'see through' the obstacles that are blocking the camera."
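The core idea can be sketched in a few lines: start from the as-designed pose of a component in the stored model, then correct it with a live sensor measurement so the rendered view matches the drifted as-built position. The following Python sketch is purely illustrative (the component name, coordinates, and weighting scheme are invented for the example, not ITER's actual system); it uses a simple weighted blend to stand in for the real fusion step.

```python
# Illustrative sketch only, not ITER's system: fuse a stored model pose
# with a live sensor measurement, closing the 20-50 mm gap between the
# model and the as-built position mentioned above.
from dataclasses import dataclass

@dataclass
class ComponentPose:
    name: str
    x: float  # millimetres, in machine coordinates
    y: float
    z: float

def fuse_pose(model: ComponentPose, measured: ComponentPose,
              sensor_weight: float = 0.8) -> ComponentPose:
    """Blend the model pose with a sensor measurement.

    A high sensor_weight trusts the real-time measurement more,
    pulling the synthetic view toward where the component really is.
    """
    blend = lambda m, s: (1 - sensor_weight) * m + sensor_weight * s
    return ComponentPose(model.name,
                         blend(model.x, measured.x),
                         blend(model.y, measured.y),
                         blend(model.z, measured.z))

model = ComponentPose("antenna_flange", 1000.0, 2000.0, 500.0)
measured = ComponentPose("antenna_flange", 1032.0, 1998.0, 505.0)  # drifted ~30 mm
print(fuse_pose(model, measured))
```

In a real system the blend would be replaced by proper sensor fusion (e.g. a Kalman filter over full 6-DOF poses), but the principle is the same: the model supplies the complete scene, and the sensors keep it honest.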
In Holland, ITER NL—a consortium of Dutch laboratories and industry established in 2007 to promote participation in ITER—had done some exploration in this direction. In August 2012, the consortium was commissioned by the ITER Organization to assess the feasibility of synthetic viewing and to develop a system prototype. In February, this prototype was successfully demonstrated at the Petten nuclear research centre (see video).
"Synthetic viewing is still a speculative technology," warns Hamilton. "For the moment, ITER is an end-consumer with an interest in the area... However, it is important to stimulate research because we will need such a system in ten years' time."
The largest obstacle standing in the way of a fully efficient synthetic viewing system is computing power. "The problem is data calculation. A delay of one-tenth of a second between what you capture on the viewing system and what is actually happening with the remote-handling device is the maximum tolerable. For the moment, computer systems are too slow to perform object recognition and accurate localization within that time. But I'm confident these capabilities will evolve along with the calculation algorithms..."
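The constraint Hamilton describes is an end-to-end latency budget. The sketch below makes it concrete: the 100 ms ceiling comes from the article, but the pipeline stages and their timings are hypothetical numbers chosen only to illustrate why object recognition is the bottleneck.

```python
# Hypothetical latency budget for a synthetic viewing pipeline.
# Stage names and timings are invented for illustration; the only
# figure taken from the article is the 100 ms end-to-end ceiling.
BUDGET_MS = 100  # one-tenth of a second, the maximum tolerable delay

stage_latencies_ms = {
    "image_capture": 15,
    "object_recognition": 60,   # the dominant cost with current systems
    "localization": 20,
    "render_synthetic_view": 10,
}

total = sum(stage_latencies_ms.values())
print(f"end-to-end latency: {total} ms "
      f"({'within' if total <= BUDGET_MS else 'over'} the {BUDGET_MS} ms budget)")
```

With these example figures the pipeline lands at 105 ms, just over budget, which mirrors the article's point: shaving the recognition and localization stages, through better algorithms as much as faster hardware, is what would bring the system under the ceiling.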
Synthetic viewing systems of the future could also support more radiation-tolerant, lower-cost acquisition devices, such as ultrasonic sensors.
Now that a prototype and an impressive self-explanatory video have been produced, the next step for Hamilton and the remote handling specialists at ITER is to "stir up interest in synthetic viewing" amongst the ITER European and Japanese Domestic Agencies who will procure the machine's remote-handling systems and components. "I'd like to think," says Hamilton, "that by acting together we'll be able to fund more research. There's a huge market there ..."
For ITER, the stakes are considerable. A swift, precise and reliable remote-handling system will reduce the length of the machine shut-down phases and largely contribute to optimizing overall operation costs.
Watch the ITER NL video on synthetic viewing here.