From plain visualisation to vibration sensing: using a camera to control the flexibilities in the ITER remote handling equipment
Research output: Book/Report › Doctoral thesis › Monograph
Place of Publication: Tampere
Publisher: Tampere University of Technology
Number of pages: 193
Publication status: Published - 10 Oct 2014
Publication type: G4 Doctoral dissertation (monograph)
Series name: Tampere University of Technology. Publication
Series publisher: Tampere University of Technology
Thermonuclear fusion is expected to play a key role in the energy market during the second half of this century, reaching 20% of electricity generation by 2100. For many years, fusion scientists and engineers have been developing the various technologies required to build nuclear power stations capable of sustaining a fusion reaction. To the maximum possible extent, maintenance operations in fusion reactors are performed manually by qualified workers, in full accordance with the "as low as reasonably achievable" (ALARA) principle. However, hands-on maintenance becomes impractical, difficult or simply impossible in many circumstances, such as under high biological dose rates. In such cases, maintenance tasks will be performed with remote handling (RH) techniques.
The International Thermonuclear Experimental Reactor (ITER), to be commissioned in southern France around 2025, will be the first fusion experiment to produce more power from fusion than the power required to heat the plasma. Its main objective is “to demonstrate the scientific and technological feasibility of fusion power for peaceful purposes”. However, ITER represents an unequalled challenge in terms of RH system design, since it will be far more demanding and complex than any remote maintenance system previously designed.
The introduction of man-in-the-loop capabilities in the robotic systems designed for ITER maintenance would provide useful assistance during inspection, e.g. by giving the operator the ability and flexibility to locate and examine unplanned targets, and during handling operations, e.g. by making peg-in-hole tasks easier. Unfortunately, most transmission technologies able to withstand the very specific and extreme environmental conditions inside a fusion reactor are based on gears, screws, cables and chains, which make the whole system very flexible and prone to vibrations. This effect is further amplified because the structural parts of the maintenance equipment are generally lightweight, slender structures, owing to the size of the reactor and its arduous accessibility.
Several methodologies aimed at avoiding or limiting the effects of vibrations on RH system performance have been investigated over the past decade. These methods often rely on vibration sensors such as accelerometers. However, a review of the market shows that no commercial off-the-shelf (COTS) accelerometer meets the very specific requirements for vibration sensing in the ITER in-vessel RH equipment (resilience to a high total integrated dose, high sensitivity). The customisation and qualification of existing products, or the investigation of new concepts, might be considered; however, these options would inevitably involve high development costs.
While an extensive body of work on the modelling and control of flexible manipulators was published in the 1980s and 1990s, the possibility of using vision devices to stabilise an oscillating robotic arm has only been considered very recently, and this promising solution has not been discussed at length. In parallel, recent developments in machine vision systems for nuclear environments have been very encouraging. Although they do not deal directly with vibration sensing, they open up new prospects for the use of radiation-tolerant cameras.
This thesis aims to demonstrate that vibration control of remote maintenance equipment operating in harsh environments such as ITER can be achieved without any extra sensor besides the onboard radiation-hardened cameras that will inevitably be used to provide real-time visual feedback to the operators. In other words, it is proposed to treat the radiation-tolerant vision devices as full sensors providing quantitative data that can be processed by the control scheme, and not merely as plain video feedback providing qualitative information. The work conducted in this thesis has confirmed that methods based on the tracking of visual features in an unknown environment are effective candidates for the real-time control of vibrations. Oscillations induced at the end effector are estimated by exploiting a simple physical model of the manipulator. Using a camera mounted in an eye-in-hand configuration, this model is adjusted through direct measurement of the tip oscillations with respect to the static environment.
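As a minimal illustration of treating the camera as a quantitative sensor rather than plain video, the sketch below fits a sinusoid to a series of tip displacements, such as those obtained by tracking visual features from frame to frame. The function name, the grid-search formulation and the sample values are illustrative assumptions, not the estimator developed in the thesis.

```python
import numpy as np

def fit_sinusoid(t, y, freqs):
    """Least-squares sinusoidal regression over a grid of candidate
    frequencies: y ~ a*cos(2*pi*f*t) + b*sin(2*pi*f*t) + c.
    For a fixed frequency the model is linear in (a, b, c), so each
    candidate reduces to an ordinary least-squares solve."""
    best = None
    for f in freqs:
        w = 2.0 * np.pi * f
        A = np.column_stack([np.cos(w * t), np.sin(w * t), np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        err = np.sum((A @ coef - y) ** 2)
        if best is None or err < best[0]:
            best = (err, f, coef)
    _, f, (a, b, c) = best
    amplitude = np.hypot(a, b)
    phase = np.arctan2(a, b)  # so that y ~ amplitude*sin(2*pi*f*t + phase) + c
    return f, amplitude, phase, c

# Synthetic tip displacement: a 0.8 Hz oscillation sampled at 25 fps
t = np.arange(0, 4, 1 / 25)
y = 2.0 * np.sin(2 * np.pi * 0.8 * t + 0.3) + 0.05
f, amp, ph, off = fit_sinusoid(t, y, freqs=np.arange(0.1, 3.0, 0.01))
```

Scanning a frequency grid keeps the fit linear and avoids a nonlinear optimiser, which is a reasonable trade-off when the expected oscillation band is known in advance.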
The primary contribution of this thesis is the implementation of a markerless tracker that determines the velocity of a tip-mounted camera in an unprepared environment in order to stabilise an oscillating long-reach robotic arm. In particular, this method involves modifying an existing online interaction-matrix estimator to make it self-adjusting, and deriving a multimode dynamic model of a flexible rotating beam. An innovative vision-based method using sinusoidal regression to sense low-frequency oscillations is also proposed and tested. Finally, the problem of online estimation of the image capture delay in visual servoing applications with high dynamics is addressed, and an original approach based on cross-correlation is presented and experimentally validated.
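The cross-correlation idea behind the delay-estimation contribution can be sketched generically: slide the camera-derived signal against a reference signal and take the lag that maximises their cross-correlation. The function name and the toy signals below are assumptions for illustration only, not the implementation validated in the thesis.

```python
import numpy as np

def estimate_delay(reference, measured, dt):
    """Estimate how much `measured` lags `reference` (in seconds) by
    locating the peak of their full cross-correlation."""
    ref = reference - reference.mean()
    mea = measured - measured.mean()
    corr = np.correlate(mea, ref, mode="full")
    # In 'full' mode, index len(ref) - 1 corresponds to zero lag
    lag = int(np.argmax(corr)) - (len(ref) - 1)
    return lag * dt

# Toy data: a 1.2 Hz reference and a copy delayed by 3 frames at 25 fps
dt = 1 / 25
t = np.arange(0, 4, dt)
reference = np.sin(2 * np.pi * 1.2 * t)
measured = np.sin(2 * np.pi * 1.2 * (t - 3 * dt))
delay = estimate_delay(reference, measured, dt)  # about 0.12 s
```

Subtracting the means before correlating prevents a constant offset from biasing the peak location; in practice the lag resolution is limited to one sample period unless the peak is interpolated.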