Learning the Ego-Motion of an Underwater Imaging Sonar: A Comparative Experimental Evaluation of Novel CNN and RCNN Approaches
Journal : IEEE Robotics and Automation Letters
Volume : 9
Issue : 3
Pages : 2072-2079
Publication type : ISI
Abstract
This research addresses the challenge of estimating the ego-motion of a forward-looking sonar (FLS) with deep neural networks (DNNs) and their application to autonomous underwater robots. Over the last two decades, analytical methods have been developed to perform odometry estimation from FLS data. While these methods can be effective, they are often computationally intensive, complex to implement, or rely on simplifying assumptions that restrict their widespread application. Inspired by work in the optical domain, we propose two novel deep-learning approaches to estimate FLS ego-motion. The first employs a convolutional neural network (CNN) to estimate motion directly from FLS images. The second leverages sequential image information using a recurrent convolutional neural network (RCNN). We quantitatively evaluate their performance by training and testing on both synthetic and field data. Results show that both methods can learn to estimate ego-motion on both data types and that including sequential information can improve performance. This letter presents the first use of an RCNN for this task, advances toward real-world application by fine-tuning the models on field data, and provides the first quantitative evaluation of such methods for acoustic odometry on field data.
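The abstract does not specify the network internals; the sketch below is a minimal, hypothetical PyTorch rendering of the two architecture families it describes: a CNN that regresses relative motion from a pair of consecutive FLS frames, and an RCNN that passes per-pair CNN features through an LSTM. The class names, layer sizes, and the assumed planar 3-DoF motion output (surge, sway, yaw) are illustrative assumptions, not the paper's actual design.

    # Hypothetical sketch of the two architecture families named in the
    # abstract; layer sizes, motion parameterization, and class names are
    # assumptions, not the paper's reported design.
    import torch
    import torch.nn as nn

    class SonarOdometryCNN(nn.Module):
        """CNN regressing relative motion from two channel-stacked FLS frames."""
        def __init__(self, dof=3):  # assumed planar motion: (dx, dy, dyaw)
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(2, 32, 7, stride=2, padding=3), nn.ReLU(),
                nn.Conv2d(32, 64, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),  # (B, 128, 1, 1) regardless of image size
            )
            self.regressor = nn.Linear(128, dof)

        def forward(self, img_pair):  # img_pair: (B, 2, H, W), two consecutive frames
            f = self.features(img_pair).flatten(1)  # (B, 128)
            return self.regressor(f)                # (B, dof) relative motion

    class SonarOdometryRCNN(nn.Module):
        """RCNN: the same CNN encoder followed by an LSTM over a frame-pair sequence."""
        def __init__(self, dof=3, hidden=256):
            super().__init__()
            self.encoder = SonarOdometryCNN(dof).features
            self.lstm = nn.LSTM(128, hidden, batch_first=True)
            self.regressor = nn.Linear(hidden, dof)

        def forward(self, seq):  # seq: (B, T, 2, H, W), T consecutive frame pairs
            b, t = seq.shape[:2]
            f = self.encoder(seq.flatten(0, 1)).flatten(1).view(b, t, -1)
            out, _ = self.lstm(f)          # temporal context across the sequence
            return self.regressor(out)     # (B, T, dof): per-step relative motion

    # Example usage with an arbitrary 128x256 polar sonar image size:
    cnn_motion = SonarOdometryCNN()(torch.randn(4, 2, 128, 256))       # (4, 3)
    rcnn_motion = SonarOdometryRCNN()(torch.randn(4, 8, 2, 128, 256))  # (4, 8, 3)

Stacking the two frames channel-wise gives the CNN a direct view of inter-frame change, while the LSTM lets the RCNN exploit the sequential information that the abstract credits with the performance improvement.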