In this work, we present a method for the fusion of direct radiometric data from a thermal camera with inertial measurements to enable pose estimation of aerial robots. Thermal cameras, such as those operating in the long-wave infrared (LWIR) range, are not affected by the lack of illumination or by the presence of obscurants such as fog and dust. These characteristics make them a suitable choice for robot navigation in GPS-denied, completely dark, and obscurant-filled environments.

In contrast to previous approaches, which use 8-bit rescaled thermal imagery as a complementary sensing modality to visual image data, our approach makes use of the full 14-bit radiometric data, making it generalizable to a variety of environments without the need for heuristic tuning.
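To make the contrast concrete, the sketch below (not from the paper; a minimal NumPy illustration with synthetic values) shows why per-frame min-max rescaling to 8 bits is a heuristic that ties pixel intensities to the instantaneous temperature range of the scene, whereas the raw 14-bit radiometric values of an unchanged surface stay the same when a hot object enters the frame.

```python
import numpy as np

def rescale_to_8bit(frame: np.ndarray) -> np.ndarray:
    """Common heuristic: per-frame min-max rescaling of 14-bit radiometric
    counts to 8 bits. The mapping depends on the instantaneous temperature
    range of the scene, so the same surface can receive different
    intensities from one frame to the next."""
    lo, hi = int(frame.min()), int(frame.max())
    scaled = (frame.astype(np.float64) - lo) / max(hi - lo, 1)
    return np.round(scaled * 255).astype(np.uint8)

# Two synthetic 14-bit frames (values are illustrative only).
frame_a = np.full((8, 8), 5500, dtype=np.uint16)  # cool background
frame_a[3:5, 3:5] = 6500                          # a warm object of interest
frame_b = frame_a.copy()
frame_b[0, 0] = 12000                             # a hot source enters the scene

print(rescale_to_8bit(frame_a)[3, 3])   # 255: object spans the full 8-bit range
print(rescale_to_8bit(frame_b)[3, 3])   # 39: same object, much darker after rescaling
print(frame_a[3, 3], frame_b[3, 3])     # 6500 6500: raw radiometric values unchanged
```

Operating directly on the radiometric counts keeps the appearance of the same surface consistent across frames, which is what removes the per-environment heuristic tuning that rescaling would otherwise require.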

Furthermore, our approach implements a keyframe-based joint optimization scheme, making odometry estimates robust against image data interruptions, which are common during the operation of thermal cameras due to the application of Flat Field Corrections (FFC).
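The role of keyframes during such interruptions can be sketched as follows. This is an illustrative outline under our own assumptions, not the paper's implementation: all class and function names (`State`, `step`, `propagate_with_imu`, `align_to_keyframe`, `needs_new_keyframe`) are hypothetical stand-ins. The idea shown is that IMU propagation bridges the frames lost to an FFC, and the next valid image is aligned against the last stored keyframe rather than the missing previous frame.

```python
from dataclasses import dataclass, field

# Hypothetical names throughout; a sketch of the idea only.

@dataclass
class Keyframe:
    pose: list      # stand-in for a 6-DoF pose (here just x, y, z)
    image: object   # the stored radiometric frame

@dataclass
class State:
    pose: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
    keyframes: list = field(default_factory=list)

def propagate_with_imu(pose, imu_batch):
    # Stand-in for IMU preintegration: accumulate measured displacements.
    return [p + sum(m[i] for m in imu_batch) for i, p in enumerate(pose)]

def align_to_keyframe(keyframe, image, predicted_pose):
    # Stand-in for direct radiometric alignment against the last keyframe.
    return predicted_pose

def needs_new_keyframe(keyframe, pose):
    # Stand-in for a keyframe criterion, e.g. translation since the last keyframe.
    return sum((a - b) ** 2 for a, b in zip(pose, keyframe.pose)) > 0.25

def step(state, image, imu_batch):
    state.pose = propagate_with_imu(state.pose, imu_batch)
    if image is None:          # FFC in progress: no image data available
        return state.pose      # the IMU-only prediction bridges the gap
    if state.keyframes:
        state.pose = align_to_keyframe(state.keyframes[-1], image, state.pose)
    if not state.keyframes or needs_new_keyframe(state.keyframes[-1], state.pose):
        state.keyframes.append(Keyframe(list(state.pose), image))
    return state.pose

state = State()
for image in ["frame0", None, None, "frame3"]:   # None marks frames lost to an FFC
    print(step(state, image, imu_batch=[(0.01, 0.0, 0.0)]))
```

In this sketch, alignment is always performed against a persistent keyframe rather than the immediately preceding frame, so a burst of dropped frames during an FFC does not break the photometric reference.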

Our experiments in a completely dark indoor environment demonstrate the reliability of our approach by comparing our estimated odometry against ground truth provided by a Vicon motion capture system. To put our results into perspective, and given the limited literature on thermal vision fusion, we compare our method with state-of-the-art visual and visual-inertial odometry approaches, thus demonstrating the efficacy of our solution and the benefits of utilizing the full radiometric information.

We also demonstrate the reliable performance of our approach in a real-world application by estimating the pose of an aerial robot navigating through an underground mine, in conditions of darkness and in the presence of heavy airborne dust.
