Inertial-Visual Pose Tracking Using Optical Flow-aided Particle Filtering
This paper proposes an algorithm for visual-inertial camera pose tracking using adaptive recursive particle filtering. The method combines the agility of inertial-based tracking with the robustness of vision-based tracking. A proposal distribution has been developed for the selection of particles that takes into account the characteristics of the Inertial Measurement Unit (IMU) and the motion kinematics of the moving camera. A set of state-space equations is formulated; particles are selected and then evaluated using the corresponding features tracked by optical flow. The system state is estimated from the weighted particles through an iterative sequential importance resampling algorithm. Particle assessment is based on epipolar geometry and the characteristics of the focus of expansion (FoE). In the proposed system, the computational cost is reduced by excluding the rotation matrix from the recursive state estimation. The system implements an intelligent decision-making process that selects the best tracking source: IMU only, hybrid, or hybrid with past-state correction. The results show stable tracking performance with an average location error of a few centimeters in 3D space.
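The sequential importance resampling loop described above can be sketched in a simplified form. This is a minimal illustrative example, not the paper's implementation: it uses a plain Gaussian position likelihood in place of the paper's epipolar-geometry and FoE particle assessment, and a simple additive IMU displacement in place of the full kinematic proposal; all function names, noise levels, and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sir_step(particles, weights, imu_control, visual_meas,
             process_noise=0.05, meas_noise=0.1):
    """One sequential importance resampling (SIR) step (illustrative).

    particles:   (N, 3) hypothesised camera positions
    imu_control: (3,) displacement predicted by integrating IMU data
    visual_meas: (3,) position implied by the visual observation
    """
    n = len(particles)
    # Propagate each particle through the IMU-driven motion proposal.
    particles = particles + imu_control + rng.normal(0.0, process_noise,
                                                     particles.shape)
    # Reweight particles by the likelihood of the visual measurement
    # (stand-in for the paper's epipolar/FoE-based assessment).
    err = np.linalg.norm(particles - visual_meas, axis=1)
    weights = weights * np.exp(-0.5 * (err / meas_noise) ** 2)
    weights = weights / weights.sum()
    # Resample when the effective sample size degenerates.
    if 1.0 / np.sum(weights ** 2) < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights

# Toy run: track a camera translating along x at 0.1 units per step.
n = 500
particles = rng.normal(0.0, 0.2, (n, 3))
weights = np.full(n, 1.0 / n)
true_pos = np.zeros(3)
for _ in range(50):
    true_pos = true_pos + np.array([0.1, 0.0, 0.0])
    imu_control = np.array([0.1, 0.0, 0.0]) + rng.normal(0.0, 0.02, 3)
    visual_meas = true_pos + rng.normal(0.0, 0.05, 3)
    particles, weights = sir_step(particles, weights, imu_control, visual_meas)

estimate = np.average(particles, axis=0, weights=weights)
print(np.linalg.norm(estimate - true_pos))
```

The weighted mean of the resampled particles serves as the state estimate at each step; the effective-sample-size test is a standard guard against weight degeneracy in SIR filters.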
Citation: Moemeni, A. and Tatham, E. (2014) Inertial-Visual Pose Tracking Using Optical Flow-aided Particle Filtering. IEEE Symposium Series on Computational Intelligence (IEEE SSCI 2014).
Research Group: Centre for Computational Intelligence