Overview of Modern Approaches to Visual Odometry

Mikhail A. Terekhov, Saint Petersburg State University, 28 Universitetsky pr., 198504, Stary Peterhof, Saint Petersburg, Russia
Keywords: visual odometry, SLAM, autonomous navigation, ADAS, UAV

Abstract

In this paper, we describe the tasks solved by visual odometry and simultaneous localization and mapping (SLAM) systems, along with their main applications. Next, we list some approaches used by the scientific community to build such systems in different periods. We then explain in detail the more recent method based on bundle adjustment and show some of its variations for different applications. Finally, we survey present-day research directions in the field of visual odometry and briefly present our own work.
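The bundle-adjustment formulation mentioned in the abstract can be illustrated with a small toy sketch (our own illustration, not the author's code): camera poses and 3-D landmarks are refined jointly by minimizing the summed squared reprojection error over all observations. To keep the sketch short, it assumes identity camera rotations and unit focal length, and uses a numerical Jacobian; real systems optimize full 6-DoF poses with analytic Jacobians and exploit the sparse block structure of the problem.

```python
# Toy sketch of bundle adjustment: jointly refine camera translations and
# 3-D points by minimizing summed squared reprojection error with damped
# Gauss-Newton steps. Simplifying assumptions: identity rotations, unit
# focal length, numerical Jacobian.
import numpy as np

def project(cam_t, point):
    """Pinhole projection of a 3-D point seen from a camera at cam_t."""
    p = point - cam_t
    return p[:2] / p[2]

def residuals(params, observations, n_cams):
    """Stacked reprojection errors; params packs cameras first, then points."""
    cams = params[:3 * n_cams].reshape(-1, 3)
    pts = params[3 * n_cams:].reshape(-1, 3)
    res = []
    for ci, pi, uv in observations:
        res.extend(project(cams[ci], pts[pi]) - uv)
    return np.array(res)

def bundle_adjust(params, observations, n_cams, iters=25):
    """Damped Gauss-Newton on the reprojection-error objective."""
    params = params.copy()
    for _ in range(iters):
        r = residuals(params, observations, n_cams)
        J = np.zeros((r.size, params.size))
        eps = 1e-6
        for j in range(params.size):
            dp = params.copy()
            dp[j] += eps
            J[:, j] = (residuals(dp, observations, n_cams) - r) / eps
        # Small damping keeps the normal equations solvable despite the
        # gauge freedom (the whole scene can shift or rescale without
        # changing any reprojection error).
        step = np.linalg.solve(J.T @ J + 1e-4 * np.eye(params.size), -J.T @ r)
        params = params + step
    return params

# Synthetic scene: 2 cameras, 4 points, exact observations; perturb the
# state and let bundle adjustment pull it back to a zero-error configuration.
cams_true = np.array([[0.0, 0.0, 0.0], [0.5, 0.1, 0.0]])
pts_true = np.array([[0.0, 0.0, 4.0], [1.0, 1.0, 5.0],
                     [-1.0, 0.5, 4.0], [0.5, -1.0, 6.0]])
obs = [(ci, pi, project(cams_true[ci], pts_true[pi]))
       for ci in range(2) for pi in range(4)]
x_true = np.concatenate([cams_true.ravel(), pts_true.ravel()])
rng = np.random.default_rng(0)
x0 = x_true + 0.05 * rng.standard_normal(x_true.size)
err_before = np.linalg.norm(residuals(x0, obs, n_cams=2))
x_opt = bundle_adjust(x0, obs, n_cams=2)
err_after = np.linalg.norm(residuals(x_opt, obs, n_cams=2))
```

Because reprojection error is invariant to a global shift and rescaling of the scene, the recovered state may differ from the ground truth by such a gauge transformation even when the residual reaches zero; the damping term is what keeps the normal equations well conditioned along those directions.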

Author Biography

Mikhail A. Terekhov, Saint Petersburg State University, 28, Universitetsky pr., 198504, Stary Peterhof, Saint Petersburg, Russia

Fourth-year bachelor's student, Faculty of Mathematics and Mechanics, Saint Petersburg State University, st054464@student.spbu.ru

References

Google Cartographer. [Online]. Available: https://github.com/googlecartographer/cartographer

“Multicamera DSO,” GitHub. [Online]. Available: https://github.com/MikhailTerekhov/mdso

Y. Almalioglu, M. R. U. Saputra, P. P. B. de Gusmão, A. Markham, and N. Trigoni, “GANVO: Unsupervised Deep Monocular Visual Odometry and Depth Estimation with Generative Adversarial Networks,” in 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 2019, pp. 5474–5480; doi: 10.1109/ICRA.2019.8793512

J. Engel, V. Koltun, and D. Cremers, “Direct sparse odometry,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 40, no. 3, pp. 611–625, 2018; doi: 10.1109/TPAMI.2017.2658577

J. Engel, T. Schöps, and D. Cremers, “LSD-SLAM: Large-scale direct monocular SLAM,” in Computer Vision – ECCV 2014, Zurich, Switzerland, 2014, pp. 834–849; doi: 10.1007/978-3-319-10605-2_54

J. Engel, T. Schöps, and D. Cremers, “Large-scale direct SLAM with stereo cameras,” in 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 2015, pp. 1935–1942; doi: 10.1109/IROS.2015.7353631

J.-M. Frahm et al., “Building Rome on a cloudless day,” in Computer Vision – ECCV 2010, Heraklion, Crete, Greece, 2010, pp. 368–381; doi: 10.1007/978-3-642-15561-1_27

R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, Cambridge, UK: Cambridge University Press, 2003.

J. Heinly, J. L. Schönberger, E. Dunn, and J.-M. Frahm, “Reconstructing the world* in six days *(as captured by the Yahoo 100 million image dataset),” in The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, 2015, pp. 3287–3295.

G. Klein and D. Murray, “Parallel tracking and mapping for small AR workspaces,” in Proc. of the 2007 6th IEEE and ACM Int. Symp. on Mixed and Augmented Reality, Washington, DC, 2007, pp. 1–10; doi: 10.1109/ISMAR.2007.4538852

T. Lacey, “Tutorial: The Kalman filter,” MIT. [Online]. Available: http://web.mit.edu/kirtley/kirtley/binlustuff/literature/control/Kalman%20filter.pdf

H. Matsuki, L. von Stumberg, V. Usenko, J. Stückler, and D. Cremers, “Omnidirectional DSO: Direct Sparse Odometry With Fisheye Cameras,” IEEE Robotics and Automation Letters, vol. 3, no. 4, pp. 3693–3700, 2018; doi: 10.1109/LRA.2018.2855443

A. I. Mourikis and S. I. Roumeliotis, “A multi-state constraint Kalman filter for vision-aided inertial navigation,” in Proc. 2007 IEEE Int. Conf. on Robotics and Automation, Roma, Italy, 2007, pp. 3565–3572; doi: 10.1109/ROBOT.2007.364024

R. Mur-Artal, J. M. M. Montiel, and J. D. Tardós, “ORB-SLAM: A versatile and accurate monocular SLAM system,” IEEE Transactions on Robotics, vol. 31, no. 5, pp. 1147–1163, 2015; doi: 10.1109/TRO.2015.2463671

R. Mur-Artal and J. D. Tardós, “ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras,” IEEE Transactions on Robotics, vol. 33, no. 5, pp. 1255–1262, 2017; doi: 10.1109/TRO.2017.2705103

R. A. Newcombe, S. J. Lovegrove, and A. J. Davison, “DTAM: Dense tracking and mapping in real-time,” in 2011 International Conference on Computer Vision, Barcelona, Spain, 2011, pp. 2320–2327; doi: 10.1109/ICCV.2011.6126513

E. Rublee, V. Rabaud, K. Konolige, and G. Bradski, “ORB: An efficient alternative to SIFT or SURF,” in 2011 International Conference on Computer Vision, Barcelona, Spain, 2011, pp. 2564–2571.

H. Strasdat, J. M. M. Montiel, and A. J. Davison, “Visual SLAM: Why filter?” Image and Vision Computing, vol. 30, no. 2, pp. 65–77, 2012; doi: 10.1016/j.imavis.2012.02.009

K. Tateno, F. Tombari, I. Laina, and N. Navab, “CNN-SLAM: Real-time dense monocular SLAM with learned depth prediction,” in Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition, Hawaii, USA, 2017, pp. 6243–6252.

M. J. Tribou, A. Harmat, D. W. L. Wang, I. Sharf, and S. L. Waslander, “Multi-camera parallel tracking and mapping with non-overlapping fields of view,” The International Journal of Robotics Research, vol. 34, no. 12, pp. 1480–1500, 2015; doi: 10.1177/0278364915571429

S. Urban and S. Hinz, “MultiCol-SLAM: A modular real-time multi-camera SLAM system,” arXiv preprint arXiv:1610.07336, 2016. [Online]. Available: https://arxiv.org/abs/1610.07336

R. Wang, M. Schwörer, and D. Cremers, “Stereo DSO: Large-scale direct sparse visual odometry with stereo cameras,” in Proceedings of the IEEE International Conference on Computer Vision, 2017, pp. 3903–3911.

N. Yang, R. Wang, J. Stückler, and D. Cremers, “Deep virtual stereo odometry: Leveraging deep depth prediction for monocular direct sparse odometry,” in Proceedings of the European Conference on Computer Vision (ECCV), 2018, pp. 817–833.

Published
2019-09-30
How to Cite
Terekhov, M. A. (2019). Overview of Modern Approaches to Visual Odometry. Computer Tools in Education, (3), 5-14. https://doi.org/10.32603/2071-2340-2019-3-5-14
Section
Software Engineering