REFERENCES

1. Taketomi T, Uchiyama H, Ikeda S. Visual SLAM algorithms: a survey from 2010 to 2016. IPSJ Trans Comput Vis Appl 2017;9:16.

2. Labbé M, Michaud F. RTAB-Map as an open-source lidar and visual simultaneous localization and mapping library for large-scale and long-term online operation. J Field Robot 2018;36:416-46.

3. Jiao J, Zhu Y, Ye H, et al. Greedy-based feature selection for efficient LiDAR SLAM. In: 2021 IEEE International Conference on Robotics and Automation (ICRA); 2021, pp. 5222-8.

4. Zhang J, Singh S. LOAM: lidar odometry and mapping in real-time. In: Robotics: science and systems; 2014. Available from: https://api.semanticscholar.org/CorpusID:18612391 [Last accessed on 26 Jun 2024].

5. Low KL. Linear least-squares optimization for point-to-plane ICP surface registration. 2004. Available from: https://api.semanticscholar.org/CorpusID:122873316 [Last accessed on 26 Jun 2024].

6. Shan T, Englot B. LeGO-LOAM: lightweight and ground-optimized lidar odometry and mapping on variable terrain. In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); 2018, pp. 4758-65.

7. Deschaud JE. IMLS-SLAM: scan-to-model matching based on 3D data. In: 2018 IEEE International Conference on Robotics and Automation (ICRA); 2018, pp. 2480-5.

8. Pan Y, Xiao P, He Y, Shao Z, Li Z. MULLS: versatile LiDAR SLAM via multi-metric linear least square. In: 2021 IEEE International Conference on Robotics and Automation (ICRA); 2021, pp. 11633-40.

9. Weiss S, Achtelik MW, Lynen S, Chli M, Siegwart R. Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments. In: 2012 IEEE International Conference on Robotics and Automation (ICRA); 2012, pp. 957-64.

10. Mourikis AI, Roumeliotis SI. A multi-state constraint Kalman filter for vision-aided inertial navigation. In: Proceedings 2007 IEEE International Conference on Robotics and Automation (ICRA); 2007, pp. 3565-72.

11. Lupton T, Sukkarieh S. Visual-inertial-aided navigation for high-dynamic motion in built environments without initial conditions. IEEE Trans Robot 2012;28:61-76.

12. Shen S, Michael N, Kumar V. Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs. In: 2015 IEEE International Conference on Robotics and Automation (ICRA); 2015, pp. 5303-10.

13. Forster C, Carlone L, Dellaert F, Scaramuzza D. On-manifold preintegration for real-time visual-inertial odometry. IEEE Trans Robot 2017;33:1-21.

14. Forster C, Carlone L, Dellaert F, Scaramuzza D. IMU preintegration on manifold for efficient visual-inertial maximum-a-posteriori estimation. In: Robotics: science and systems; 2015.

15. Qin C, Ye H, Pranata CE, Han J, Zhang S, Liu M. LINS: a lidar-inertial state estimator for robust and efficient navigation. In: 2020 IEEE International Conference on Robotics and Automation (ICRA); 2020, pp. 8899-906.

16. Shan T, Englot B, Meyers D, Wang W, Ratti C, Rus D. LIO-SAM: tightly-coupled lidar inertial odometry via smoothing and mapping. In: 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); 2020, pp. 5135-42.

17. Lin J, Zhang F. Loam livox: a fast, robust, high-precision lidar odometry and mapping package for lidars of small FoV. In: 2020 IEEE International Conference on Robotics and Automation (ICRA); 2020, pp. 3126-31.

18. Xu W, Zhang F. FAST-LIO: a fast, robust lidar-inertial odometry package by tightly-coupled iterated Kalman filter. IEEE Robot Autom Lett 2021;6:3317-24.

19. Xu W, Cai Y, He D, Lin J, Zhang F. FAST-LIO2: fast direct lidar-inertial odometry. IEEE Trans Robot 2022;38:2053-73.

20. Zhu Y, Zheng C, Yuan C, Huang X, Hong X. CamVox: a low-cost and accurate lidar-assisted visual SLAM system. In: 2021 IEEE International Conference on Robotics and Automation (ICRA); 2021, pp. 5049-55.

21. Li K, Li M, Hanebeck UD. Towards high-performance solid-state-lidar-inertial odometry and mapping. IEEE Robot Autom Lett 2021;6:5167-74.

22. Zhang J, Kaess M, Singh S. On degeneracy of optimization-based state estimation problems. In: 2016 IEEE International Conference on Robotics and Automation (ICRA); 2016, pp. 809-16.

23. Zhao Y, Huang K, Lu H, Xiao J. Extrinsic calibration of a small FoV LiDAR and a camera. In: 2020 Chinese Automation Congress (CAC); 2020, pp. 3915-20.

24. Shan T, Englot B, Duarte F, Ratti C, Rus D. Robust place recognition using an imaging lidar. In: 2021 IEEE International Conference on Robotics and Automation (ICRA); 2021, pp. 5469-75.

25. Li H, Tian B, Shen H, Lu J. An intensity-augmented LiDAR-inertial SLAM for solid-state LiDARs in degenerated environments. IEEE Trans Instrum Meas 2022;71:1-10.

26. Chen X, Milioto A, Palazzolo E, Giguère P, Behley J, Stachniss C. SuMa++: efficient LiDAR-based semantic SLAM. In: 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); 2019, pp. 4530-7.

27. Chen G, Wang B, Wang X, Deng H, Wang B, Zhang S. PSF-LO: parameterized semantic features based lidar odometry. In: 2021 IEEE International Conference on Robotics and Automation (ICRA); 2021, pp. 5056-62.

28. Zhao C, Li J, Chen A, Lyu Y, Hua L. Intensity augmented solid-state-LiDAR-inertial SLAM. In: Proceedings of 3rd 2023 International Conference on Autonomous Unmanned Systems (3rd ICAUS 2023); pp. 129-39. Available from: https://link.springer.com/chapter/10.1007/978-981-97-1103-1_12 [Last accessed on 26 Jun 2024].
