Abstract: A simultaneous localization and mapping (SLAM) algorithm based on an infrared-enhanced vision/lidar/inertial combination is proposed to address the failure of mapping and positioning caused by the degradation of the visual sensor's perceptual performance in the dim or low-texture environments encountered by mobile robots. A 2-D discrete wavelet transform image fusion method performs feature-level fusion of the visible and infrared images, which improves the association between the front-end feature points of the visual-inertial SLAM algorithm and the 3-D point cloud data of the lidar-inertial SLAM. At the same time, it avoids matching excessive irrelevant features, reducing computational complexity and storage requirements. Experimental results show that the proposed algorithm outperforms the original algorithm in robustness and accuracy in dim and low-texture environments.
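The abstract refers to feature-level fusion of visible and infrared images via a 2-D discrete wavelet transform. The sketch below illustrates one common form of such fusion, assuming a single-level Haar decomposition, averaged approximation coefficients, and max-absolute selection of detail coefficients (the paper's exact decomposition level and fusion rules are not stated in the abstract); it uses the PyWavelets library and synthetic arrays as stand-ins for registered grayscale frames.

```python
# Minimal sketch of 2-D DWT feature-level image fusion (illustrative
# assumptions: Haar wavelet, single level, average/max-abs fusion rules).
import numpy as np
import pywt


def fuse_dwt2(visible: np.ndarray, infrared: np.ndarray, wavelet: str = "haar") -> np.ndarray:
    """Fuse two registered single-channel images of equal size in the wavelet domain."""
    # Single-level 2-D DWT: approximation + (horizontal, vertical, diagonal) detail sub-bands.
    cA_v, (cH_v, cV_v, cD_v) = pywt.dwt2(visible.astype(np.float32), wavelet)
    cA_i, (cH_i, cV_i, cD_i) = pywt.dwt2(infrared.astype(np.float32), wavelet)

    def pick(a, b):
        # Keep whichever detail coefficient has the larger magnitude, so edges
        # and thermal gradients from either modality survive the fusion.
        return np.where(np.abs(a) >= np.abs(b), a, b)

    # Assumed fusion rules: average the low-frequency approximation,
    # max-absolute selection for the high-frequency detail sub-bands.
    fused_coeffs = (
        0.5 * (cA_v + cA_i),
        (pick(cH_v, cH_i), pick(cV_v, cV_i), pick(cD_v, cD_i)),
    )

    # Inverse transform back to the spatial domain, where front-end feature
    # extraction would then run on the fused image.
    return pywt.idwt2(fused_coeffs, wavelet)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vis = rng.random((128, 128))  # stand-in for a visible-light frame
    ir = rng.random((128, 128))   # stand-in for a co-registered infrared frame
    fused = fuse_dwt2(vis, ir)
    print(fused.shape)            # (128, 128)
```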