Extended Kalman Filter Simultaneous Localization and Mapping (EKF SLAM)

ROS2 | C++ | SLAM | Kalman Filter | Machine Learning | Kinematics | Path Planning

Project Overview

In this project, I developed a feature-based Extended Kalman Filter Simultaneous Localization and Mapping (EKF SLAM) algorithm from scratch (no external libraries!) for a TurtleBot3 robot using ROS2 Iron and C++. SLAM enables the TurtleBot3 to autonomously navigate its surroundings while simultaneously constructing a map of the environment in real time.

The EKF SLAM algorithm utilizes data from multiple sensors, including wheel encoders and a 2D LiDAR, to continually estimate the robot’s pose. The LiDAR data is processed through a Landmark Detection and Unknown Data Association pipeline to locate trackable features (cylindrical obstacles) in the environment. 

EKF SLAM accurately estimates the TurtleBot3's position even as the odometry estimate drifts

The video above shows EKF SLAM working onboard the TurtleBot3. Despite the drift in odometry estimates, the algorithm corrects the pose estimate by incorporating information from the constructed map, ensuring accurate localization throughout the robot's exploration of the environment.

  • Blue Robot: Odometry Estimate of the Robot’s Position
  • Green Robot: EKF SLAM Estimate of the Robot’s Position
  • Green Obstacles: EKF SLAM estimate of the Map
  • Point Cloud: 2D LiDAR Data

Extended Kalman Filter SLAM Pipeline

EKF SLAM with Landmark Detection and Unknown Data Association in Rviz

I also developed a simulation clone for testing the EKF SLAM pipeline in Rviz. The real TurtleBot3 is replaced by a simulated counterpart that outputs the same sensor data required for SLAM (wheel encoder readings and 2D LiDAR measurements). Wheel slip and sensor noise are also modeled to simulate unreliable state measurements. The video above shows a simulated TurtleBot3 colliding with an obstacle in the environment. The odometry estimate accumulates error post-collision, while the EKF SLAM estimate reflects the true pose of the robot.

  • Red Robot: Actual Robot Position (sim clone of the real TurtleBot3)
  • Blue Robot: Odometry Estimate of the Robot’s Position
  • Green Robot: EKF SLAM Estimate of the Robot’s Position
  • Green Obstacles: EKF SLAM estimate of the Map
  • Yellow Point Cloud: 2D LiDAR Data