OPTIMIZING SLAM ALGORITHMS FOR MOBILE ROBOTS

dc.contributor.author: Toleubekova, Alma
dc.contributor.author: Zhumagaliyeva, Aidana
dc.contributor.author: Smolyarchuk, Kir
dc.date.accessioned: 2025-06-12T11:53:43Z
dc.date.available: 2025-06-12T11:53:43Z
dc.date.issued: 2025-05-05
dc.description.abstract: The ability of mobile robots to autonomously navigate and interact with their surroundings is critical across various domains, including industrial automation, service robotics, and autonomous vehicles. A fundamental challenge in achieving autonomy lies in enabling robots to simultaneously construct a map of an unknown environment while localizing themselves within it, a problem known as Simultaneous Localization and Mapping (SLAM). Accurate environmental mapping is essential for tasks such as path planning, obstacle avoidance, and high-level decision-making, allowing robots to operate effectively in dynamic and unstructured settings. This research initially focused on enhancing SLAM performance by integrating a 2D LiDAR (RPLiDAR A1) and a depth camera (Intel RealSense D435i) on a TurtleBot3 Burger within the ROS Noetic framework. By fusing LiDAR and visual depth data, we aimed to improve the accuracy and robustness of 3D point cloud-based environmental representations. A comparative analysis of SLAM methodologies (including FastSLAM, GraphSLAM, ORB-SLAM, LiDAR-based SLAM, and visual SLAM) was conducted to assess their suitability for real-time mobile robot navigation. Building upon the SLAM framework, the second phase of this study focuses on sampling-based motion planning techniques, specifically Rapidly-exploring Random Trees (RRT) and its variants, to enable autonomous navigation within the constructed 3D map. To further evaluate the scalability and practical deployment of these algorithms, we are transitioning our research to a larger autonomous mobile robot platform equipped with enhanced sensing capabilities and computational resources. This work aims to develop a comprehensive mapping and navigation framework that optimizes both localization accuracy and motion efficiency for real-world robotic applications.
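For readers unfamiliar with the planning technique named in the abstract, the following Python sketch illustrates the core RRT loop (sample, find the nearest tree node, steer toward the sample, collision-check, extend) on a toy 2D workspace. It is a minimal illustration only, not the capstone project's implementation: the is_free stub, the 10 x 10 m workspace bounds, the 0.5 m step size, and the 5% goal bias are assumptions chosen for brevity; a planner on the actual robot would check collisions against the SLAM occupancy grid or 3D point cloud instead.

    # Minimal 2D RRT sketch (illustrative; not the authors' code).
    import math
    import random

    class Node:
        def __init__(self, x, y, parent=None):
            self.x, self.y, self.parent = x, y, parent

    def is_free(x, y):
        """Collision-check stub: replace with a lookup into the SLAM map."""
        return 0.0 <= x <= 10.0 and 0.0 <= y <= 10.0

    def rrt(start, goal, step=0.5, goal_tol=0.5, max_iters=5000):
        nodes = [Node(*start)]
        for _ in range(max_iters):
            # Sample a random point, with occasional goal biasing.
            if random.random() < 0.05:
                rx, ry = goal
            else:
                rx, ry = random.uniform(0.0, 10.0), random.uniform(0.0, 10.0)
            # Find the nearest existing node and steer one step toward the sample.
            nearest = min(nodes, key=lambda n: (n.x - rx) ** 2 + (n.y - ry) ** 2)
            theta = math.atan2(ry - nearest.y, rx - nearest.x)
            nx, ny = nearest.x + step * math.cos(theta), nearest.y + step * math.sin(theta)
            if not is_free(nx, ny):
                continue
            new = Node(nx, ny, nearest)
            nodes.append(new)
            if math.hypot(nx - goal[0], ny - goal[1]) < goal_tol:
                # Reconstruct the path by walking parent pointers back to the start.
                path, n = [], new
                while n is not None:
                    path.append((n.x, n.y))
                    n = n.parent
                return path[::-1]
        return None  # no path found within the iteration budget

    if __name__ == "__main__":
        print(rrt((1.0, 1.0), (9.0, 9.0)))

RRT variants such as RRT* refine this loop by rewiring nearby nodes to shorten paths, which is why the abstract distinguishes RRT from "its variants".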
dc.identifier.citation: Toleubekova, A., Zhumagaliyeva, A., & Smolyarchuk, K. (2025). Optimizing SLAM algorithms for mobile robots. Nazarbayev University School of Engineering and Digital Sciences.
dc.identifier.uri: https://nur.nu.edu.kz/handle/123456789/8911
dc.language.iso: en
dc.publisher: Nazarbayev University School of Engineering and Digital Sciences
dc.rights: Attribution-ShareAlike 3.0 United States
dc.rights.uri: http://creativecommons.org/licenses/by-sa/3.0/us/
dc.subject: Simultaneous Localization and Mapping (SLAM)
dc.subject: mobile robots
dc.subject: autonomous navigation
dc.subject: 2D LiDAR
dc.subject: depth camera
dc.subject: ROS Noetic
dc.subject: TurtleBot3
dc.subject: point cloud
dc.subject: motion planning
dc.subject: Rapidly-exploring Random Trees (RRT)
dc.subject: visual SLAM
dc.subject: LiDAR-based SLAM
dc.subject: environmental mapping
dc.subject: robot localization
dc.subject: path planning
dc.subject: type of access: open access
dc.title: OPTIMIZING SLAM ALGORITHMS FOR MOBILE ROBOTS
dc.type: Bachelor's Capstone project

Files

Original bundle

Name: Optimizing SLAM Algorithms for Mobile Robots.pdf
Size: 13.72 MB
Format: Adobe Portable Document Format
Description: Bachelor's Capstone project