AUTONOMOUS WHEELCHAIR NAVIGATION SYSTEM USING SLAM AND VISION-BASED CONTROL
Publisher
Nazarbayev University School of Engineering and Digital Sciences
Abstract
This project focuses on enhancing the functionality of a power-assisted wheelchair by integrating autonomous navigation capabilities using a Raspberry Pi microcomputer, ZED 2 depth camera, and UM7-LT IMU sensor. The system leverages Simultaneous Localization and Mapping (SLAM) through the RTAB-Map framework, utilizing the ROS2 Navigation Stack for real-time obstacle avoidance and efficient path planning. A CAN-bus interface has been established between the control computer and the wheelchair, allowing precise control via terminal commands and enabling autonomous navigation in complex environments.
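To illustrate the control path described above, the following is a minimal, hypothetical sketch of a bridge node that converts ROS2 velocity commands into CAN frames. The topic name, SocketCAN channel, arbitration ID, byte layout, and scaling are assumptions made for illustration only; they do not reflect the R-Net protocol or the project's actual implementation.

```python
# Hypothetical sketch: forward Nav2 velocity commands to the wheelchair over CAN.
# Topic name, CAN channel, arbitration ID, and byte layout are illustrative assumptions.
import can
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist


class CmdVelToCan(Node):
    def __init__(self):
        super().__init__('cmd_vel_to_can')
        # SocketCAN interface exposed by the Raspberry Pi CAN adapter (assumed name).
        self.bus = can.Bus(channel='can0', interface='socketcan')
        self.sub = self.create_subscription(Twist, '/cmd_vel', self.on_cmd_vel, 10)

    def on_cmd_vel(self, msg: Twist):
        # Scale linear and angular velocity into signed 8-bit, joystick-style values.
        speed = max(-100, min(100, int(msg.linear.x * 100)))
        turn = max(-100, min(100, int(msg.angular.z * 100)))
        frame = can.Message(
            arbitration_id=0x02000000,  # placeholder ID, not a real R-Net identifier
            data=[speed & 0xFF, turn & 0xFF],
            is_extended_id=True,
        )
        self.bus.send(frame)


def main():
    rclpy.init()
    rclpy.spin(CmdVelToCan())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```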
Progress to date includes a working communication setup in which a Raspberry Pi running CAN2RNET controls the wheelchair directly, with movement commands and feedback driven by an Xbox joystick and by ROS Twist messages sent from a laptop. Additionally, SolidWorks designs were finalized to securely mount the ZED 2 camera and IMU sensor on the wheelchair, ensuring the stable data acquisition that is critical for accurate SLAM and navigation.
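As an illustration of the joystick-to-Twist teleoperation path mentioned above, a minimal sketch is given below. The axis indices, scaling factors, and topic names are assumptions rather than the project's actual configuration.

```python
# Hypothetical teleop sketch: map Xbox joystick axes to the Twist messages
# consumed by the wheelchair bridge. Axis order and scaling are assumed;
# they vary by controller and driver.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Joy
from geometry_msgs.msg import Twist


class JoyTeleop(Node):
    def __init__(self):
        super().__init__('joy_teleop')
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.sub = self.create_subscription(Joy, '/joy', self.on_joy, 10)

    def on_joy(self, msg: Joy):
        twist = Twist()
        # Left stick vertical axis -> forward speed, horizontal axis -> turn rate.
        twist.linear.x = 0.5 * msg.axes[1]
        twist.angular.z = 1.0 * msg.axes[0]
        self.pub.publish(twist)


def main():
    rclpy.init()
    rclpy.spin(JoyTeleop())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```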
Future work will focus on integrating sensor data into ROS2 for enhanced navigation. By refining navigation algorithms and creating a user-friendly interface, this project aims to improve mobility for individuals with disabilities, enhancing independence and safety in their daily lives.
Citation
Amangeldiyev, B., & Kaipiyev, A. (2025). Autonomous wheelchair navigation system using SLAM and vision-based control. Nazarbayev University School of Engineering and Digital Sciences.
Creative Commons license
Except where otherwise noted, this item's license is described as Attribution-NonCommercial-NoDerivs 3.0 United States.
