This project explores autonomous navigation using the ROSbot 2R, implementing two distinct path-planning approaches to efficiently reach a target destination while avoiding obstacles. The goal was to measure total travel time and distance covered while ensuring smooth navigation in both simulated and real-world environments.
- Multi-Waypoint Navigation: the robot follows a predefined set of waypoints, optimizing the path to reach the destination efficiently.
- This approach enables flexible route planning and smooth movement across multiple checkpoints.
- It is particularly effective in structured environments where obstacle placement is predictable (a minimal code sketch follows this list).
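As a rough illustration of the waypoint-following idea, the sketch below drives a ROS 2 node through a hard-coded list of (x, y) checkpoints using simple proportional steering. The topic names `/odom` and `/cmd_vel`, the gains, and the 0.15 m arrival tolerance are assumptions made for this example rather than the project's actual parameters, and obstacle avoidance is omitted for brevity.

```python
# Minimal waypoint-following sketch (assumed topics: /odom, /cmd_vel).
# Drives through a hard-coded list of (x, y) checkpoints with simple
# proportional steering; obstacle avoidance is intentionally left out.
import math
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from nav_msgs.msg import Odometry


class WaypointFollower(Node):
    def __init__(self, waypoints):
        super().__init__('waypoint_follower')
        self.waypoints = waypoints          # list of (x, y) targets
        self.index = 0                      # index of the current target
        self.cmd_pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.create_subscription(Odometry, '/odom', self.odom_cb, 10)

    def odom_cb(self, msg):
        if self.index >= len(self.waypoints):
            self.cmd_pub.publish(Twist())   # all waypoints reached: stop
            return
        x = msg.pose.pose.position.x
        y = msg.pose.pose.position.y
        q = msg.pose.pose.orientation
        yaw = math.atan2(2.0 * (q.w * q.z + q.x * q.y),
                         1.0 - 2.0 * (q.y * q.y + q.z * q.z))
        gx, gy = self.waypoints[self.index]
        if math.hypot(gx - x, gy - y) < 0.15:
            self.index += 1                 # waypoint reached: advance
            return
        err = math.atan2(gy - y, gx - x) - yaw
        err = math.atan2(math.sin(err), math.cos(err))   # wrap to [-pi, pi]
        cmd = Twist()
        cmd.angular.z = 1.5 * err                        # turn toward waypoint
        cmd.linear.x = 0.2 if abs(err) < 0.3 else 0.0    # drive when roughly aligned
        self.cmd_pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(WaypointFollower([(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]))


if __name__ == '__main__':
    main()
```

The same structure extends naturally to loading the checkpoints from a parameter file instead of hard-coding them.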
- Single-Coordinate Navigation: the robot moves directly toward a single target coordinate while actively avoiding obstacles.
- Suitable for dynamic environments where predefining waypoints is impractical.
- Provides a more adaptive response to unexpected obstacles and changes in the surroundings (see the reactive sketch below).
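A minimal sketch of the reactive, single-coordinate behaviour follows: the node steers toward one goal point and rotates in place whenever the LiDAR reports an obstacle inside a safety radius in a narrow forward cone. The topic names (`/scan`, `/odom`, `/cmd_vel`), thresholds, and gains are illustrative assumptions; the project's actual controller may differ.

```python
# Reactive go-to-goal sketch (assumed topics: /scan, /odom, /cmd_vel).
# Steers toward one target coordinate and turns away when the LiDAR sees
# an obstacle inside 0.4 m within a ~30 degree cone around the heading.
import math
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from nav_msgs.msg import Odometry
from sensor_msgs.msg import LaserScan


class GoToGoal(Node):
    def __init__(self, goal_x, goal_y):
        super().__init__('go_to_goal')
        self.goal = (goal_x, goal_y)
        self.obstacle_ahead = False
        self.cmd_pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.create_subscription(LaserScan, '/scan', self.scan_cb, 10)
        self.create_subscription(Odometry, '/odom', self.odom_cb, 10)

    def scan_cb(self, msg):
        # Flag obstacles inside 0.4 m in a cone around angle 0,
        # assumed here to be the robot's forward direction.
        cone = math.radians(15.0)
        closest = float('inf')
        for i, r in enumerate(msg.ranges):
            angle = msg.angle_min + i * msg.angle_increment
            angle = math.atan2(math.sin(angle), math.cos(angle))  # wrap
            if abs(angle) < cone and msg.range_min < r < msg.range_max:
                closest = min(closest, r)
        self.obstacle_ahead = closest < 0.4

    def odom_cb(self, msg):
        x = msg.pose.pose.position.x
        y = msg.pose.pose.position.y
        q = msg.pose.pose.orientation
        yaw = math.atan2(2.0 * (q.w * q.z + q.x * q.y),
                         1.0 - 2.0 * (q.y * q.y + q.z * q.z))
        gx, gy = self.goal
        cmd = Twist()
        if math.hypot(gx - x, gy - y) < 0.1:
            self.cmd_pub.publish(cmd)       # goal reached: stop
            return
        if self.obstacle_ahead:
            cmd.angular.z = 0.6             # obstacle ahead: rotate away
        else:
            err = math.atan2(gy - y, gx - x) - yaw
            err = math.atan2(math.sin(err), math.cos(err))
            cmd.angular.z = 1.5 * err
            cmd.linear.x = 0.2 if abs(err) < 0.3 else 0.0
        self.cmd_pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(GoToGoal(2.0, 1.0))


if __name__ == '__main__':
    main()
```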
The project was implemented in simulation using Gazebo and RViz, followed by real-world deployment on the ROSbot 2R.
To achieve robust navigation, the robot utilized:
- LiDAR: distance measurement and obstacle detection.
- Time-of-Flight (ToF) sensors (FL & FR): short-range obstacle avoidance (combined with the LiDAR in the sketch after this list).
- IMU: orientation and motion tracking.
- Wheel encoders: precise odometry data.
- Camera: used for visualization purposes only.
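To illustrate how the short-range sensors can complement the LiDAR, the sketch below fuses the two front ToF readings with the closest LiDAR return into a simple emergency-stop flag. The topic names `/range/fl`, `/range/fr`, and `/scan` follow the typical ROSbot 2R interface but should be treated as assumptions, and `/emergency_stop` is a hypothetical output topic used only for this example.

```python
# Short-range safety-gate sketch (assumed topics: /range/fl, /range/fr, /scan;
# /emergency_stop is a hypothetical output topic for this example).
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan, Range
from std_msgs.msg import Bool


class SafetyGate(Node):
    def __init__(self):
        super().__init__('safety_gate')
        self.stop_dist = 0.25               # stop threshold in metres
        self.readings = {'fl': float('inf'), 'fr': float('inf'), 'lidar': float('inf')}
        self.create_subscription(Range, '/range/fl', lambda m: self.tof_cb('fl', m), 10)
        self.create_subscription(Range, '/range/fr', lambda m: self.tof_cb('fr', m), 10)
        self.create_subscription(LaserScan, '/scan', self.scan_cb, 10)
        self.stop_pub = self.create_publisher(Bool, '/emergency_stop', 10)
        self.create_timer(0.1, self.evaluate)  # evaluate at 10 Hz

    def tof_cb(self, key, msg):
        self.readings[key] = msg.range

    def scan_cb(self, msg):
        # Closest valid return anywhere in the scan (deliberately conservative).
        valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
        self.readings['lidar'] = min(valid) if valid else float('inf')

    def evaluate(self):
        # Publish True when any sensor sees an obstacle inside the stop distance.
        too_close = min(self.readings.values()) < self.stop_dist
        self.stop_pub.publish(Bool(data=too_close))


def main():
    rclpy.init()
    rclpy.spin(SafetyGate())


if __name__ == '__main__':
    main()
```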
- Both algorithms were successfully tested in Gazebo.
- The robot consistently reached the goal without collisions, efficiently selecting routes based on sensor data.
- Waypoint-based navigation proved effective in structured layouts, while single-coordinate navigation adapted better to dynamic environments.
- The system was successfully transferred to a physical ROSbot 2R.
- Real-world performance aligned closely with simulation, demonstrating robust obstacle avoidance and smooth navigation.
- The results validated the sensor fusion and path-planning techniques implemented.
- Multi-waypoint navigation is better suited to predefined paths, while single-coordinate navigation offers greater adaptability in changing surroundings.
- Simulation closely matched real-world results, showcasing the reliability of the approach.
- Sensor fusion played a crucial role in precise movement and real-time decision-making.
Next Steps: Expanding the project to incorporate dynamic obstacle avoidance using deep learning and refining the real-world implementation.
Stay tuned for updates!
