ROS2-based simulation of a differential-drive robot with visual servoing and a snake robot (ACM-R5) in CoppeliaSim. Developed for the Mobile Ground Robots module in the undergraduate course "Intelligent Robotics Implementation."
To run this simulation, you will need ROS 2 Humble and CoppeliaSim v4.9.0 rev6 installed on your system.
Once both are installed, use the interface_setup script located in the root of this repository to build the CoppeliaSim ROS2 interface (sim_ros2_interface package).
Inside CoppeliaSim, go to File → Open Scene and open the desired .ttt file from the scenes folder. This loads the full simulation environment, including the Lua and Python scripts attached to the objects (also available in the scripts folder), such as:
- `simulation_time_publisher.lua`: Publishes the current simulation time (in seconds) on the `/simulationTimeROS2` topic using a `std_msgs/msg/Float32` message. Helps synchronize ROS2 nodes with the simulation clock.
- `image_publisher.lua`: Attached to the differential-drive robot's onboard vision sensor. Captures RGB images from the camera and publishes them to `/camera/image` as `sensor_msgs/msg/Image`.
- `differential_drive.lua`: Attached to the differential-drive robot. Implements inverse kinematics: it subscribes to `/cmd_vel` (linear and angular velocity commands), converts them into left and right wheel angular speeds, and applies them. It also publishes the actual measured wheel speeds to `/VelocityEncL` and `/VelocityEncR`.
- `green_sphere_movement.py`: Controls a green sphere in the simulation, moving it back and forth along the Y-axis in a sinusoidal pattern. Used as the visual servoing target.
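The `/cmd_vel`-to-wheel-speed conversion performed by `differential_drive.lua` is the standard differential-drive inverse kinematics. A minimal Python sketch (the wheel radius and track width below are placeholder values, not the robot's actual parameters):

```python
def cmd_vel_to_wheel_speeds(v, omega, wheel_radius=0.033, track_width=0.16):
    """Map body velocities (m/s, rad/s) to left/right wheel angular speeds (rad/s)."""
    # Each wheel's linear speed is the body speed offset by the rotation term,
    # then divided by the wheel radius to obtain an angular speed.
    w_left = (v - omega * track_width / 2.0) / wheel_radius
    w_right = (v + omega * track_width / 2.0) / wheel_radius
    return w_left, w_right
```

Driving straight yields equal wheel speeds; a pure rotation yields equal and opposite ones.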
First, this project implements a visual servoing system where a differential-drive robot tracks and follows a green sphere using point-to-point PID control. A camera mounted on the robot detects the sphere through HSV color segmentation. The centroid of the sphere is used to compute a heading correction, from which a target waypoint is generated a fixed distance ahead. This waypoint is sent to a PID controller that computes the linear and angular velocities needed to drive the robot toward the goal using odometry feedback.
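The detection-to-waypoint pipeline described above can be sketched as follows; the HSV bounds, pixel-to-heading gain, and lookahead distance are illustrative assumptions, not the values used by the actual node:

```python
import numpy as np

def sphere_waypoint(hsv_image, lower=(40, 80, 80), upper=(80, 255, 255),
                    lookahead=0.5, gain=0.002):
    """Segment an HSV image, take the blob centroid, and place a waypoint
    a fixed distance ahead along the corrected heading (robot frame)."""
    lo, hi = np.array(lower), np.array(upper)
    mask = np.all((hsv_image >= lo) & (hsv_image <= hi), axis=2)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                      # no sphere detected
    cx = xs.mean()                       # blob centroid column
    # Horizontal offset from the image center -> heading correction (radians).
    heading = -gain * (cx - hsv_image.shape[1] / 2.0)
    return lookahead * np.cos(heading), lookahead * np.sin(heading)
```

A blob centered in the image yields a waypoint straight ahead; a blob left of center bends the waypoint to the left.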
Here’s a quick breakdown of the involved files:
- `sim_diff_drive_vision/green_sphere_following.py`: Detects the green sphere in camera images and publishes a 2D waypoint ahead of the robot.
- `sim_diff_drive_control/localization.py`: Computes the robot's pose using dead reckoning from wheel speed data and simulation time.
- `sim_diff_drive_control/point_controller.py`: PID point-to-point controller that computes velocity commands (`cmd_vel`) based on the robot's pose and goal.
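The control law in point_controller.py can be sketched with its proportional core (the actual node also carries integral and derivative terms; the gains and tolerance below are illustrative):

```python
import math

def point_control(pose, goal, kp_lin=0.8, kp_ang=2.0, goal_tol=0.05):
    """pose = (x, y, theta) from localization; goal = (gx, gy) waypoint.
    Returns (linear, angular) velocity commands for /cmd_vel."""
    x, y, theta = pose
    dx, dy = goal[0] - x, goal[1] - y
    distance = math.hypot(dx, dy)
    if distance < goal_tol:
        return 0.0, 0.0                          # goal reached: stop
    heading_error = math.atan2(dy, dx) - theta
    # Wrap the error to (-pi, pi] so the robot turns the short way round.
    heading_error = math.atan2(math.sin(heading_error), math.cos(heading_error))
    return kp_lin * distance, kp_ang * heading_error
```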
Open the green_sphere_following.ttt scene and start the simulation in CoppeliaSim; then, after building all packages (except sim_ros2_interface), run the system with:
ros2 launch sim_diff_drive_bringup green_sphere_following.launch.py
Next, a Finite State Machine (FSM) approach was developed to let the robot navigate its environment through visual servoing, switching behaviors based on which colored sphere it detects. It is implemented in sim_diff_drive_behavior/local_trajectory_sphere_fsm.py:
- No Sphere: Robot stops and remains idle.
- Green Sphere: Robot moves forward toward the sphere.
- Orange Sphere: Robot turns right until it sees a green sphere again.
- Blue Sphere: Robot turns left until it sees a green sphere again.
- Yellow Sphere: Robot pauses all movement for 5 seconds.
The FSM implements the following states and actions:
- NO_SPHERE_IDLE: Publishes an empty Twist message to stop the robot.
- GREEN_SPHERE_FORWARD: Sends waypoint coordinates to the point-to-point PID controller, as in the earlier green sphere following approach.
- ORANGE_SPHERE_RIGHT: Directly publishes a Twist message with negative angular velocity to turn right.
- BLUE_SPHERE_LEFT: Directly publishes a Twist message with positive angular velocity to turn left.
- YELLOW_SPHERE_PAUSE: Publishes an empty Twist message and starts a timer for the pause duration.
The state transitions follow these rules:
- From NO_SPHERE_IDLE: Transitions to any other state when the corresponding colored sphere is detected.
- From GREEN_SPHERE_FORWARD: Can transition to any other state based on sphere detection.
- From ORANGE_SPHERE_RIGHT or BLUE_SPHERE_LEFT: Transitions to GREEN_SPHERE_FORWARD if a green sphere is detected, or to NO_SPHERE_IDLE if no sphere is detected.
- From YELLOW_SPHERE_PAUSE: After the pause duration (5 seconds), transitions to any state based on sphere detection.
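The transition rules above can be condensed into a small table-driven function. The state names follow this README, while the color strings and the pause-timer handling are assumptions about the actual implementation:

```python
PAUSE_SECONDS = 5.0

def next_state(state, color, pause_elapsed=0.0):
    """color is the detected sphere color ('green', 'orange', 'blue',
    'yellow') or None when no sphere is visible."""
    if state == "YELLOW_SPHERE_PAUSE" and pause_elapsed < PAUSE_SECONDS:
        return state                              # still pausing
    if state in ("ORANGE_SPHERE_RIGHT", "BLUE_SPHERE_LEFT"):
        if color == "green":
            return "GREEN_SPHERE_FORWARD"
        if color is None:
            return "NO_SPHERE_IDLE"
        return state                              # keep turning otherwise
    # IDLE, FORWARD, and an expired PAUSE transition freely on detection.
    return {None: "NO_SPHERE_IDLE",
            "green": "GREEN_SPHERE_FORWARD",
            "orange": "ORANGE_SPHERE_RIGHT",
            "blue": "BLUE_SPHERE_LEFT",
            "yellow": "YELLOW_SPHERE_PAUSE"}[color]
```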
This system uses a general sphere detection node, sim_diff_drive_vision/sphere_detection.py, which recognizes all configured sphere colors and publishes both the detected color and a 2D waypoint ahead of the robot in a custom SphereDetection.msg message.
If multiple colors are present in the scene, the detection algorithm prioritizes green spheres for the waypoint calculation; if no green sphere is visible, it uses the color with the largest blob area.
The color attribute of the message is used for state transitions by the FSM node.
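The prioritization rule can be sketched as follows (the function name and the area-keyed input format are hypothetical, not taken from the detection node):

```python
def pick_target_color(blob_areas):
    """blob_areas: {color_name: blob_area_in_pixels} for all detected spheres.
    Green always wins; otherwise fall back to the largest blob."""
    if not blob_areas:
        return None
    if "green" in blob_areas:
        return "green"
    return max(blob_areas, key=blob_areas.get)
```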
To create the local route, special spheres in the scene change color when the robot triggers proximity sensors. These interactive spheres use scripts that listen for sensor detection events and modify their color accordingly, allowing the robot to respond dynamically to its environment.
Open the local_path_FSM.ttt scene and start the simulation in CoppeliaSim; then, after building all packages (except sim_ros2_interface), run the setup with:
ros2 launch sim_diff_drive_bringup local_trajectory_sphere_fsm.launch.py