RobotTracking is a robotic hand system combining Python (OpenCV, Mediapipe, cvzone) and Arduino Mega with MG996R servos. Using computer vision and UART communication, it replicates human gestures in real time. Based on inMoov3D models, it enables Human-Machine Interaction, Human-Robot Interaction, and teleoperation.

AlecWaumans/RoboticTracking_Human-Machine

RobotTracking – Hand Tracking, Human-Robot Interaction & Teleoperation


Overview

RobotTracking is a project focused on building a robotic hand controlled through real-time hand tracking using a camera.
It belongs to the following domains:

  • Human-Machine Interface (HMI) → The camera and algorithms serve as a natural interface between human gestures and robotic commands.
  • Human-Robot Interaction (HRI) → The robotic hand reacts in real time by imitating human movements.
  • Teleoperation → Enables controlling a robotic system remotely by replicating human gestures.

The goal is to reproduce human hand gestures in real time using a system combining:

  • Python (computer vision) for hand detection and image processing.
  • Arduino Mega (real-time control) for servo management.
  • MG996R servo motors and a 3D-printed robotic hand for physical execution.

General Workflow

  1. Computer Vision (Python)

    • Hand detection using Mediapipe (keypoints extraction).
    • Image processing via OpenCV, simplified with cvzone.
    • Natural Human-Machine Interface: gestures directly control the robot.
  2. Communication

    • Data is transmitted over UART (serial USB communication).
    • Example: detected finger angle → sent as a serial command → interpreted by Arduino.
  3. Arduino & Actuators

    • Arduino receives servo angle data.
    • MG996R servos pull fishing wires attached to robotic fingers.
    • Direct Human-Robot Interaction: human gestures reproduced by mechanical motion.
  4. 3D Models

    • The robotic hand structure is based on the inMoov3D humanoid open-source model,
      adapted and printed in PLA using 3D printing technology.
    • Fishing wires and elastic bands ensure mechanical articulation and finger return.
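
The data flow in steps 1–3 can be sketched in a few lines of Python. This is a simplified illustration, not the project's actual source: the function names and the frame layout (`$` followed by five 3-digit angles) are assumptions made for the example.

```python
def flexion_to_servo_angle(flexion, lo=0, hi=180):
    """Linearly map a normalized finger flexion value (0.0 = fully open,
    1.0 = fully closed) to a servo angle in degrees."""
    flexion = max(0.0, min(1.0, flexion))   # clamp noisy vision input
    return round(lo + flexion * (hi - lo))

def make_frame(angles):
    """Pack five servo angles into one ASCII frame, e.g. b'$000045090135180\\n'.
    Fixed-width fields keep parsing trivial on the Arduino side."""
    assert len(angles) == 5
    return ("$" + "".join("%03d" % a for a in angles) + "\n").encode("ascii")

# Example: thumb fully open, index half closed, remaining fingers closed.
angles = [flexion_to_servo_angle(f) for f in (0.0, 0.5, 1.0, 1.0, 1.0)]
frame = make_frame(angles)   # this frame would be written to the serial port
```

On the Arduino side, the matching parser reads the frame byte by byte and drives one servo per field.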

Project Structure

RobotTracking-main/
│── setup.sh                 # Installation script
│── run.sh                   # Execution script
│── requirements.txt          # Python dependencies
│── LICENSE                  # MIT license
│── README.md                 # Documentation
│── Rapport.pdf               # Original report (content integrated here)
│── .gitignore                # Git ignore list
│
├── Tracking Python/          # Computer Vision
│   └── Vision.py             # Hand tracking script
│
└── Algo Arduino/             # Arduino control logic
    └── MoveFinger.ino        # Arduino servo control code

Visuals

Demonstration images (see repository): robotic hand in open and closed positions.


Installation

Main libraries:

  • OpenCV → Image processing
  • Mediapipe → Hand detection and landmark extraction
  • cvzone → Simplifies OpenCV + Mediapipe integration
  • pyserial → UART communication with Arduino
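
A requirements.txt covering these libraries might look like the following (the repository's own requirements.txt is authoritative; no version pins are shown here, though the project requires Python 3.8):

```
opencv-python
mediapipe
cvzone
pyserial
```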

Setup & Run the system

# Automatic setup
bash setup.sh

# Start hand tracking
bash run.sh

Features

  • Human-Machine Interface (HMI) → Control via natural gestures.
  • Real-time Hand Tracking → Finger detection and articulation.
  • UART Communication → Reliable data transfer between Python and Arduino.
  • Human-Robot Interaction (HRI) → Robot responds to human input in real time.
  • Teleoperation → Enables remote robotic hand control.
  • Servo control (MG996R) → Mechanical finger articulation.

Technical Details

Tracking Python (Tracking Python/Vision.py)

  • Uses cvzone + Mediapipe + OpenCV for finger position detection.
  • Extracts hand keypoints and maps them to servo angles.
  • Sends values to Arduino through UART.
  • Requires Python 3.8.
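
A minimal sketch of this tracking loop is shown below. It is an illustration of the approach, not the project's Vision.py: the serial port name, baud rate, and open/closed angle mapping are assumptions. The camera and serial parts sit inside `main()`, which is only meant to run on a machine with a webcam and an Arduino attached.

```python
def fingers_to_angles(fingers, open_angle=0, closed_angle=180):
    """Map cvzone-style finger states (1 = raised, 0 = folded) to servo
    angles: a raised finger stays open, a folded finger closes."""
    return [open_angle if f else closed_angle for f in fingers]

def main():
    # Hardware-dependent part: needs a webcam, cvzone/mediapipe, pyserial.
    import cv2
    import serial
    from cvzone.HandTrackingModule import HandDetector

    cap = cv2.VideoCapture(0)
    detector = HandDetector(maxHands=1, detectionCon=0.8)
    link = serial.Serial("/dev/ttyACM0", 9600)   # port/baud are assumptions

    while True:
        ok, img = cap.read()
        if not ok:
            break
        hands, img = detector.findHands(img)
        if hands:
            fingers = detector.fingersUp(hands[0])   # e.g. [1, 1, 0, 0, 0]
            angles = fingers_to_angles(fingers)
            link.write(("$" + "".join("%03d" % a for a in angles) + "\n").encode())
        cv2.imshow("RobotTracking", img)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

# main() is not invoked here; call it on a machine with the hardware attached.
```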

Arduino Logic (Algo Arduino/MoveFinger.ino)

  • Written in C++ (Arduino IDE).
  • Reads incoming serial data.
  • Controls MG996R servos accordingly.
  • Servos pull fishing wires → robotic fingers mimic human gestures.
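
MoveFinger.ino itself is C++, but its parsing logic can be illustrated in Python. The frame layout (`$` plus five 3-digit angles) is an assumption for the sake of the example; the actual protocol is defined in the project's source.

```python
def parse_frame(raw):
    """Decode one ASCII frame of the form b'$000045090135180\\n' into five
    servo angles, mirroring what the Arduino loop would do with incoming
    serial bytes. Returns None on malformed input so corrupted frames can
    be skipped instead of moving the servos unpredictably."""
    text = raw.decode("ascii", errors="replace").strip()
    if len(text) != 16 or not text.startswith("$"):
        return None
    digits = text[1:]
    if not digits.isdigit():
        return None
    angles = [int(digits[i:i + 3]) for i in range(0, 15, 3)]
    # Reject out-of-range values rather than drive servos past their limits.
    if any(not 0 <= a <= 180 for a in angles):
        return None
    return angles
```

On the Arduino, each decoded angle would then be passed to `Servo.write()` for the corresponding finger.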

Results

  • The robotic hand successfully replicates human gestures with good responsiveness.
  • Smooth finger articulation achieved using servos and elastic return.
  • Effective HMI: intuitive gesture-based control.
  • Tangible HRI: direct interaction between human and machine.
  • Limitations:
    • Latency between gesture detection and servo action.
    • Sensitivity to lighting and camera conditions.
    • Mechanical constraints (elasticity of wires, friction).

Authors

  • Waumans Alec
  • Student in Industrial Computer Science – 29/03/2024

License

This project is licensed under the terms of the MIT License.
You are free to use, modify, and distribute it, provided that credit is given to the original authors.

For more details, please refer to the LICENSE file included in this repository.

Future Improvements

  • Optimize UART communication (e.g., faster or custom protocols).
  • Improve mechanical robustness (reduce friction and wire elasticity).
  • Extend to a complete robotic hand or arm.
  • Develop graphical user interfaces (GUI) for teleoperation.
  • Integrate AI for advanced gesture recognition and more natural HRI.
