This project demonstrates how to control your system volume using hand gestures. The implementation utilizes computer vision techniques, specifically Mediapipe for hand tracking and OpenCV for image processing, along with Pycaw for controlling the system audio.
- Real-time Hand Tracking: Uses Mediapipe to detect and track hand landmarks in real-time.
- Gesture Recognition: Recognizes the distance between thumb and index finger to adjust the system volume.
- Volume Control: Interacts with the system's audio settings using Pycaw to set the volume based on the recognized gestures.
- Clone the Repository

  ```bash
  git clone https://github.com/yourusername/Volume-Control.git
  cd Volume-Control
  ```

- Create a Virtual Environment

  ```bash
  python -m venv venv
  source venv/bin/activate   # For Unix or macOS
  venv\Scripts\activate      # For Windows
  ```

- Install the Required Packages

  ```bash
  pip install -r requirements.txt
  ```

- Run the Script

  ```bash
  python main.py
  ```
- Ensure your webcam is connected and working.
- Run the `main.py` script.
- The script will open a window displaying the camera feed.
- Use your hand gestures to control the volume:
- Move your thumb and index finger closer together to decrease the volume, or farther apart to increase it.
- A visual feedback bar on the screen indicates the current volume level.
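The feedback bar is simply a rectangle whose filled height tracks the current volume. A minimal sketch of that mapping, assuming a hypothetical bar drawn between y = 400 (empty) and y = 150 (full); these pixel coordinates are illustrative, not taken from the project's source:

```python
def volume_bar_top(volume_percent: float, y_empty: int = 400, y_full: int = 150) -> int:
    """Return the y coordinate of the bar's filled top edge.

    volume_percent is clamped to [0, 100]; the bar fills upward from
    y_empty (0 %) to y_full (100 %). Coordinates here are assumptions
    for illustration only.
    """
    volume_percent = max(0.0, min(100.0, volume_percent))
    return round(y_empty - (y_empty - y_full) * volume_percent / 100.0)

# In the OpenCV draw loop this value would feed cv2.rectangle, e.g.:
# cv2.rectangle(img, (50, volume_bar_top(pct)), (85, 400), (0, 255, 0), cv2.FILLED)
```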
The main components of the project include:
- Hand Detection and Tracking: Utilizes Mediapipe's Hand module to detect and track hand landmarks.
- Distance Calculation: Calculates the distance between thumb and index finger to determine the gesture.
- Volume Control: Uses Pycaw to set the system volume based on the gesture.
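The distance step needs only plain geometry on the landmark coordinates: in Mediapipe's hand model, landmark 4 is the thumb tip and landmark 8 is the index-finger tip. A minimal sketch (the `landmarks` container of (x, y) pairs is an assumed shape, e.g. built from `result.multi_hand_landmarks` in the detection loop):

```python
import math

THUMB_TIP, INDEX_TIP = 4, 8  # Mediapipe hand-landmark indices

def finger_distance(landmarks):
    """Euclidean distance between thumb tip and index-finger tip.

    `landmarks` is a sequence of (x, y) pairs indexed by landmark id.
    """
    x1, y1 = landmarks[THUMB_TIP]
    x2, y2 = landmarks[INDEX_TIP]
    return math.hypot(x2 - x1, y2 - y1)
```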
- Initialization: Set up Mediapipe, Pycaw, and OpenCV.
- Hand Landmark Detection: Detect and draw hand landmarks on the webcam feed.
- Gesture Recognition: Calculate the distance between specific landmarks (thumb and index finger) to determine the gesture.
- Volume Adjustment: Map the gesture distance to the system volume range and adjust accordingly.
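The volume-adjustment step amounts to a clamped linear interpolation from the observed pinch-distance range onto the system volume range. A sketch under assumed ranges (the 20–200 pixel pinch span and the −65.25…0 dB endpoints that Pycaw's `GetVolumeRange()` commonly reports are illustrative, not measured from this project):

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max], clamped."""
    value = max(in_min, min(in_max, value))
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

# Example: a pinch distance of 110 px lands at the midpoint of the dB range.
# In main.py the result would then be handed to Pycaw, e.g.:
# volume.SetMasterVolumeLevel(level, None)
level = map_range(110, 20, 200, -65.25, 0.0)
```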
