Touchless Human-Robot Interaction Using Computer Vision & Wireless Communication
A human-computer interaction (HCI) project that enables touchless, intuitive control of a robotic manipulator. By leveraging computer vision and wireless communication, the system translates natural hand movements into precise mechanical actions in real time, bridging the gap between digital gesture recognition and physical actuation.
The system replaces physical controllers such as joysticks and keyboards with a standard webcam that tracks hand movements, achieving 95%+ recognition accuracy while providing immediate visual feedback through a skeletal overlay.
A computer vision pipeline on the host machine that captures video, tracks hand landmarks, and classifies gestures using MediaPipe
A wireless Bluetooth link (HC-05) that transmits encoded command signals from the host computer to the robot controller
An Arduino-based embedded system that parses the received commands and drives the servo motors to replicate the intended motion
95%+ recognition accuracy using MediaPipe's ML model to detect 21 3D hand keypoints, with a skeletal overlay for visual feedback (see the detection-loop sketch after this list)
The HC-05 module enables untethered operation with low-latency command transmission (~10 m range)
Maps specific hand gestures to robotic axes: index up (lift), fist (grip), open palm (release), hand tilt (drive base)
A wheeled chassis enables pick-and-place tasks through a unified gesture interface for manipulation and navigation
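A minimal sketch of the detection loop, assuming the legacy mediapipe.solutions Python API; the camera index and window name are illustrative:

```python
# Minimal detection loop: webcam capture -> MediaPipe Hands -> skeletal overlay.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam; index is illustrative
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # Draw the 21-keypoint skeleton for immediate visual feedback
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("Hand Tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```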
The vision loop captures frames, converts them to RGB, and processes them through the MediaPipe Hands solution, extracting 21 landmark coordinates. Custom logic then calculates Euclidean distances between landmarks (e.g., thumb tip to index tip) for dynamic gesture recognition.
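A sketch of that distance check; the landmark indices follow MediaPipe's hand model (4 = thumb tip, 8 = index fingertip), while the pinch threshold is an assumed value:

```python
import math

# MediaPipe hand-landmark indices: 4 = thumb tip, 8 = index fingertip
THUMB_TIP, INDEX_TIP = 4, 8

def landmark_distance(hand, a, b):
    """Euclidean distance between two normalized hand landmarks."""
    p, q = hand.landmark[a], hand.landmark[b]
    return math.sqrt((p.x - q.x) ** 2 + (p.y - q.y) ** 2 + (p.z - q.z) ** 2)

def is_grip(hand, threshold=0.05):
    """Treat a thumb-to-index pinch as the 'grip' gesture (threshold assumed)."""
    return landmark_distance(hand, THUMB_TIP, INDEX_TIP) < threshold
```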
A mapping layer translates abstract gestures into specific servo angles; for example, a "V-sign" maps to "set servo 2 to 90 degrees", ensuring predictable movement patterns for intuitive control.
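A hypothetical lookup table for this mapping; only the V-sign entry reflects the example above, the other pairs are illustrative placeholders:

```python
# Gesture-to-servo lookup. Only the V-sign entry comes from the write-up;
# the remaining pairs are assumed placeholders.
GESTURE_TO_SERVO = {
    "v_sign":    (2, 90),   # set servo 2 to 90 degrees (from the write-up)
    "index_up":  (1, 120),  # assumed: lift axis
    "fist":      (3, 20),   # assumed: close gripper
    "open_palm": (3, 150),  # assumed: open gripper
}

def servo_target(gesture):
    """Return (servo_id, angle) for a recognized gesture, or None."""
    return GESTURE_TO_SERVO.get(gesture)
```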
The communication layer implements a concise protocol that sends single-character command bytes rather than complex strings, so the Arduino can process each instruction within the millisecond loop cycle required for smooth servo PWM generation.
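A host-side sender along these lines, assuming pyserial with the HC-05 bound to a serial port; the port name and command characters are illustrative:

```python
# Host-side sender using pyserial; the port name and the command
# characters are assumptions for illustration.
import serial

bt = serial.Serial("/dev/rfcomm0", baudrate=38400, timeout=1)

def send_command(cmd: str) -> None:
    """Transmit one single-byte command, e.g. 'G' for grip or 'R' for release."""
    bt.write(cmd.encode("ascii"))

send_command("G")  # the Arduino loop reads this byte and sets the gripper servo
```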
✓ Implemented a moving-average smoothing filter and a confidence threshold (min_detection_confidence=0.7) to ignore low-quality frames (see the smoothing sketch after this list)
✓ Powered the motors from a separate high-current source while keeping the Arduino logic on a stable regulated supply with a shared common ground
✓ Optimized the Python loop by lowering the capture resolution (640x480) and raising the Bluetooth baud rate (38400+) to maximize throughput
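A minimal sketch of such a moving-average filter applied to a noisy landmark coordinate stream; the five-frame window is an assumed value:

```python
from collections import deque

class MovingAverage:
    """Moving-average filter for a noisy scalar stream, e.g. a landmark
    coordinate; the default window of 5 frames is an assumed value."""
    def __init__(self, window: int = 5):
        self.buf = deque(maxlen=window)

    def update(self, value: float) -> float:
        self.buf.append(value)
        return sum(self.buf) / len(self.buf)

smooth_x = MovingAverage()
# x = smooth_x.update(hand.landmark[8].x)  # smooth the index-tip x each frame
```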
Prosthetic limb control using muscle signals or visual gestures
Remote operation of heavy machinery from a safe distance, without contact
Touchless control of surgical tools in sterile environments
Hand-sign control of appliances for elderly or disabled users
A platform for HCI and robotics learning and experimentation
Interactive gaming and immersive gesture-based experiences
The Gesture-Controlled Robot Arm successfully demonstrates the power of integrating modern AI-based computer vision with classical embedded robotics. It creates a seamless, intuitive interface that makes controlling complex hardware as simple as waving a hand, highlighting the future of natural Human-Computer Interaction.
This project demonstrates advanced techniques in human-robot interaction, computer vision, and wireless embedded systems. Feel free to reach out for collaboration or technical discussions.