2024
In this project, I designed a system that drives a servo on each finger of a robotic hand, supporting eight distinct gestures selected via three switches on the DE-Nano10, with a dedicated LED indicating the active gesture. I also developed gesture detection software in Python for real-time recognition of hand gestures and movement.
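Since three switches give 2^3 = 8 combinations, each switch setting can select one gesture and light its corresponding LED. The sketch below illustrates that mapping conceptually in Python; the gesture names and LED indices are placeholder assumptions, and the actual selection logic runs on the FPGA board.

```python
# Conceptual switch-to-gesture mapping (the real logic lives on the FPGA).
# Gesture names and LED indices below are illustrative placeholders.
GESTURES = {
    0b000: ("open palm",  0),
    0b001: ("fist",       1),
    0b010: ("point",      2),
    0b011: ("peace",      3),
    0b100: ("thumbs up",  4),
    0b101: ("ok sign",    5),
    0b110: ("rock",       6),
    0b111: ("wave",       7),
}

def select_gesture(sw2: int, sw1: int, sw0: int) -> tuple[str, int]:
    """Combine the three switch bits into a 3-bit code and look up the gesture and its LED."""
    code = (sw2 << 2) | (sw1 << 1) | sw0
    return GESTURES[code]

# Example: switches set to 1, 0, 1 select gesture 0b101 ("ok sign") and LED 5.
print(select_gesture(1, 0, 1))
```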
Developing an interactive robotic hand that responds accurately to human gestures requires real-time gesture detection and precise motor control. This project aimed to address this challenge, enabling a robotic hand to mimic human gestures with high accuracy.
I explored various computer vision techniques to achieve reliable gesture detection. After validating the system with multiple test subjects under different lighting conditions, I optimized the algorithm for real-time performance.
Using Python and OpenCV, I built a gesture recognition model that detects hand position and sends commands to drive the robotic hand's servos. The DE-Nano10 FPGA board provided precise control of the servo motors, while an Arduino managed the LED indicator for each gesture.
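A minimal sketch of the detection-to-actuation loop is shown below. It assumes MediaPipe Hands for landmark extraction (a common companion to OpenCV, not necessarily the model used here), a USB serial link to the controller, and a one-byte gesture command; the port name, baud rate, and finger-counting heuristic are all illustrative assumptions.

```python
# Sketch: detect a hand gesture per frame and send a one-byte command over serial.
# MediaPipe Hands, the serial port, and the gesture encoding are assumptions.
import cv2
import mediapipe as mp
import serial

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertip landmarks
FINGER_PIPS = [6, 10, 14, 18]   # corresponding PIP joints

def count_extended_fingers(landmarks) -> int:
    """Count raised fingers by comparing each fingertip to its PIP joint (thumb ignored)."""
    count = 0
    for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
        if landmarks[tip].y < landmarks[pip].y:   # tip above joint => finger extended
            count += 1
    return count

def main():
    ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)   # assumed port and baud rate
    cap = cv2.VideoCapture(0)
    hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_hand_landmarks:
                lm = result.multi_hand_landmarks[0].landmark
                gesture_id = count_extended_fingers(lm)   # 0-4 as a stand-in gesture code
                ser.write(bytes([gesture_id]))            # one-byte command to the hand controller
            cv2.imshow("preview", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        cap.release()
        ser.close()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```

In a setup like this, the vision loop stays on the PC and only a compact gesture code crosses the serial link, leaving the timing-critical servo PWM to the FPGA.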
This project demonstrates the potential for AI-driven computer vision to control robotic movements with precision. Future work will focus on refining the gesture recognition model and exploring applications in assistive robotics.