HAND GESTURE CONTROL PRESENTATION

Use the power of computer vision to control presentations with just hand gestures.

The problem HAND GESTURE CONTROL PRESENTATION solves

During the COVID phase, all our classes were online and teachers taught using PPTs. Most teachers are not well versed with technology, nor do they have the means to invest in touch-screen laptops or styluses for annotation. This is where Hand Gesture Control Presentation comes in. By using hand gestures as input, the project eliminates the need for traditional input devices such as a mouse or keyboard, providing a more natural and intuitive way to control a presentation. This is particularly useful when a presenter needs hands-free control, or when traditional input devices are impractical or inaccessible.
Overall, the project enhances the user experience by providing a hands-on and interactive approach to navigating and interacting with presentation slides.

Here's an explanation of each gesture implemented in the code:

Left Gesture: This gesture is detected when the hand is above the gesture threshold and the thumb is extended while the other fingers are closed. It allows you to navigate to the previous slide in the presentation. The code checks if the fingers are [1, 0, 0, 0, 0] (ordered thumb to little finger), indicating that only the thumb is extended.

Right Gesture: This gesture is detected when the hand is above the gesture threshold and the little finger is extended while the other fingers are closed. It allows you to navigate to the next slide in the presentation. The code checks if the fingers are [0, 0, 0, 0, 1], indicating that only the little finger is extended.

Draw Mode Gesture: This gesture is detected when the hand is below the gesture threshold and the index and middle fingers are extended while the other fingers are closed. It enables draw mode, where you can draw on the current slide using your index finger. The code checks if the fingers are [0, 1, 1, 0, 0], indicating that the index and middle fingers are extended.

Delete Annotation Gesture: This gesture deletes the most recently drawn annotation from the current slide.
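The finger-pattern checks above can be sketched as a small pure function. This is an illustrative sketch, not the project's exact code: the function name, the threshold value of 300, and the action labels are assumptions; the finger list follows the common [thumb, index, middle, ring, little] ordering used by cvzone's fingersUp:

```python
def classify_gesture(fingers, hand_y, gesture_threshold=300):
    """Map a fingers-up pattern and hand height to a slide action.

    fingers  -- list of five 0/1 flags: [thumb, index, middle, ring, little]
    hand_y   -- y coordinate of the hand (pixels; image origin is top-left,
                so "above the threshold line" means a SMALLER y value)
    gesture_threshold -- assumed y position of the gesture line
    """
    above_line = hand_y < gesture_threshold

    if above_line and fingers == [1, 0, 0, 0, 0]:
        return "previous_slide"          # only the thumb extended
    if above_line and fingers == [0, 0, 0, 0, 1]:
        return "next_slide"              # only the little finger extended
    if not above_line and fingers == [0, 1, 1, 0, 0]:
        return "draw_mode"               # index and middle fingers extended
    return None                          # no recognized gesture
```

In the real loop, `fingers` would come from something like cvzone's `HandDetector.fingersUp()` on each camera frame, and the returned action would advance the slide index or toggle drawing.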

Challenges I ran into

One of the main challenges is accurately detecting and tracking the hand in real time. Variations in lighting conditions, different hand shapes and sizes, occlusions, and background clutter can all pose challenges to robust hand detection and tracking.

Another major issue was introducing delays so that slides did not advance too quickly, and reducing false positives and noise in gesture detection.
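One common way to implement such a delay is a frame-based cooldown that blocks further gestures for a fixed number of frames after one fires. This is a sketch under assumed values (the class name and the 30-frame delay, roughly one second at 30 FPS, are illustrative):

```python
class GestureCooldown:
    """Ignore further gestures for a fixed number of frames after one fires."""

    def __init__(self, delay_frames=30):   # assumed: ~1 s at 30 FPS
        self.delay_frames = delay_frames
        self.counter = 0
        self.blocked = False

    def fire(self):
        """Return True if a gesture may trigger now, then start the cooldown."""
        if self.blocked:
            return False
        self.blocked = True
        return True

    def tick(self):
        """Call once per camera frame to advance the cooldown timer."""
        if self.blocked:
            self.counter += 1
            if self.counter >= self.delay_frames:
                self.counter = 0
                self.blocked = False
```

Calling `fire()` when a slide gesture is detected, and `tick()` once per frame, prevents a single held gesture from flipping through several slides at once.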

Also, hand tracking is less reliable toward the edges of the frame, so NumPy's interp was used to map half of the camera's x-range onto the full slide width, keeping the presenter's hand away from the unreliable edge.
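That remapping can be sketched with `np.interp`: only the right half of the camera's x-range, [width // 2, width], is stretched to cover the whole slide. The 1280-pixel widths below are assumed values for illustration:

```python
import numpy as np

CAM_WIDTH = 1280     # assumed camera frame width in pixels
SLIDE_WIDTH = 1280   # assumed slide image width in pixels

def map_finger_x(x):
    """Map an x coordinate from the right half of the frame to the full slide.

    np.interp clips inputs outside [CAM_WIDTH // 2, CAM_WIDTH] to the
    nearest edge, so the result always stays inside the slide.
    """
    return int(np.interp(x, [CAM_WIDTH // 2, CAM_WIDTH], [0, SLIDE_WIDTH]))
```

With this mapping, moving the fingertip from the frame's center to its right edge sweeps the pointer across the entire slide.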

Users have different hand sizes, shapes, and movements, which may affect the accuracy of hand tracking and gesture recognition.

Tracks Applied (1)

Quine Track

It falls under the Education Track.
