Leap Motion Based AI Mouse with Voice Assistant

"Revolutionize your computing experience with the Leap Motion AI Mouse: Precision at Your Fingertips". With this stellar technology, your gestures become commands, and your movements become actions.

Created on 9th July 2023

The problem Leap Motion Based AI Mouse with Voice Assistant solves

Hand gestures are often the most spontaneous responses in any form of communication. This project proposes an implementation for controlling computer systems through simple gestures using Leap Motion technology. Use cases for a virtual mouse include contactless workstations in times like COVID-19, aid during computer-based presentations, and support for people with carpal tunnel syndrome or other repetitive strain injuries.

The idea behind the proposed work is to use an ordinary desktop camera instead of a standard mouse to manage cursor functions such as clicking and scrolling. It provides an interactive bridge between users and machines without the need for mechanical or physical devices or additional mouse peripherals. The work additionally integrates a laptop voice assistant that performs a varied set of actions through simple voice commands, including launching or stopping the AI mouse to enable fully contactless device control.

The solution controls the cursor's position without any peripheral computer device, using computer vision and deep neural networks. The proposed architecture uses Python and OpenCV alongside libraries such as MediaPipe and Pycaw. The dataset is a combination of three datasets spanning diverse ethnicities, backgrounds, lighting conditions, and hand articulations, chosen to expose the model to more challenging scenarios. Experimental results showed that the proposed methodology achieved an accuracy of 94.6% across a varied set of gestures.
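
As a rough illustration of the camera-based cursor control described above, the sketch below tracks the index fingertip with MediaPipe Hands and drives the cursor with pyautogui. The landmark choices, the pinch threshold used as a click, and the use of pyautogui are assumptions for illustration, not the project's exact implementation.

```python
import cv2
import mediapipe as mp
import pyautogui

# Minimal sketch: one tracked hand, webcam input, fingertip-to-cursor mapping.
mp_hands = mp.solutions.hands
hands = mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7)

screen_w, screen_h = pyautogui.size()
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)                    # mirror so movement feels natural
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB input
    results = hands.process(rgb)

    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        index_tip = lm[mp_hands.HandLandmark.INDEX_FINGER_TIP]
        thumb_tip = lm[mp_hands.HandLandmark.THUMB_TIP]

        # Map the normalized fingertip coordinates to screen coordinates.
        x = int(index_tip.x * screen_w)
        y = int(index_tip.y * screen_h)
        pyautogui.moveTo(x, y)

        # Treat a pinch (thumb tip close to index tip) as a left click.
        # The 0.03 threshold is an illustrative assumption.
        if abs(index_tip.x - thumb_tip.x) < 0.03 and abs(index_tip.y - thumb_tip.y) < 0.03:
            pyautogui.click()

    cv2.imshow("AI Mouse (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == 27:               # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The full project handles a richer gesture set (scrolling, and volume control, for which Pycaw is commonly used), but the mapping from normalized hand landmarks to screen coordinates follows the same idea.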

Challenges we ran into

A few challenges were encountered, such as choosing the right threshold for color segmentation and configuring the right number of stored frames; these were tackled through experimentation. By focusing on areas such as gesture recognition, natural language processing, customization, accessibility, multi-device integration, UI/UX enhancements, compatibility, performance optimization, and user feedback, the project's functionality and usability improved substantially.
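
As a hedged sketch of the two tuning points mentioned above, the snippet below shows an HSV threshold for color segmentation and a fixed-length buffer of stored frames used to smooth the tracked fingertip position. The specific HSV bounds and buffer length are illustrative assumptions, not the values the project settled on.

```python
import cv2
import numpy as np
from collections import deque

# Illustrative HSV range for skin-tone segmentation; the exact thresholds
# were chosen experimentally, as noted above.
LOWER_HSV = np.array([0, 48, 80])
UPPER_HSV = np.array([20, 255, 255])

# Buffer of recent fingertip positions; averaging over the last N frames
# smooths out cursor jitter. N is a tunable assumption.
N_FRAMES = 5
positions = deque(maxlen=N_FRAMES)

def segment_hand(frame_bgr):
    """Return a binary mask of pixels inside the HSV skin range."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    # Remove small speckles so the threshold is less sensitive to noise.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask

def smoothed_position(new_xy):
    """Average the newest fingertip position with the stored frames."""
    positions.append(new_xy)
    xs, ys = zip(*positions)
    return int(sum(xs) / len(xs)), int(sum(ys) / len(ys))
```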

Tracks Applied (1)

Future AI Finalist

The Leap Motion-based AI Mouse with Virtual Voice Assistant support is a cutting-edge solution that aligns perfectly wit...
