Created on 21st July 2024
The “Voice Assistant with Hand Sign Interpretations” represents a groundbreaking leap in facilitating communication for speech and hearing-impaired individuals. By harnessing advanced camera technology and machine learning, this innovative project interprets hand sign language in real-time, enabling seamless interaction between sign language users and the broader community. This solution empowers the speech and hearing impaired, fostering greater inclusivity and understanding in society.
In addition to its primary function, the integration of a voice assistant significantly enhances the project's utility, providing users with the convenience of voice-activated commands and information retrieval. This dual functionality ensures that users can leverage both sign language interpretation and the extensive capabilities of a voice assistant, creating a versatile and accessible tool for everyday communication. The “Voice Assistant with Hand Sign Interpretations” exemplifies the transformative potential of technology in creating a more inclusive and connected world.
At first it was hard to find and integrate the APIs our project required, but we eventually settled on a suitable one.
The hand sign language translation also had a problem: it extracted more features than required. We later restricted the extraction to a fixed limit, which improved the efficiency of the working model.
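The idea of capping feature extraction can be sketched as follows. This is a hypothetical illustration, not the project's actual code: it assumes a MediaPipe-style 21-landmark hand model and keeps only the five fingertip landmarks (indices 4, 8, 12, 16, 20 in MediaPipe's numbering), shrinking each frame's feature vector from 63 values to 15.

```python
# Hypothetical sketch of restricting feature extraction, assuming a
# MediaPipe-style hand model with 21 (x, y, z) landmarks per frame.

FINGERTIP_IDS = [4, 8, 12, 16, 20]  # thumb..pinky fingertips in MediaPipe indexing

def extract_features(landmarks, keep=FINGERTIP_IDS):
    """Reduce 21 (x, y, z) landmarks to a small, wrist-relative feature vector."""
    wrist = landmarks[0]  # landmark 0 is the wrist
    features = []
    for i in keep:
        x, y, z = landmarks[i]
        # Subtract the wrist position so features are translation-invariant.
        features.extend([x - wrist[0], y - wrist[1], z - wrist[2]])
    return features

if __name__ == "__main__":
    # 21 dummy landmarks -> 15 features instead of the full 63
    dummy = [(i * 0.01, i * 0.02, 0.0) for i in range(21)]
    print(len(extract_features(dummy)))  # 15
```

Feeding the classifier a fixed, smaller vector like this both speeds up inference and avoids learning from irrelevant landmark jitter.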
Our next problem was hosting: the backend server must be running before the website can be used, which requires a dynamic hosting platform that we could not obtain. So we currently host the website locally, starting the server first and then serving the site via a live server through index.html. The good news is that every feature of the project runs exactly as shown in our demo video, and the project can be run live by git cloning our repo.
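The "start the server first, then open the site" workflow can be sketched with only the Python standard library. This is a minimal illustration under assumptions: the repo's actual backend framework, routes, and port may differ.

```python
# Minimal sketch of the two-step local launch: start the backend server in the
# background, then open index.html in a browser. Framework and port are
# assumptions; the real project's server may be entirely different.
import http.server
import threading

def start_server(port=0):
    """Serve files from the current directory (so index.html is reachable
    once the server is up). port=0 lets the OS pick a free port."""
    server = http.server.ThreadingHTTPServer(
        ("127.0.0.1", port), http.server.SimpleHTTPRequestHandler
    )
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = start_server(8000)
    print("Serving on http://127.0.0.1:8000/ - now open index.html in a browser")
```

A platform that runs a long-lived process (rather than static-file hosting alone) would remove this manual step, which is why a dynamic hosting platform was needed.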
Technologies used