First Talk
This app bridges the gap between Deaf/Mute people and hearing people. It allows Deaf/Mute people to have regular conversations on a day-to-day basis, using our live detection system.
Created on 15th May 2022
The problem First Talk solves
Because of their disabilities, Deaf/Mute people find it difficult to have regular conversations with hearing people on a day-to-day basis. This app aims to bridge that gap. Using live detection technology, a Deaf/Mute user performs sign language, the app recognizes the signs, and it outputs them as text. The app also has a learning section where hearing users can take courses to learn sign language.
FEATURES OF THIS APP
-> Our app is built around its detection technology: Deaf/Mute users perform sign language, and the app outputs the text they are trying to communicate.
-> Using the same detection technology, users can perform Google and YouTube searches through sign language.
-> Beyond the detection technology, the app contains a learning section where anyone can take free courses to learn sign language from scratch.
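The sign-to-text detection step above can be sketched as running a TensorFlow Lite classifier over features extracted from a camera frame. This is a minimal illustration, not the app's actual code: the label list, feature size, and stand-in model are all hypothetical, and the real app would feed in features from its own trained gesture model.

```python
import numpy as np
import tensorflow as tf

# Hypothetical gesture vocabulary; the real app's label set is not documented here.
LABELS = ["hello", "thanks", "yes"]

# Tiny stand-in classifier so the sketch is self-contained; the app would load
# its trained sign-language model instead.
model = tf.keras.Sequential(
    [tf.keras.layers.Dense(len(LABELS), input_shape=(16,), activation="softmax")]
)
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Run the converted model with the TFLite interpreter, as the Android runtime would.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Stand-in for features extracted from one camera frame.
frame_features = np.random.rand(1, 16).astype(np.float32)
interpreter.set_tensor(inp["index"], frame_features)
interpreter.invoke()
probs = interpreter.get_tensor(out["index"])[0]

# The predicted label is the text shown to the hearing user.
predicted_text = LABELS[int(np.argmax(probs))]
print(predicted_text)
```

On Android, the same `.tflite` model would be invoked through the TFLite Java/Kotlin interpreter rather than the Python API shown here.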
Challenges we ran into
-> Integrating the machine learning model with the Android Studio app.
- Our model was exported as a Keras .h5 file, but it did not work with the Android Studio code. After converting the .h5 file to a .tflite file, the integration worked.
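The .h5-to-.tflite conversion described above is typically done with TensorFlow's converter API. A minimal sketch, assuming a Keras model saved in HDF5 format (the file names and the tiny demo model below are placeholders, not the project's actual artifacts):

```python
import tensorflow as tf

def convert_h5_to_tflite(h5_path: str, tflite_path: str) -> None:
    # Load the trained Keras model from its HDF5 checkpoint.
    model = tf.keras.models.load_model(h5_path)
    # Convert it to the FlatBuffer format the Android TFLite runtime accepts.
    tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()
    with open(tflite_path, "wb") as f:
        f.write(tflite_bytes)

# Stand-in for the real gesture classifier: a tiny dense model saved as .h5.
demo = tf.keras.Sequential(
    [tf.keras.layers.Dense(4, input_shape=(8,), activation="softmax")]
)
demo.save("demo_model.h5")
convert_h5_to_tflite("demo_model.h5", "demo_model.tflite")
```

The resulting `.tflite` file is what gets bundled into the Android app's assets and loaded by the on-device interpreter.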
