Metafrasimáton

Real-time translation from one sign language to another, using machine learning and image processing.

The problem Metafrasimáton solves

There are about 300 sign languages in the world, and no universal one among them. Hearing- and speech-impaired people who know different sign languages therefore find it difficult to interact with each other. Nor is the problem limited to the impaired: someone who knows one sign language cannot follow a person signing in another. Over the past 30 hours, we built a machine-learning-based real-time sign language translator that translates signs from American Sign Language (ASL) to Indian Sign Language (ISL), converting alphabets from one sign language to the other. A translator along the lines of Google Translate will help the impaired understand other sign languages, and it also brings diversity and inclusion. A rough sketch of the pipeline follows.
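
A minimal sketch of what this translation step can look like, assuming a Keras classifier trained on ASL alphabet images and a folder of ISL reference images. The names asl_model.h5 and isl_signs/ are hypothetical placeholders, not our actual files:

```python
# Hypothetical end-to-end step: classify an ASL hand sign, then look up the
# corresponding ISL sign image so it can be shown to the user.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

LETTERS = [chr(c) for c in range(ord("A"), ord("Z") + 1)]
model = load_model("asl_model.h5")  # hypothetical trained ASL classifier

def translate_sign(frame):
    """Return the predicted ASL letter and the matching ISL sign image."""
    roi = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    roi = cv2.resize(roi, (64, 64)) / 255.0            # match training input
    probs = model.predict(roi.reshape(1, 64, 64, 1), verbose=0)
    letter = LETTERS[int(np.argmax(probs))]
    return letter, cv2.imread(f"isl_signs/{letter}.png")
```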

Challenges we ran into

We ran into a lot of interesting challenges. The main one was real-time image analysis and recognition: with the boundary between the user's hand and the background obscured by noise, finding a good enough fixed threshold was cumbersome. Lighting was a serious problem during testing, and we lost a good amount of time scouting for and moving to spots with suitable conditions. We finally wrote code that calculates the threshold in real time, which solved the problem; a sketch of the idea appears below. The second challenge was that we ran many models on the dataset without getting good accuracy, before settling on a CNN architecture built with Keras, which gave good results (see the second sketch below). We also had to learn the signs properly in order to test the software, since testing depended on how well we signed.
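
The idea behind the real-time threshold can be illustrated with Otsu's method, which recomputes the threshold from each frame's own histogram so the segmentation adapts as the lighting changes. This is a minimal sketch of the approach, not our exact code:

```python
# Per-frame hand segmentation: Otsu's method picks a new threshold for every
# frame, so changing lighting no longer breaks a fixed threshold.
import cv2

def segment_hand(frame):
    """Return a binary mask separating the hand from the background."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (7, 7), 0)     # suppress camera noise first
    _, mask = cv2.threshold(
        blur, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU
    )
    return mask

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("hand mask", segment_hand(frame))
    if cv2.waitKey(1) & 0xFF == ord("q"):        # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```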
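
For the recognizer, a small Keras CNN of the kind we ended up with might look like the following; the layer sizes and hyperparameters here are illustrative, not our final configuration:

```python
# A compact CNN for classifying grayscale hand-sign images into 26 letters.
from tensorflow.keras import layers, models

def build_model(num_classes=26, input_shape=(64, 64, 1)):
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),                 # regularize against overfitting
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```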

Discussion