ThirdEye: A cross-platform Flutter app that combines TensorFlow Lite, Firebase, some custom datasets, optical character recognition (OCR), and text-to-speech to read data from any form of text and output it as speech for people with visual impairment. We also use custom .tflite models to detect objects, and the device's motion sensors for sensing in real time.
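The core reading flow described above (capture a frame, run OCR on it, speak the recognised text) can be sketched as follows. This is a minimal illustrative sketch in Python, not the app's actual Flutter/Dart code: the stage functions and their stub bodies are assumptions standing in for the real TensorFlow Lite OCR model and the platform text-to-speech engine.

```python
# Illustrative sketch of the ThirdEye reading pipeline: frame -> OCR -> speech.
# All names and stub bodies here are hypothetical; the real app runs a TFLite
# OCR model and a device text-to-speech engine inside Flutter.

from dataclasses import dataclass
from typing import List


@dataclass
class Frame:
    """A camera frame; here just a placeholder payload."""
    pixels: bytes


def run_ocr(frame: Frame) -> str:
    """Stub OCR stage: a real implementation would invoke a TFLite model."""
    # Pretend the model recognised this text in the frame.
    return frame.pixels.decode("utf-8", errors="ignore")


def speak(text: str, sink: List[str]) -> None:
    """Stub TTS stage: a real implementation would call the device TTS engine."""
    sink.append(f"speaking: {text}")


def read_aloud(frame: Frame, sink: List[str]) -> str:
    """Full pipeline: OCR the frame, then speak the recognised text."""
    text = run_ocr(frame)
    speak(text, sink)
    return text


spoken: List[str] = []
result = read_aloud(Frame(pixels=b"EXIT"), spoken)
```

Keeping each stage behind its own function mirrors the on-device design goal: every stage runs locally, so swapping the stubs for TFLite inference and native TTS does not change the pipeline shape.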
We built this to help visually impaired users understand their surroundings better and, at the same time, interact with the external world the way sighted people do. We wanted the app to run entirely on the mobile device, with no cloud services in the backend. This led to our first challenge: the compute available on a phone is very low compared to modern GPUs. We also wanted the application to be as simple to use as possible, so we planned to integrate Google Assistant into the software stack, but due to limited knowledge and time constraints this was not possible. Apart from this, we encountered some minor issues which we solved as the hackathon progressed.
Discussion