Blind People Helper AppVision

An Android app that assists visually impaired users in navigating around obstacles

The problem Blind People Helper AppVision solves

Assistive navigation devices proposed in previous studies still focus on location and distance sensing: they can warn a visually impaired user that an obstacle is ahead, but not what kind of obstacle it is. Because distance sensing alone provides no further information about the surroundings, the practicability of such devices is low. Solutions based on RFID chips are expensive, vulnerable to damage from sun and rain, and require the chips to be positioned in advance. This project therefore proposes a navigation system for visually impaired people that uses a smartphone and deep-learning algorithms to recognize various obstacles. The system is not limited to specific indoor or outdoor environments and needs no pre-installed infrastructure, so it works in many more locations while also giving visually impaired users richer information about their surroundings.

The proposed navigation system uses a smartphone to continually capture images of the environment in front of the user, performs image processing and object identification on each frame, and reports the results to the user. With this information, the user gains a more comprehensive understanding of the surroundings: not only the rough direction and distance of an obstacle, but also what the obstacle is.
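One way the reported results could be turned into spoken feedback is sketched below. The class, method names, and the left/center/right split are illustrative assumptions, not the app's actual code: the horizontal center of a detection's bounding box is mapped to a rough direction within the camera frame, and a short message is built for text-to-speech.

```java
// Illustrative sketch: convert an object-detection result into a spoken
// warning. Class/method names and the thirds-based split are assumptions,
// not the app's actual implementation.
public class ObstacleAnnouncer {
    // Map the horizontal center of a bounding box (normalized 0..1)
    // to a rough direction within the camera frame.
    static String direction(float xCenter) {
        if (xCenter < 1f / 3f) return "on your left";
        if (xCenter > 2f / 3f) return "on your right";
        return "ahead";
    }

    // Build the message that would be handed to text-to-speech,
    // from the detected label and the bounding box's x extent.
    static String announce(String label, float xMin, float xMax) {
        return label + " " + direction((xMin + xMax) / 2f);
    }

    public static void main(String[] args) {
        System.out.println(announce("chair", 0.1f, 0.3f));  // chair on your left
        System.out.println(announce("person", 0.4f, 0.6f)); // person ahead
    }
}
```

On Android, the resulting string would typically be spoken via the platform's `TextToSpeech` API rather than printed.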

Challenges we ran into

  1. Limited knowledge of app development (worked through various tutorials and open-source projects)
  2. Frequent crashing (optimised the app)
  3. Slow real-time object detection (solved using the TensorFlow Lite API with a pre-trained quantized COCO SSD MobileNet v1 model)
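A quantized model like COCO SSD MobileNet v1 emits integer tensors, so its scores must be mapped back to floats before thresholding, using the standard affine dequantization formula real = scale × (q − zero_point). The sketch below shows that step in isolation; the scale and zero-point values are placeholders for illustration, since a real app reads them from the TensorFlow Lite tensor's quantization parameters.

```java
// Illustrative sketch of dequantizing and thresholding quantized
// detection scores. The scale/zeroPoint values used in main() are
// placeholders, not taken from the actual model.
public class ScoreFilter {
    // Standard affine dequantization: real = scale * (q - zeroPoint).
    static float dequantize(int q, float scale, int zeroPoint) {
        return scale * (q - zeroPoint);
    }

    // Keep only detections whose dequantized score clears the threshold.
    static boolean keep(int quantizedScore, float scale, int zeroPoint,
                        float threshold) {
        return dequantize(quantizedScore, scale, zeroPoint) >= threshold;
    }

    public static void main(String[] args) {
        float scale = 1f / 255f; // placeholder quantization scale
        int zeroPoint = 0;       // placeholder zero point
        // 204/255 ≈ 0.8, comfortably above a 0.5 confidence threshold.
        System.out.println(keep(204, scale, zeroPoint, 0.5f)); // true
        // 51/255 ≈ 0.2, below the threshold, so this detection is dropped.
        System.out.println(keep(51, scale, zeroPoint, 0.5f));  // false
    }
}
```

Filtering low-confidence detections this way also cuts down how often the app speaks, which matters when announcements are delivered audibly.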
