AI Based Advanced Navigation Assistant for the Visually Impaired

Authors

  • Gayitri H M, Xavier Fernandis, Shrikant Malagai, Ujwal Naik, Shobit Naik, Kumar J R

DOI:

https://doi.org/10.17762/msea.v71i4.834

Abstract

Vision plays an important role in our existence. Without assisted navigation in outdoor environments, visually impaired people face many problems. This paper presents a system that helps blind people navigate without much difficulty. The system incorporates a voice-based user interface in which the user interacts with the system through a microphone and listens to instructions through speakers. A Raspberry Pi serves as the mini-computer. The Google Maps API is interfaced with the system to provide the user with appropriate directions to the destination. The system also detects obstacles and informs the user of the location and distance of a particular object. The YOLO model is employed to detect objects, and the DisNet model is built on top of the YOLO algorithm to estimate the distance of each detected object. The system is multi-programmed to run its modules simultaneously and attain real-time behavior. The user can query the system for the estimated distance from the current position to the desired destination, the estimated time of travel, and the objects present in the frame of reference.
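The abstract describes estimating object distance with a DisNet-style model layered over YOLO detections. The sketch below illustrates one plausible form of that step; the feature set (inverse relative bounding-box width, height, and diagonal), the camera resolution, and the untrained placeholder MLP weights are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

IMG_W, IMG_H = 640, 480  # assumed camera resolution (not stated in the abstract)

def disnet_features(box_w, box_h):
    """Inverse relative bounding-box width, height, and diagonal for one YOLO detection."""
    rel_w = box_w / IMG_W
    rel_h = box_h / IMG_H
    rel_d = float(np.hypot(rel_w, rel_h))
    return np.array([1.0 / rel_w, 1.0 / rel_h, 1.0 / rel_d])

# Placeholder MLP weights -- illustrative only; a DisNet-style model would be
# trained on labelled object/distance pairs before use.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(scale=0.1, size=(3, 8)), np.zeros(8)
W2, b2 = rng.normal(scale=0.1, size=(8, 1)), np.zeros(1)

def estimate_distance(box_w, box_h):
    """Feed the DisNet-style features through a small MLP to predict distance in metres."""
    x = disnet_features(box_w, box_h)
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU hidden layer
    return float(h @ W2 + b2)

if __name__ == "__main__":
    # e.g. a YOLO detection whose bounding box is 120 x 240 pixels
    print(f"estimated distance: {estimate_distance(120, 240):.2f} m (untrained weights)")
```

In the system described, the predicted distance for each detected object would then be spoken to the user through the voice interface alongside the object's label and position in the frame.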

Published

2022-09-17

How to Cite

Gayitri H M, Xavier Fernandis, Shrikant Malagai, Ujwal Naik, Shobit Naik, Kumar J R. (2022). AI Based Advanced Navigation Assistant for the Visually Impaired. Mathematical Statistician and Engineering Applications, 71(4), 2754–2764. https://doi.org/10.17762/msea.v71i4.834

Issue

Section

Articles