While there are devices to aid the visually challenged with navigation, this still seems like an area that’s ripe for innovation. As his contribution to this technological realm, college student Satinder Singh has come up with a unique AI and glasses hardware solution, dubbed “DeepWay.”
DeepWay features a camera strapped to the user’s chest, which transmits images to a laptop computer. The computer then uses Python, along with TensorFlow, Keras, and OpenCV, to process the visual input, determining whether the user needs to turn left, turn right, or continue walking straight in order to stay on the road.
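The full pipeline lives in Singh’s repository, but the final decision step amounts to picking the most likely class from the network’s output. Here is a minimal sketch of that step, assuming a three-way softmax over left/straight/right; the function and label names are illustrative, not taken from the DeepWay code:

```python
# Hypothetical sketch of the lane-keeping decision, assuming the network
# emits one softmax probability per direction. Names are illustrative only.

LABELS = ("left", "straight", "right")

def predict_direction(probs):
    """Map a softmax output (three floats) to a steering cue."""
    if len(probs) != len(LABELS):
        raise ValueError("expected one probability per label")
    # Take the argmax over the model's output probabilities.
    best = max(range(len(probs)), key=lambda i: probs[i])
    return LABELS[best]

# Example: the model is most confident the road continues straight.
print(predict_direction([0.1, 0.7, 0.2]))  # -> straight
```

In the real system this decision would be made on every processed camera frame, with the result passed along to the feedback hardware.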
The user interface is handled by an Arduino, which controls a pair of servo motors that poke the wearer behind the ear to indicate which way to go. The system can also recognize faces and stop signs, giving environmental feedback via a pair of earbuds.
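The write-up doesn’t detail how the laptop talks to the Arduino; one plausible scheme, purely an assumption here (the actual protocol is in Singh’s GitHub repo), is a single-byte serial command per decision, which the Arduino then translates into a servo tap on the matching side:

```python
# Hypothetical one-byte serial protocol between the laptop and the Arduino.
# The byte values, and the idea of one command per decision, are assumptions
# for illustration; the real implementation is in Singh's repository.

COMMANDS = {
    "left": b"L",      # tap behind the left ear
    "right": b"R",     # tap behind the right ear
    "straight": b"S",  # stay the course, no tap
}

def direction_to_command(direction):
    """Translate a steering cue into the byte written to the serial port."""
    try:
        return COMMANDS[direction]
    except KeyError:
        raise ValueError(f"unknown direction: {direction!r}")

# With pyserial, sending a command would look something like:
#   import serial
#   port = serial.Serial("/dev/ttyACM0", 9600)
#   port.write(direction_to_command("left"))
print(direction_to_command("left"))  # -> b'L'
```

Keeping the protocol to a single byte keeps latency low on the serial link, which matters when the cue has to arrive while the turn is still relevant.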
As shown in the demo video below, the poking process looks much less invasive than it sounds, and it’s able to work on varied road types. More info on the project is available on Singh’s GitHub page, and everything is open source, with the intent that we can all work together to help those who are visually impaired. For his part, Singh is already working on an improved version 2 of this idea, so it will be exciting to see this technology mature!