IBM Research worked in collaboration with Carnegie Mellon University's Robotics Institute and Shimizu Corporation to create a mobile navigation app called NavCog. It is an iPhone app designed to help people with a range of visual disabilities, giving them directions to navigate indoors, especially in unfamiliar areas. It was first launched in 2017 and has received numerous improvements since.
How does it work?
NavCog uses an algorithm that combines data from smartphone sensors with Bluetooth Low Energy (BLE) beacons. Users simply enter their destination through voice recognition and then receive directions via audio or haptic cues.
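NavCog's own localization code isn't reproduced here, but the kind of raw input it relies on can be sketched with Apple's standard CoreLocation beacon ranging API. The Swift snippet below is an illustrative sketch rather than NavCog's implementation; the beacon UUID and class names are made-up placeholders.

import CoreLocation

// Illustrative sketch only: ranging nearby BLE beacons with CoreLocation.
// The UUID below is a placeholder for a venue's beacon deployment.
final class BeaconScanner: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let venueUUID = UUID(uuidString: "EBEFD083-70A2-47C8-9837-E7B5634DF524")!

    override init() {
        super.init()
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
    }

    func startRanging() {
        manager.startRangingBeacons(satisfying: CLBeaconIdentityConstraint(uuid: venueUUID))
    }

    // Called roughly once per second with the beacons currently in range.
    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        // A localization engine would match these signal strengths against a
        // pre-sampled map to estimate where the user is standing.
        for beacon in beacons {
            print("beacon \(beacon.major)/\(beacon.minor): \(beacon.rssi) dBm")
        }
    }
}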
The beacons help the app locate the user inside the building and give accurate directions, and the app is also designed to take accessible routes into account when guiding users. The current App Store version has been tested in four locations: Pittsburgh International Airport, the Carnegie Mellon University campus, the DoubleTree by Hilton Hotel Green Tree, and the Shimizu Corporation headquarters in Japan.
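How a route planner might take accessibility into account can be illustrated with a toy example: each edge of a route graph is flagged as step-free or not, and edges that aren't step-free are skipped when the user needs an accessible path. The Swift sketch below is a simplified assumption about this idea, not NavCog's actual routing code or map format.

struct Edge { let to: String; let meters: Double; let stepFree: Bool }

// Dijkstra's shortest path, optionally restricted to step-free edges.
func routeLength(from start: String, to goal: String,
                 graph: [String: [Edge]], stepFreeOnly: Bool) -> Double? {
    var dist: [String: Double] = [start: 0]
    var unvisited: Set<String> = [start]
    while let node = unvisited.min(by: { (dist[$0] ?? .infinity) < (dist[$1] ?? .infinity) }) {
        unvisited.remove(node)
        if node == goal { return dist[node] }
        for edge in graph[node] ?? [] where !stepFreeOnly || edge.stepFree {
            let candidate = (dist[node] ?? .infinity) + edge.meters
            if candidate < (dist[edge.to] ?? .infinity) {
                dist[edge.to] = candidate
                unvisited.insert(edge.to)
            }
        }
    }
    return nil
}

// Example: the stairs are shorter, but only the elevator route is step-free.
let graph: [String: [Edge]] = [
    "entrance": [Edge(to: "stairs", meters: 20, stepFree: false),
                 Edge(to: "elevator", meters: 35, stepFree: true)],
    "stairs":   [Edge(to: "gate", meters: 10, stepFree: false)],
    "elevator": [Edge(to: "gate", meters: 15, stepFree: true)]
]
if let meters = routeLength(from: "entrance", to: "gate", graph: graph, stepFreeOnly: true) {
    print("step-free route: \(meters) m")   // 50.0 m via the elevator
}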

The NavCog project is open source, which means that other people and organizations can contribute to and enhance it. Anyone who wants to add support for a new building has to follow four simple steps: set up the Bluetooth beacons in the building, sample their signals, generate the topology map, and add accessibility information.
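The output of the sampling and mapping steps can be imagined as simple records like the ones below: each survey point stores the signal strengths of nearby beacons, and each map node carries accessibility attributes. These Swift structures are purely hypothetical placeholders for illustration; NavCog's real data formats may look quite different.

import Foundation

// Hypothetical data structures, not NavCog's actual formats.
struct BeaconReading: Codable {
    let major: Int
    let minor: Int
    let rssi: Int          // received signal strength in dBm
}

struct FingerprintSample: Codable {
    let x: Double          // surveyed position on the floor plan, in meters
    let y: Double
    let floor: Int
    let readings: [BeaconReading]
}

struct MapNode: Codable {
    let id: String
    let x: Double
    let y: Double
    let floor: Int
    let wheelchairAccessible: Bool   // accessibility info attached to the topology map
}

// One sample taken while surveying a first-floor corridor.
let sample = FingerprintSample(
    x: 12.5, y: 4.0, floor: 1,
    readings: [BeaconReading(major: 1, minor: 12, rssi: -68),
               BeaconReading(major: 1, minor: 13, rssi: -74)]
)
let json = try! JSONEncoder().encode(sample)
print(String(data: json, encoding: .utf8)!)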
Takeaway
There are already five more pilot studies in Pittsburgh and Japan. This mobile navigation app could make a huge difference for people with visual impairments. It covers only a few locations now, but, with enough contributions, it could be deployed in many busy places such as airports, hospitals, hotels, and malls.
While the NavCog app is primarily intended to help visually impaired individuals, it could also assist anyone who simply needs directions inside a building. It still has a long way to go, since it doesn't yet support many devices or operating systems, but the app is worth exploring and expanding.
YouTube: Field Experiment in Nihonbashi, Tokyo (Feb 2017)
Photo credit: The images shown are owned by IBM Research.