NavCog – Empowering the differently abled

Chieko Asakawa using NavCog to navigate the campus. An IBM Fellow and visiting faculty member at Carnegie Mellon University, she is a lead researcher on NavCog and is visually impaired herself.

IBM has been much appreciated for its Watson project, which brought new technologies to enhance the cognitive functioning of individuals. IBM is now working with scientists from Carnegie Mellon University on a similar project to bring enhanced cognitive assistance to the visually impaired. This project, NavCog, will help such people navigate their surroundings freely and without human assistance.

The app is in its pilot stage, with only a basic version released for Apple and Android smartphones. At present, it offers voice-based or vibration-based navigation instructions by analyzing signals from Bluetooth beacons installed along walkways. Future plans include making the app rely more on camera-based information than on Bluetooth signals, enabling a better and more detailed assessment of the environment and widening its applicability to varied locations.
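The article does not describe NavCog's actual positioning algorithm, but the general idea behind beacon-based navigation can be sketched. The snippet below is an illustrative example only: it converts Bluetooth beacon signal strength (RSSI) to an approximate distance with the common log-distance path-loss model, then estimates the user's position as a weighted centroid of nearby beacons. All function names and parameter values here are assumptions for illustration, not NavCog's implementation.

```python
import math

# Illustrative sketch only: NavCog's real localization method is not
# described in the article. This shows one common approach used by
# indoor-navigation apps that rely on Bluetooth beacons.

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate distance (metres) to a beacon from its RSSI using the
    log-distance path-loss model. tx_power_dbm is the calibrated RSSI
    measured at 1 m (a typical value, assumed here)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def estimate_position(beacons):
    """Weighted-centroid position estimate.

    beacons: list of (x, y, rssi_dbm) tuples for beacons at known
    positions. Closer beacons (stronger signal) receive more weight.
    """
    weights = [1.0 / rssi_to_distance(rssi) for _, _, rssi in beacons]
    total = sum(weights)
    x = sum(w * bx for w, (bx, _, _) in zip(weights, beacons)) / total
    y = sum(w * by for w, (_, by, _) in zip(weights, beacons)) / total
    return x, y

# Example: three beacons along a corridor; the strongest signal
# (-55 dBm) pulls the estimate toward the beacon at (10, 0).
print(estimate_position([(0, 0, -75), (10, 0, -55), (20, 0, -80)]))
```

A real app would smooth these estimates over time (e.g. with a Kalman filter) and map the position onto a route graph before issuing voice or vibration instructions.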

Enabling analysis of camera-based data will also make it possible to incorporate additional features such as face recognition and recognition of facial expressions.


NavCog is an open source platform and developers can access it through IBM’s BlueMix cloud.
