Assistive Navigation

Project Lifetime: Sep 2013

ISANA: Intelligent Situation Awareness and Navigation Aid for Visually Impaired Persons
Description:
This project focuses on exploring and developing situation awareness and assistive navigation technologies that provide blind or visually impaired persons with obstacle avoidance and intelligent wayfinding capabilities using wearable sensors (e.g., cameras, RGB-D cameras, 3D orientation sensors, pedometers). We will first focus on algorithm development and evaluation in indoor environments. The technology will then be further improved and the research extended to outdoor pedestrian environments to provide blind users with waypoint navigation, path planning, and advance warning of events through interaction with GPS, Geographic Information System (GIS), and Intelligent Transportation Systems (ITS) infrastructure. Our research focuses on four key issues, parallel to those comprising the Event Horizon concept (i.e., sensing, human interface, and algorithms):
1) Situation awareness analysis and understanding, including recognition and detection of stationary objects (e.g., doors, elevators, stairs, crosswalks, traffic lights), reading and recognition of important text and signage in response to the user's query, and detection, tracking, and representation of moving objects and dynamic changes (e.g., people, shopping carts, opening doors, buses, cars).
2) Assistive navigation algorithms based on the simultaneous localization and mapping (SLAM) principle, including generation and updating of a navigation map, registration of landmarks in the map, and generation of verbal descriptions or tactile displays that let blind users obtain a global perception of the environment and plan a path in response to advance warnings from ITS (see the sketch after this list).
3) Vision-free user interface and usability studies, including auditory guidance and spatial updating of object location, orientation, and distance.
4) Prototyping and testing, including development of a prototype human-centric assistive navigation system that uses wearable sensors and an audio-based assistive-technology display unit to test and verify the proposed methods and software.
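As an illustrative example of the waypoint navigation and path planning mentioned in item 2, the sketch below runs A* search on a small 2D occupancy grid. This is not the project's actual software: the grid map, the start and goal coordinates, and the astar function are hypothetical and only indicate how a navigation-map module might compute a route to a waypoint.

```python
# Illustrative sketch only: grid-based path planning of the kind an assistive
# navigation map module might use. The map and coordinates are hypothetical.
import heapq


def astar(grid, start, goal):
    """A* search on a 2D occupancy grid (0 = free cell, 1 = obstacle).

    Returns a list of (row, col) cells from start to goal, or None if no
    path exists. Uses 4-connected moves and a Manhattan-distance heuristic.
    """
    rows, cols = len(grid), len(grid[0])

    def heuristic(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    open_heap = [(heuristic(start, goal), 0, start)]  # (f, g, cell)
    came_from = {}
    g_score = {start: 0}

    while open_heap:
        _, g, current = heapq.heappop(open_heap)
        if current == goal:
            # Reconstruct the path by walking back through came_from.
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = current[0] + dr, current[1] + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                tentative = g + 1
                if tentative < g_score.get((nr, nc), float("inf")):
                    g_score[(nr, nc)] = tentative
                    came_from[(nr, nc)] = current
                    heapq.heappush(
                        open_heap,
                        (tentative + heuristic((nr, nc), goal), tentative, (nr, nc)),
                    )
    return None


if __name__ == "__main__":
    # Hypothetical 5x5 indoor floor map: 1s mark walls with gaps at the ends.
    floor_map = [
        [0, 0, 0, 0, 0],
        [1, 1, 1, 1, 0],
        [0, 0, 0, 0, 0],
        [0, 1, 1, 1, 1],
        [0, 0, 0, 0, 0],
    ]
    print(astar(floor_map, (0, 0), (4, 4)))
```

In a deployed system the planned cell sequence would be translated into the verbal or tactile guidance described in item 3, but that interface layer is outside the scope of this sketch.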
Sponsor: This project is sponsored by the Federal Highway Administration's Exploratory Advanced Research Program under Cooperative Agreement FHWA DTFH61-12-H-00002.
PIs: Prof. Yingli Tian and Prof. Jizhong Xiao of CCNY, and Aries Arditi of Visibility Metrics LLC