Research Project:
I-Corps: Hand-held Assistive Mobility Device for the Visually Impaired Using Sensors to Feel Obstacles From a Distance and Track Their Movements

Date

Authors

Principal Investigators

Department

Journal Title

Journal ISSN

Volume Title

Publisher

Abstract or Project Summary

The broader impact/commercial potential of this I-Corps project is the development of a hand-held assistive mobility device for the visually impaired. The Centers for Disease Control and Prevention (CDC) estimates that approximately 12 million people over the age of 40 in the United States have a visual impairment. Vision impairment and loss of sight are among the top 10 disabilities for individuals over the age of 18 and can take a substantial social and economic toll, including, but not limited to, significant loss of productivity and diminished quality of life. Further, the annual economic impact of major vision problems in those 40 and older is estimated to be $145 billion. The proposed device holds the potential to significantly improve mobility, accessibility, and quality of life for the millions of people with no or low vision who currently use an assistive mobility device such as the widely used "white cane." The technology also offers potential benefits to people working in low-visibility or disaster conditions, such as emergency first responders or military personnel. This I-Corps project is based on the development of a hand-held device that uses remote sensing technology to control a dynamic tactile display on the hand, allowing the visually impaired to detect and classify obstacles in their environment sooner and more broadly than is possible with current assistive technology. Unlike existing technologies that involve touching or poking items with a white cane, or that rely on the individual to follow auditory cues, this device may enable individuals to feel the presence of obstacles and targets from a distance, to sense the relative locations of multiple obstacles, and to track the relative movements of nearby obstacles and targets. The proposed technology is an extension of research into optimized graphical displays of information derived from Sound Navigation and Ranging (SONAR) and Light Detection and Ranging (LIDAR)-based sensor systems.
People in low-vision situations may be better equipped to rapidly process and exploit multi-channel spatial information when it is delivered via the hand instead of the ear. The proposed technology exploits the natural active-sensing behaviors used by humans when they reach out to explore their environment. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
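The core idea described above, using ranging sensors to drive a tactile display so that nearer obstacles are felt more strongly, can be illustrated with a minimal sketch. This is purely a hypothetical example, not the project's actual design: the sensing range, number of tactile actuators, and the linear distance-to-intensity mapping are all assumptions made for illustration.

```python
# Hypothetical sketch: map per-direction distance readings from a
# ranging sensor (e.g., SONAR or LIDAR) to vibration intensities on
# a tactile array worn on the hand. Nearer obstacles produce stronger
# vibration; readings beyond range (or missing) produce none.
# MAX_RANGE_M and NUM_TACTORS are illustrative assumptions.

MAX_RANGE_M = 4.0   # assumed maximum useful sensing distance (meters)
NUM_TACTORS = 5     # assumed number of tactile actuators across the hand

def distances_to_intensities(distances_m):
    """Convert distance readings (meters, one per direction/tactor)
    into tactile intensities in [0.0, 1.0]."""
    intensities = []
    for d in distances_m:
        if d is None or d >= MAX_RANGE_M:
            intensities.append(0.0)                      # nothing in range
        else:
            intensities.append(1.0 - max(d, 0.0) / MAX_RANGE_M)
    return intensities
```

For example, `distances_to_intensities([0.5, 2.0, None, 4.5, 1.0])` yields `[0.875, 0.5, 0.0, 0.0, 0.75]`: the closest obstacle (0.5 m, leftmost channel) produces the strongest vibration, while the out-of-range and missing readings produce none.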

Description

Grant

Keywords

Citation