TY - GEN
T1 - NavCog3
T2 - 19th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS 2017
AU - Sato, Daisuke
AU - Oh, Uran
AU - Naito, Kakuya
AU - Takagi, Hironobu
AU - Kitani, Kris
AU - Asakawa, Chieko
PY - 2017/10/19
Y1 - 2017/10/19
N2 - Navigating unfamiliar environments is challenging for most people, and especially so for individuals with visual impairments. While many personal navigation tools have been proposed to enable independent indoor navigation, they suffer from insufficient accuracy (e.g., 5-10 m), do not provide semantic information about the surroundings (e.g., doorways, shops), or require specialized devices to function. Moreover, many systems are evaluated only in constrained scenarios, which may not reflect their performance in the real world. We therefore designed and implemented NavCog3, a smartphone-based indoor navigation assistant, and evaluated it in a 21,000 m2 shopping mall. In addition to turn-by-turn instructions, it provides information on nearby landmarks (e.g., tactile paving) and points of interest. We first conducted a controlled study with 10 visually impaired users to assess localization accuracy and the perceived usefulness of semantic features. To understand the usability of the app in a real-world setting, we then conducted another study with 43 participants with visual impairments, who could freely navigate the shopping mall using NavCog3. Our findings suggest that NavCog3 opens a new opportunity for users with visual impairments to independently find and visit large, complex places with confidence.
KW - Indoor navigation
KW - Points of interest
KW - User evaluation
KW - Visual impairments
KW - Voice-based interaction
UR - http://www.scopus.com/inward/record.url?scp=85041411787&partnerID=8YFLogxK
U2 - 10.1145/3132525.3132535
DO - 10.1145/3132525.3132535
M3 - Conference contribution
AN - SCOPUS:85041411787
T3 - ASSETS 2017 - Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility
SP - 270
EP - 279
BT - ASSETS 2017 - Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility
PB - Association for Computing Machinery, Inc
Y2 - 29 October 2017 through 1 November 2017
ER -