NavCog3: An evaluation of a smartphone-based blind indoor navigation assistant with semantic features in a large-scale environment

Daisuke Sato, Uran Oh, Kakuya Naito, Hironobu Takagi, Kris Kitani, Chieko Asakawa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

80 Scopus citations

Abstract

Navigating unfamiliar environments is challenging for most people, and especially so for individuals with visual impairments. While many personal navigation tools have been proposed to enable independent indoor navigation, they have insufficient accuracy (e.g., 5-10 m), do not provide semantic features about the surroundings (e.g., doorways, shops, etc.), and may require specialized devices to function. Moreover, many systems are evaluated only in constrained scenarios, which may not accurately reflect their performance in the real world. Therefore, we have designed and implemented NavCog3, a smartphone-based indoor navigation assistant that has been evaluated in a 21,000 m² shopping mall. In addition to turn-by-turn instructions, it provides information on landmarks (e.g., tactile paving) and nearby points of interest. We first conducted a controlled study with 10 visually impaired users to assess localization accuracy and the perceived usefulness of semantic features. To understand the usability of the app in a real-world setting, we then conducted another study with 43 participants with visual impairments in which they could freely navigate the shopping mall using NavCog3. Our findings suggest that NavCog3 opens a new opportunity for users with visual impairments to independently find and visit large, complex places with confidence.

Original language: English
Title of host publication: ASSETS 2017 - Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility
Publisher: Association for Computing Machinery, Inc
Pages: 270-279
Number of pages: 10
ISBN (Electronic): 9781450349260
DOIs
State: Published - 19 Oct 2017
Event: 19th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS 2017 - Baltimore, United States
Duration: 29 Oct 2017 - 1 Nov 2017

Publication series

Name: ASSETS 2017 - Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility

Conference

Conference: 19th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS 2017
Country/Territory: United States
City: Baltimore
Period: 29/10/17 - 1/11/17

Keywords

  • Indoor navigation
  • Points of interest
  • User evaluation
  • Visual impairments
  • Voice-based interaction
