Abstract
Recent advances in computer vision and natural language processing using deep neural networks (DNNs) have enabled rich and intuitive multimodal interfaces. However, intelligent assistance systems for persons with visual impairment remain under-explored. In this work, we present an interactive object recognition and guidance interface for blind and partially sighted people, based on multimodal interaction and running on an embedded mobile device. We demonstrate that the proposed DNN-based solution can effectively assist visually impaired people. We believe that this work will provide new and helpful insights for designing intelligent assistance systems in the future.
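The abstract does not describe an implementation, but as a rough illustration of the kind of on-device DNN object recognition it refers to, the sketch below runs a quantized image classifier with TensorFlow Lite on a single camera frame and returns the top label. The framework choice and the file names `detector.tflite` and `labels.txt` are assumptions made here for illustration; they are not details from the paper.

```python
# Minimal sketch (not the authors' code): on-device object recognition with a
# DNN using TensorFlow Lite. Model and label files are hypothetical placeholders.
import numpy as np
import tensorflow as tf  # tf.lite ships with the standard TensorFlow package
from PIL import Image

interpreter = tf.lite.Interpreter(model_path="detector.tflite")  # hypothetical model
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def recognize(image_path: str) -> str:
    """Return the label of the highest-scoring class for one image."""
    h, w = input_details[0]["shape"][1:3]          # model's expected input size
    img = Image.open(image_path).convert("RGB").resize((w, h))
    x = np.expand_dims(np.asarray(img).astype(input_details[0]["dtype"]), axis=0)
    interpreter.set_tensor(input_details[0]["index"], x)
    interpreter.invoke()
    scores = interpreter.get_tensor(output_details[0]["index"])[0]
    labels = open("labels.txt").read().splitlines()  # hypothetical label file
    return labels[int(np.argmax(scores))]

print(recognize("frame.jpg"))  # the label could then be spoken via text-to-speech
```

In an interactive assistive setting, the recognized label would typically be fed to an audio or haptic feedback channel rather than printed; the print call here only stands in for that output step.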
| Original language | English |
|---|---|
| Title of host publication | UIST 2019 Adjunct - Adjunct Publication of the 32nd Annual ACM Symposium on User Interface Software and Technology |
| Publisher | Association for Computing Machinery, Inc |
| Pages | 27-29 |
| Number of pages | 3 |
| ISBN (Electronic) | 9781450368179 |
| DOIs | |
| State | Published - 14 Oct 2019 |
| Event | 32nd Annual ACM Symposium on User Interface Software and Technology, UIST 2019 - New Orleans, United States. Duration: 20 Oct 2019 → 23 Oct 2019 |
Publication series
| Name | UIST 2019 Adjunct - Adjunct Publication of the 32nd Annual ACM Symposium on User Interface Software and Technology |
|---|---|
Conference
| Conference | 32nd Annual ACM Symposium on User Interface Software and Technology, UIST 2019 |
|---|---|
| Country/Territory | United States |
| City | New Orleans |
| Period | 20/10/19 → 23/10/19 |
Bibliographical note
Publisher Copyright: © 2019 Copyright is held by the owner/author(s).
Keywords
- Assistive system
- Mobile interface
- Multimodal wearable interface
- Visual impairment