Abstract
The HandSight project investigates how wearable micro-cameras can be used to augment a blind or visually impaired user's sense of touch with computer vision. Our goal is to support an array of activities of daily living by sensing and feeding back non-tactile information (e.g., color, printed text, patterns) about an object as it is touched. In this poster paper, we provide an overview of the project, our current proof-of-concept prototype, and a summary of findings from finger-based text reading studies. As this is an early-stage project, we also enumerate current open questions.
| Original language | English |
| --- | --- |
| Title of host publication | ASSETS 2015 - Proceedings of the 17th International ACM SIGACCESS Conference on Computers and Accessibility |
| Publisher | Association for Computing Machinery, Inc |
| Pages | 383-384 |
| Number of pages | 2 |
| ISBN (Electronic) | 9781450334006 |
| DOIs | |
| State | Published - 26 Oct 2015 |
| Event | 17th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS 2015 - Lisbon, Portugal; Duration: 26 Oct 2015 → 28 Oct 2015 |
Publication series
| Name | ASSETS 2015 - Proceedings of the 17th International ACM SIGACCESS Conference on Computers and Accessibility |
| --- | --- |
Conference
| Conference | 17th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS 2015 |
| --- | --- |
| Country/Territory | Portugal |
| City | Lisbon |
| Period | 26/10/15 → 28/10/15 |
Bibliographical note
Publisher Copyright: © 2015 ACM.
Keywords
- Blind
- Computer vision
- Vision-augmented touch
- Visually impaired
- Wearable computing