A performance comparison of on-hand versus on-phone nonvisual input by blind and sighted users

Uran Oh, Leah Findlater

Research output: Contribution to journal › Article › peer-review

16 Scopus citations


On-body interaction, in which users employ their own bodies as an input surface, has the potential to provide efficient mobile computing access for blind users. It offers increased tactile and proprioceptive feedback compared to a phone and, because it is always available, it should allow for quick audio output control without the need to retrieve the phone from a pocket or bag. Despite this potential, there has been little investigation of on-body input for users with visual impairments. To assess blind users' performance with on-body input versus touchscreen input, we conducted a controlled lab study with 12 sighted and 11 blind participants. Study tasks included basic pointing and the drawing of more complex shape gestures. Our findings confirm past work with sighted users showing that the hand results in faster pointing than the phone. Most importantly, we also show that: (1) the performance gain of the hand applies to blind users as well, (2) the accuracy of where the pointing finger first lands is higher on the hand than on the phone, (3) on-hand pointing performance is affected by the location of targets, and (4) shape gestures drawn on the hand result in higher gesture recognition rates than those drawn on the phone. Our findings highlight the potential of on-body input to support accessible nonvisual mobile computing.

Original language: English
Article number: a14
Journal: ACM Transactions on Accessible Computing
Issue number: 4
State: Published - Nov 2015

Bibliographical note

Publisher Copyright:
© 2015 ACM.


Keywords

  • Blindness
  • Mobile computing
  • Nonvisual interaction
  • On-body interaction
  • User study


