Synthesizing the Roughness of Textured Surfaces for an Encountered-Type Haptic Display Using Spatiotemporal Encoding

Yaesol Kim, Siyeon Kim, Uran Oh, Young J. Kim

Research output: Contribution to journal › Article › peer-review

5 Scopus citations

Abstract

Encountered-type haptic rendering provides realistic, free-to-touch, and move-and-collide haptic sensation to a user. However, inducing haptic-texture sensation without complicated tactile actuators is challenging for encountered-type haptic rendering. In this article, we propose a novel texture-synthesizing method for an encountered-type haptic display using spatial and temporal encoding of roughness, which provides both active and passive touch sensation without requiring complicated tactile actuation. Focusing on macro-scale roughness perception, we geometrically model the textured surface with a grid of hemiellipsoidal bumps, which can produce a variety of perceived roughness as the user explores the surface with a bare hand. Our texture synthesis method is based on two important hypotheses. First, we assume that perceptual roughness can be spatially encoded along the radial direction of a textured surface with hemiellipsoidal bumps. Second, perceptual roughness varies temporally with the relative velocity of a scanning human hand with respect to the surface. To validate these hypotheses on our spatiotemporal encoding method, we implemented an encountered-type haptic texture rendering system using an off-the-shelf collaborative robot that can also track the user's hand using IR sensors. We performed psychophysical user tests with 25 participants and verified the main effects of spatiotemporal encoding of a textured model on the user's roughness perception. Our empirical experiments imply that users perceive a rougher texture as the surface orientation or the relative hand velocity increases. Based on these findings, we show that our visuo-haptic system can synthesize an appropriate level of roughness corresponding to diverse visual textures by suitably choosing encoding values.
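The geometric model described in the abstract, a grid of hemiellipsoidal bumps whose shape parameters modulate perceived roughness, can be sketched as a simple height field. The function and all parameter values below are illustrative assumptions for exposition, not the authors' implementation:

```python
import math

def bump_height(x, y, spacing=10.0, a=3.0, b=3.0, c=1.5):
    """Height of a textured surface modeled as a square grid of
    hemiellipsoidal bumps, sampled at point (x, y).

    spacing : distance between bump centers (illustrative value)
    a, b, c : semi-axes of each hemiellipsoid (illustrative values)
    """
    # Offset from the nearest bump center on the grid.
    dx = x - round(x / spacing) * spacing
    dy = y - round(y / spacing) * spacing
    # Inside the elliptical footprint, the height follows the upper
    # half of the ellipsoid (x/a)^2 + (y/b)^2 + (z/c)^2 = 1.
    r2 = (dx / a) ** 2 + (dy / b) ** 2
    if r2 >= 1.0:
        return 0.0  # flat substrate between bumps
    return c * math.sqrt(1.0 - r2)
```

In this sketch, denser or taller bumps (smaller `spacing`, larger `c`) would correspond to a coarser perceived texture; the paper's spatial encoding additionally varies the bump geometry along the radial direction of the surface.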

Original language: English
Article number: 9124684
Pages (from-to): 32-43
Number of pages: 12
Journal: IEEE Transactions on Haptics
Volume: 14
Issue number: 1
DOIs
State: Published - 1 Jan 2021

Bibliographical note

Funding Information:
Manuscript received December 15, 2019; revised May 15, 2020 and June 19, 2020; accepted June 19, 2020. Date of publication June 24, 2020; date of current version March 19, 2021. This work was supported in part by the ITRC program supervised by IITP (IITP-2020-0-01460) and the NRF (2017R1A2B3012701) in South Korea. The work of Uran Oh was supported by the Ewha Womans University Research Grant of 2018. This article was recommended for publication by Associate Editor J. Park and Editor-in-Chief L. Jones upon evaluation of the reviewers' comments. (Corresponding author: Young J. Kim.) The authors are with the Department of Computer Science and Engineering, Ewha Womans University, Seoul 03760, South Korea (e-mail: yaesol91@ewhain.net; syunni33@ewhain.net; uran.oh@ewha.ac.kr; kimy@ewha.ac.kr).

Publisher Copyright:
© 2008-2011 IEEE.

Keywords

  • Encountered-type haptic
  • haptic texture
  • human-robot interaction
  • texture roughness
