Sonicstrument: A musical interface with stereotypical acoustic transducers

Jeong Seob Lee, Woon Seung Yeo

Research output: Contribution to journal › Conference article › peer-review

8 Scopus citations

Abstract

This paper introduces Sonicstrument, a sound-based interface that traces the user's hand motions. Sonicstrument uses stereotypical acoustic transducers (i.e., a pair of earphones and a microphone) to transmit and receive acoustic signals whose frequencies lie near the upper limit of the human hearing range and are rarely perceived by most people. Simpler in structure and easier to implement than typical ultrasonic motion detectors built around special transducers, the system is robust and delivers precise results without introducing any unwanted sonic disturbance for users. We describe the design and implementation of Sonicstrument, evaluate its performance, and present two practical applications of the system in music and interactive performance.
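The record does not include code, but the abstract and the Doppler-effect keyword suggest the general approach: a near-inaudible tone is played through the earphones, and hand motion near the microphone shifts the received frequency. The sketch below (NumPy) illustrates that idea only; the carrier frequency, sample rate, block size, and search band are assumptions for illustration, not the authors' implementation.

    import numpy as np

    SAMPLE_RATE = 48_000     # assumed audio sample rate (Hz)
    CARRIER_HZ = 20_000      # assumed near-inaudible carrier frequency (Hz)
    SPEED_OF_SOUND = 343.0   # m/s at room temperature

    def carrier_tone(duration_s):
        """Generate the high-frequency sine tone played through the earphones."""
        t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
        return np.sin(2 * np.pi * CARRIER_HZ * t)

    def estimate_velocity(mic_block):
        """Estimate hand velocity from the Doppler-shifted peak in one mic block."""
        windowed = mic_block * np.hanning(len(mic_block))
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(mic_block), d=1.0 / SAMPLE_RATE)
        # Look for the strongest component in a narrow band around the carrier.
        band = (freqs > CARRIER_HZ - 500) & (freqs < CARRIER_HZ + 500)
        peak_hz = freqs[band][np.argmax(spectrum[band])]
        shift_hz = peak_hz - CARRIER_HZ
        # For sound reflected off a moving hand, shift ≈ 2 * v * f0 / c.
        return shift_hz * SPEED_OF_SOUND / (2 * CARRIER_HZ)

With a 2048-sample block at 48 kHz, the FFT bin spacing is about 23 Hz, which corresponds to a velocity resolution of roughly 0.2 m/s for a 20 kHz carrier; finer resolution would require longer blocks or spectral interpolation.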

Original language: English
Pages (from-to): 24-27
Number of pages: 4
Journal: Proceedings of the International Conference on New Interfaces for Musical Expression
State: Published - 2011
Event: International Conference on New Interfaces for Musical Expression, NIME 2011 - Oslo, Norway
Duration: 30 May 2011 - 1 Jun 2011

Bibliographical note

Publisher Copyright:
© 2020, Steering Committee of the International Conference on New Interfaces for Musical Expression.

Keywords

  • Audible sound
  • Doppler effect
  • Hands-free interface
  • Interactive performance
  • Musical instrument
  • Stereotypical transducers
