High-speed visual target identification for low-cost wearable brain-computer interfaces

Dokyun Kim, Wooseok Byun, Yunseo Ku, Ji Hoon Kim

Research output: Contribution to journal › Article › peer-review

11 Scopus citations


Non-invasive brain-computer interfaces (BCIs) have received a great deal of attention owing to recent advances in signal processing. Two types of electroencephalogram (EEG) responses, the P300 and the steady-state visual evoked potential (SSVEP), have been widely used to enable paralyzed patients to communicate with others. Although many signal processing algorithms, such as power spectral density analysis (PSDA) and canonical correlation analysis (CCA), focus on target identification accuracy, their high computational complexity drives up the cost of such systems. The proposed lightweight target identification algorithm focuses on improving the information transfer rate (ITR) for high-quality communication while reducing overall implementation cost. The proposed algorithm, CCA-Lite, includes two algorithmic optimizations, signal binarization and on-the-fly covariance matrix calculation, which enable a low-cost, single-channel, wearable BCI system using SSVEP. The prototype BCI system uses an ARM Cortex-M3-based low-cost microcontroller unit (MCU) and operates on 1.5-second SSVEP recordings. Compared with the state-of-the-art CCA-based target identification algorithm, CCA-Lite achieves a 25% higher ITR while reducing memory requirements by 92% and single-target identification cycle time by 26%.
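To make the baseline concrete, the following is a minimal sketch of standard CCA-based SSVEP target identification: the stimulus frequency whose sine/cosine reference templates yield the highest canonical correlation with the recorded EEG is selected as the target. The function names, the `binarize` flag (a `np.sign` step illustrating the spirit of CCA-Lite's signal binarization, not the paper's exact method), and all parameter values are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def max_canonical_corr(X, Y):
    """Largest canonical correlation between two views X (n x dx) and Y (n x dy)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    eps = 1e-9  # small ridge so Cholesky succeeds on near-singular covariances
    Wx = np.linalg.inv(np.linalg.cholesky(X.T @ X + eps * np.eye(X.shape[1])))
    Wy = np.linalg.inv(np.linalg.cholesky(Y.T @ Y + eps * np.eye(Y.shape[1])))
    # Largest singular value of the whitened cross-covariance = top canonical corr.
    return np.linalg.svd(Wx @ (X.T @ Y) @ Wy.T, compute_uv=False)[0]

def references(freq, fs, n_samples, n_harmonics=2):
    """Sine/cosine reference templates at freq and its harmonics."""
    t = np.arange(n_samples) / fs
    cols = []
    for h in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * h * freq * t))
        cols.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(cols)

def identify_target(eeg, fs, candidate_freqs, binarize=False):
    """Pick the stimulus frequency whose templates best correlate with the EEG."""
    x = np.sign(eeg) if binarize else eeg  # hypothetical binarization step
    X = x.reshape(-1, 1)                   # single channel -> (n, 1)
    scores = [max_canonical_corr(X, references(f, fs, len(x)))
              for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(scores))]
```

On a synthetic 1.5-second single-channel recording, e.g. `identify_target(eeg, 256, [8.57, 10.0, 12.0, 15.0])` with a noisy 10 Hz component, the 10 Hz candidate wins. The covariance blocks `X.T @ X` and `X.T @ Y` are the quantities that CCA-Lite accumulates on the fly, which is what removes the need to buffer the full recording in memory.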

Original language: English
Article number: 8698249
Pages (from-to): 55169-55179
Number of pages: 11
Journal: IEEE Access
State: Published - 2019


  • Brain-computer interface (BCI)
  • canonical correlation analysis (CCA)
  • electroencephalogram (EEG)
  • steady-state visual evoked potential (SSVEP)
  • target identification


