A Real-Time Depth of Anesthesia Monitoring System Based on Deep Neural Network with Large EDO Tolerant EEG Analog Front-End

Yongjae Park, Su Hyun Han, Wooseok Byun, Ji Hoon Kim, Hyung Chul Lee, Seong Jin Kim

Research output: Contribution to journal › Article › peer-review

28 Scopus citations

Abstract

In this article, we present a real-time electroencephalogram (EEG)-based depth of anesthesia (DoA) monitoring system in conjunction with a deep learning framework, AnesNET. An EEG analog front-end (AFE) that can compensate a ±380-mV electrode DC offset (EDO) using a coarse digital DC servo loop is implemented in the proposed system. The EEG-based minimum alveolar concentration (MAC), EEGMAC, is introduced as a novel index to accurately predict the DoA and is designed to apply to patients anesthetized with both volatile and intravenous agents. The proposed deep learning protocol consists of four convolutional neural network layers and two dense layers. In addition, we optimize the complexity of the deep neural network (DNN) to run on a microcomputer such as the Raspberry Pi 3, realizing a cost-effective, small-size DoA monitoring system. Fabricated in 110-nm CMOS, the prototype AFE consumes 4.33 μW per channel and exhibits an input-referred noise of 0.29 μVrms from 0.5 to 100 Hz with a noise efficiency factor of 2.2. The proposed DNN was evaluated with pre-recorded EEG data from 374 subjects administered inhalational anesthetics during surgery, achieving average squared and absolute errors of 0.048 and 0.05, respectively. The EEGMAC for subjects anesthetized with an intravenous agent also showed good agreement with the bispectral index (BIS) value, confirming that the proposed DoA index is applicable to both types of anesthetic. The implemented monitoring system with the Raspberry Pi 3 estimates the EEGMAC within 20 ms, about a thousand-fold faster than BIS estimation reported in the literature.
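The abstract only states the network topology (four convolutional layers followed by two dense layers regressing the EEGMAC index), so the following PyTorch snippet is a minimal illustrative sketch of such a model, not the authors' implementation. Filter counts, kernel sizes, pooling, and the 500-sample single-channel input window are assumptions chosen for the example.

```python
# Hypothetical sketch of a 4-conv + 2-dense regressor in the spirit of AnesNET.
# All layer sizes and the input window length are illustrative assumptions.
import torch
import torch.nn as nn


class AnesNETSketch(nn.Module):
    def __init__(self, input_samples: int = 500):
        super().__init__()
        # Four 1-D convolutional layers, each followed by ReLU and 2x pooling.
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
        )
        flat = 64 * (input_samples // 16)  # feature length after four 2x poolings
        # Two dense layers producing a single scalar DoA estimate (EEGMAC).
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(flat, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, input_samples) raw EEG segment
        return self.regressor(self.features(x))


if __name__ == "__main__":
    model = AnesNETSketch()
    dummy = torch.randn(1, 1, 500)  # one single-channel EEG window
    print(model(dummy).shape)       # torch.Size([1, 1])
```

A network of roughly this size runs comfortably on a Raspberry Pi 3-class CPU, which is consistent with the sub-20-ms inference latency reported in the abstract.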

Original language: English
Article number: 9103093
Pages (from-to): 825-837
Number of pages: 13
Journal: IEEE Transactions on Biomedical Circuits and Systems
Volume: 14
Issue number: 4
DOIs
State: Published - Aug 2020

Bibliographical note

Funding Information:
Manuscript received April 3, 2020; revised May 23, 2020; accepted May 24, 2020. Date of publication May 28, 2020; date of current version August 17, 2020. This work was supported in part by the Brain Research Program under Grant 2017M3C7A102885921 through the National Research Foundation (NRF) of Korea funded by the Ministry of Science and ICT & Future Planning (MSIT), in part by Samsung Research Funding & Incubation Center of Samsung Electronics under Project No. SRFC-TA1703-07, and in part by the 2020 Research Fund under Grant 1.200033.01 of Ulsan National Institute of Science and Technology (UNIST). (Yongjae Park and Su-Hyun Han contributed equally to this work.) (Corresponding author: Seong-Jin Kim.) Yongjae Park, Su-Hyun Han, and Seong-Jin Kim are with the School of Electrical and Computer Engineering, Ulsan National Institute of Science and Technology, Ulsan 44919, South Korea (e-mail: [email protected]; [email protected]; [email protected]).

Publisher Copyright:
© 2007-2012 IEEE.

Keywords

  • Bispectral index
  • Raspberry Pi 3
  • convolutional neural network
  • depth of anesthesia monitoring
  • electrode DC offset
  • electroencephalogram
  • latency
  • minimum alveolar concentration
