Approximate training of one-class support vector machines using expected margin

Seokho Kang, Dongil Kim, Sungzoon Cho

Research output: Contribution to journal › Article › peer-review

5 Scopus citations


One-class support vector machine (OCSVM) has demonstrated superior performance in one-class classification problems. However, its training is impractical for large-scale datasets owing to its high computational complexity with respect to the number of training instances. In this study, we propose an approximate training method based on the concept of expected margin to obtain a model identical to that obtained by full training, with a reduced computational burden. The proposed method selects prospective support vectors using multiple OCSVM models trained on small bootstrap samples of the original dataset. The final OCSVM model is then trained using only the selected instances. The proposed method is not only simple and straightforward but also considerably effective in improving the training efficiency of OCSVM. Preliminary experiments are conducted on large-scale benchmark datasets to examine the effectiveness of the proposed method in terms of approximation performance and computational efficiency.
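The bootstrap-and-select procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes scikit-learn's `OneClassSVM`, the function name `approximate_ocsvm` is hypothetical, and the selection rule (keeping instances whose average decision value across bootstrap models is non-positive) is an illustrative proxy for the paper's expected-margin criterion.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def approximate_ocsvm(X, n_bootstrap=10, sample_frac=0.1, nu=0.1, seed=0):
    """Approximate OCSVM training via bootstrap-based instance selection.

    Assumption: an instance is treated as a prospective support vector
    if its average decision value over the bootstrap models is <= 0,
    i.e. it lies on or outside the boundary on average (a proxy for a
    small expected margin). The paper's exact criterion may differ.
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    m = max(1, int(sample_frac * n))

    # Estimate each instance's expected margin from models trained
    # on small bootstrap samples of the original dataset.
    margins = np.zeros(n)
    for _ in range(n_bootstrap):
        idx = rng.choice(n, size=m, replace=True)
        model = OneClassSVM(nu=nu, gamma="scale").fit(X[idx])
        margins += model.decision_function(X)
    margins /= n_bootstrap

    # Keep only prospective support vectors (low expected margin).
    selected = margins <= 0
    if not selected.any():
        selected = np.argsort(margins)[:m]  # fallback: smallest margins

    # Train the final OCSVM on the selected instances only.
    return OneClassSVM(nu=nu, gamma="scale").fit(X[selected])
```

The cost saving comes from the final `fit` seeing only the selected instances: OCSVM training scales superlinearly in the number of instances, so shrinking the training set to the likely support vectors dominates the (cheap) bootstrap passes.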

Original language: English
Pages (from-to): 772-778
Number of pages: 7
Journal: Computers and Industrial Engineering
State: Published - Apr 2019

Bibliographical note

Funding Information:
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT; Ministry of Science and ICT) (No. NRF-2017R1C1B5075685). This work was also supported by Chungnam National University (No. 2018-0611-01).

Publisher Copyright:
© 2019 Elsevier Ltd


Keywords

  • Approximate training
  • Data selection
  • Expected margin
  • One-class support vector machine
  • Support vector data description


