TY - JOUR
T1 - OBGAN
T2 - Minority oversampling near borderline with generative adversarial networks
AU - Jo, Wonkeun
AU - Kim, Dongil
N1 - Funding Information:
This work was partially supported by Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2020-0-01441, Artificial Intelligence Convergence Research Center (Chungnam National University), and No. 2019-0-01343, Training Key Talents in Industrial Convergence Security), and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2020R1F1A1075781).
Publisher Copyright:
© 2022 Elsevier Ltd
PY - 2022/7/1
Y1 - 2022/7/1
N2 - Class imbalance is a major issue that degrades the performance of machine learning classifiers in real-world problems. Oversampling methods have been widely used to overcome this issue by generating synthetic data from minority classes. However, conventional oversampling methods often focus only on the minority class and ignore its relationship with the majority class. In this study, we propose an oversampling method called minority oversampling near the borderline with a generative adversarial network (OBGAN). To consider both the minority and majority classes, OBGAN employs one independent discriminator for each class. The two discriminators compete in training the generator, driving it to capture the regions of both classes. Because the generator is more sensitive to the minority-class discriminator than to the majority-class one, it learns the minority class with a focus near the borderline. In addition, the architecture and loss function of OBGAN are designed to avoid the mode collapse problem, which commonly occurs in GANs trained on relatively small datasets. Experimental results on 21 datasets against 6 benchmark methods reveal that OBGAN exhibits excellent performance and stability.
AB - Class imbalance is a major issue that degrades the performance of machine learning classifiers in real-world problems. Oversampling methods have been widely used to overcome this issue by generating synthetic data from minority classes. However, conventional oversampling methods often focus only on the minority class and ignore its relationship with the majority class. In this study, we propose an oversampling method called minority oversampling near the borderline with a generative adversarial network (OBGAN). To consider both the minority and majority classes, OBGAN employs one independent discriminator for each class. The two discriminators compete in training the generator, driving it to capture the regions of both classes. Because the generator is more sensitive to the minority-class discriminator than to the majority-class one, it learns the minority class with a focus near the borderline. In addition, the architecture and loss function of OBGAN are designed to avoid the mode collapse problem, which commonly occurs in GANs trained on relatively small datasets. Experimental results on 21 datasets against 6 benchmark methods reveal that OBGAN exhibits excellent performance and stability.
KW - Class imbalance problem
KW - Deep learning
KW - Generative adversarial networks
KW - Generative learning
KW - Neural networks
KW - Oversampling
UR - http://www.scopus.com/inward/record.url?scp=85125636243&partnerID=8YFLogxK
U2 - 10.1016/j.eswa.2022.116694
DO - 10.1016/j.eswa.2022.116694
M3 - Article
AN - SCOPUS:85125636243
SN - 0957-4174
VL - 197
JO - Expert Systems with Applications
JF - Expert Systems with Applications
M1 - 116694
ER -