Class imbalance is a major issue that degrades the performance of machine-learning classifiers in real-world problems. Oversampling methods have been widely used to overcome this issue by generating synthetic data for minority classes. However, conventional oversampling methods often focus only on the minority class and ignore the relationship between the minority and majority classes. In this study, we propose an oversampling method called minority oversampling near the borderline with a generative adversarial network (OBGAN). To account for both the minority and majority classes, OBGAN employs one independent discriminator for each class. The two discriminators competitively influence the generator, training it to capture the regions of both classes. However, the generator is made more sensitive to the minority-class discriminator than to the majority-class one; hence, it learns the minority class with a focus near the borderline. In addition, the architecture and loss function of OBGAN are designed to avoid mode collapse, which commonly occurs in GANs trained on relatively small datasets. Experimental results on 21 datasets against 6 benchmark methods show that OBGAN exhibits excellent performance and stability.
- Class imbalance problem
- Deep learning
- Generative adversarial networks
- Generative learning
- Neural networks
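To make the two-discriminator idea concrete, the asymmetric generator objective described above can be sketched as a weighted combination of two adversarial losses, one per discriminator. This is a minimal NumPy sketch, not the paper's exact formulation: the non-saturating log-loss form and the weight names `w_min` and `w_maj` are assumptions introduced for illustration.

```python
import numpy as np


def sigmoid(x):
    # Numerically plain logistic function for discriminator logits.
    return 1.0 / (1.0 + np.exp(-x))


def generator_loss(d_min_logits, d_maj_logits, w_min=1.0, w_maj=0.3):
    """Weighted non-saturating generator loss against two discriminators.

    The generator is pushed to be classified as real by the minority-class
    discriminator (weight w_min) while receiving a weaker signal from the
    majority-class discriminator (w_maj < w_min), so generated samples
    concentrate in the minority region near the class borderline.
    The weights and loss form are illustrative assumptions, not OBGAN's
    published loss.
    """
    loss_min = -np.mean(np.log(sigmoid(np.asarray(d_min_logits)) + 1e-12))
    loss_maj = -np.mean(np.log(sigmoid(np.asarray(d_maj_logits)) + 1e-12))
    return w_min * loss_min + w_maj * loss_maj
```

Because `w_min > w_maj`, improving the generator's standing with the minority-class discriminator lowers this loss more than an equal improvement against the majority-class one, which is one simple way to realize the sensitivity asymmetry the abstract describes.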