Simplifying Neural Network Training Under Class Imbalance

December 05, 2023 · Entered Twilight · 🏛 Neural Information Processing Systems

💤 TWILIGHT: Eternal Rest
Repo abandoned since publication

Repo contents: .idea, README.md, expriments, imbalanced, requirements.txt, self_supervised, setup.py

Authors: Ravid Shwartz-Ziv, Micah Goldblum, Yucen Lily Li, C. Bayan Bruss, Andrew Gordon Wilson
arXiv ID: 2312.02517
Category: cs.LG: Machine Learning
Cross-listed: cs.AI
Citations: 33
Venue: Neural Information Processing Systems
Repository: https://github.com/ravidziv/SimplifyingImbalancedTraining ⭐ 9
Last Checked: 2 months ago
Abstract
Real-world datasets are often highly class-imbalanced, which can adversely impact the performance of deep learning models. The majority of research on training neural networks under class imbalance has focused on specialized loss functions, sampling techniques, or two-stage training procedures. Notably, we demonstrate that simply tuning existing components of standard deep learning pipelines, such as the batch size, data augmentation, optimizer, and label smoothing, can achieve state-of-the-art performance without any such specialized class imbalance methods. We also provide key prescriptions and considerations for training under class imbalance, and an understanding of why imbalance methods succeed or fail.
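To make the abstract's prescription concrete, the sketch below trains a classifier on a synthetic long-tailed dataset using only standard pipeline components: a tuned batch size, data augmentation, an SGD optimizer with a cosine schedule, and label smoothing, with no imbalance-specific loss, sampler, or two-stage procedure. This is a minimal PyTorch sketch under assumed hyperparameters and an illustrative toy dataset, not the authors' released code.

```python
# Minimal sketch (not the authors' code): training under class imbalance
# using only ordinary, tuned pipeline components. Dataset, model, and
# hyperparameter values are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Synthetic long-tailed data: class k gets exponentially fewer samples.
num_classes, dim = 10, 32
counts = [int(1000 * (0.5 ** k)) for k in range(num_classes)]
xs = torch.cat([torch.randn(n, dim) + k for k, n in enumerate(counts)])
ys = torch.cat([torch.full((n,), k, dtype=torch.long) for k, n in enumerate(counts)])
train_set = TensorDataset(xs, ys)

# Tuned, but not imbalance-specific, components:
batch_size = 128                                    # batch size as a tunable knob
loader = DataLoader(train_set, batch_size=batch_size, shuffle=True)

model = nn.Sequential(nn.Linear(dim, 256), nn.ReLU(), nn.Linear(256, num_classes))
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)  # standard label smoothing
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=5e-4)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=20)

# Simple input-space augmentation stand-in (Gaussian jitter); for images one
# would instead tune standard torchvision transforms such as crops and flips.
def augment(x):
    return x + 0.05 * torch.randn_like(x)

for epoch in range(20):
    for x, y in loader:
        optimizer.zero_grad()
        loss = criterion(model(augment(x)), y)
        loss.backward()
        optimizer.step()
    scheduler.step()
```

Note that nothing above re-weights or re-samples by class frequency; the only knobs touched are ones already present in a vanilla training pipeline.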
Community shame:
Not yet rated

📜 Similar Papers

In the same crypt — Machine Learning