R.I.P.
👻
Ghosted
Simplifying Neural Network Training Under Class Imbalance
December 05, 2023 · Entered Twilight · Neural Information Processing Systems
Repo contents: .idea, README.md, expriments, imbalanced, requirements.txt, self_supervised, setup.py
Authors
Ravid Shwartz-Ziv, Micah Goldblum, Yucen Lily Li, C. Bayan Bruss, Andrew Gordon Wilson
arXiv ID
2312.02517
Category
cs.LG: Machine Learning
Cross-listed
cs.AI
Citations
33
Venue
Neural Information Processing Systems
Repository
https://github.com/ravidziv/SimplifyingImbalancedTraining
⭐ 9
Last Checked
2 months ago
Abstract
Real-world datasets are often highly class-imbalanced, which can adversely impact the performance of deep learning models. The majority of research on training neural networks under class imbalance has focused on specialized loss functions, sampling techniques, or two-stage training procedures. Notably, we demonstrate that simply tuning existing components of standard deep learning pipelines, such as the batch size, data augmentation, optimizer, and label smoothing, can achieve state-of-the-art performance without any such specialized class imbalance methods. We also provide key prescriptions and considerations for training under class imbalance, and an understanding of why imbalance methods succeed or fail.
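Label smoothing is one of the standard pipeline components the abstract says can be tuned in place of specialized imbalance methods. A minimal NumPy sketch of smoothed targets and the resulting cross-entropy (the smoothing rate `eps` and the uniform split over non-true classes are illustrative defaults, not the paper's exact prescription):

```python
import numpy as np

def smooth_labels(y, num_classes, eps=0.1):
    """Turn integer labels into soft targets: the true class gets
    1 - eps and the remaining eps mass is spread uniformly over
    the other classes."""
    targets = np.full((len(y), num_classes), eps / (num_classes - 1))
    targets[np.arange(len(y)), y] = 1.0 - eps
    return targets

def cross_entropy(probs, targets):
    """Mean cross-entropy between predicted probabilities and
    (possibly soft) target distributions."""
    return float(-np.mean(np.sum(targets * np.log(probs + 1e-12), axis=1)))

# Illustrative comparison: on a confident, correct prediction the
# smoothed target yields a higher loss than the hard one-hot target,
# discouraging overconfidence on majority classes.
probs = np.array([[0.8, 0.1, 0.1]])
hard = smooth_labels(np.array([0]), 3, eps=0.0)
soft = smooth_labels(np.array([0]), 3, eps=0.1)
```

The same tuning mindset applies to the other knobs the abstract lists (batch size, augmentation strength, optimizer choice); this sketch only isolates the label-smoothing piece.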
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Machine Learning
XGBoost: A Scalable Tree Boosting System
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Semi-Supervised Classification with Graph Convolutional Networks
Proximal Policy Optimization Algorithms