Identifying and Compensating for Feature Deviation in Imbalanced Deep Learning


Abstract

We investigate learning a ConvNet classifier with class-imbalanced data. We found that a ConvNet significantly over-fits the minor classes that do not have sufficient training instances, which is quite the opposite of a traditional machine learning model like logistic regression, which often under-fits minor classes. We conduct a series of analyses and argue that feature deviation between the training and test instances is the main cause. We propose to incorporate class-dependent temperatures (CDT) in learning a ConvNet: CDT forces the minor-class instances to have larger decision values in the training phase, so as to compensate for the effect of feature deviation in the test data. We validate our approach on several benchmark datasets and achieve promising performance. We hope that our insights can inspire new ways of thinking in resolving class-imbalanced deep learning.
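To illustrate the idea, here is a minimal NumPy sketch of a CDT-style training loss. The temperature form `a_c = (N_max / N_c) ** gamma` and the value of `gamma` are assumptions for illustration, not necessarily the exact formulation in the paper: each class's logit is divided by its temperature during training only, so minor classes (larger temperatures) must learn larger raw decision values to achieve the same training confidence.

```python
import numpy as np

def cdt_temperatures(class_counts, gamma=0.2):
    """Assumed temperature form: a_c = (N_max / N_c) ** gamma.

    Minor classes (small N_c) receive larger temperatures; the most
    frequent class gets a temperature of 1. `gamma` is a hypothetical
    hyperparameter controlling the strength of the compensation.
    """
    counts = np.asarray(class_counts, dtype=float)
    return (counts.max() / counts) ** gamma

def cdt_cross_entropy(logits, labels, temperatures):
    """Cross-entropy over temperature-scaled logits (training time only).

    At test time, predictions would use the raw, unscaled logits.
    """
    scaled = logits / temperatures                      # broadcast over classes
    scaled = scaled - scaled.max(axis=1, keepdims=True) # numerical stability
    log_probs = scaled - np.log(np.exp(scaled).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# Toy imbalanced setting: class 2 is the minor class.
counts = [1000, 100, 10]
temps = cdt_temperatures(counts)   # largest temperature for the minor class
logits = np.array([[2.0, 0.5, 0.1],
                   [0.3, 1.8, 0.2]])
labels = np.array([0, 1])
loss = cdt_cross_entropy(logits, labels, temps)
```

Because the minor class's logits are shrunk before the softmax, the network is pushed to produce larger decision values for that class during training, which is the compensation mechanism the abstract describes.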
