Thesis
SEP Papers
Machine Learning Papers
- Decoupling Representation and Classifier for Long-Tailed Recognition
  - Includes a good overview of instance-balanced sampling, class-balanced sampling, square-root sampling, and progressively-balanced sampling.
  - Also a good overview of classifier re-training (cRT), the nearest class mean classifier (NCM), the tau-normalized classifier (sketched below), and learnable weight scaling (LWS).
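A minimal PyTorch sketch of the tau-normalized classifier (the function name and the example tau value are mine): each class's weight vector is divided by its L2 norm raised to the power tau, which shrinks the larger norms that head classes tend to acquire during training.

```python
import torch

def tau_normalize(weight: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Rescale each class's classifier weights by its L2 norm raised to tau.

    weight: (num_classes, feat_dim) matrix of a trained linear classifier.
    tau=0 leaves the weights unchanged; tau=1 fully equalizes the norms.
    """
    norms = weight.norm(p=2, dim=1, keepdim=True)  # per-class weight norms
    return weight / norms.pow(tau)

# Usage after training on a long-tailed dataset:
# classifier.weight.data = tau_normalize(classifier.weight.data, tau=0.7)
```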
- Distribution Alignment: A Unified Framework for Long-tail Visual Recognition
  - Learns the representation first, then freezes those weights and calibrates the class scores using reweighting.
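Not the paper's exact alignment procedure, but the general flavor of post-hoc score calibration can be sketched as a class-prior adjustment on the frozen model's logits (the class_counts input and the subtraction form are my illustrative choices):

```python
import torch

def calibrate_logits(logits: torch.Tensor, class_counts: torch.Tensor) -> torch.Tensor:
    """Subtract the log class prior from frozen-model logits so frequent
    classes no longer dominate the calibrated scores."""
    log_prior = torch.log(class_counts.float() / class_counts.sum())
    return logits - log_prior  # boosts tail classes, suppresses head classes

# counts = torch.tensor([5000, 500, 50])  # long-tailed class frequencies
# calibrated = calibrate_logits(model(x), counts)
```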
- BBN: Bilateral-Branch Network with Cumulative Learning for Long-Tailed Visual Recognition
  - Multi-branch learning: representation learning and classification are separated into two branches trained on differently sampled examples.
  - The conventional learning branch uses uniform sampling over the original imbalanced distribution; the re-balancing branch uses a reversed sampler that favors tail classes. The two branches are mixed with a cumulative-learning weight, sketched below.
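A sketch of the cumulative-learning combination, assuming the parabolic decay schedule I recall from the paper (function names are mine):

```python
def cumulative_alpha(epoch: int, total_epochs: int) -> float:
    """Mixing weight that decays from 1 to 0: early training favors the
    conventional branch (representation learning), late training favors
    the re-balancing branch (classifier re-balancing)."""
    return 1.0 - (epoch / total_epochs) ** 2

def bbn_output(conv_logits, rebal_logits, alpha: float):
    # Weighted combination of the two branches' logits.
    return alpha * conv_logits + (1.0 - alpha) * rebal_logits
```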
- Contrastive Learning based Hybrid Networks for Long-Tailed Image Classification by Peng Wang et al.
  - Hybrid network structure: feature learning and the classifier are separated but share a backbone network that produces a representation r.
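A minimal sketch of that structure in PyTorch (layer sizes and head designs are illustrative, not the paper's): the shared backbone produces r, which feeds both a projection head for the contrastive loss and a linear classifier for cross-entropy.

```python
import torch.nn as nn

class HybridNet(nn.Module):
    """Shared backbone with two heads: a projection head trained with a
    (supervised) contrastive loss and a linear classifier trained with
    cross-entropy."""
    def __init__(self, backbone: nn.Module, feat_dim=512, proj_dim=128, num_classes=100):
        super().__init__()
        self.backbone = backbone  # produces the representation r
        self.proj_head = nn.Sequential(
            nn.Linear(feat_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, proj_dim),
        )
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        r = self.backbone(x)
        return self.proj_head(r), self.classifier(r)  # (contrastive feat, logits)
```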
- Improving Calibration for Long-Tailed Recognition by Zhisheng Zhong et al.
  - Introduces label-aware smoothing to mitigate the over-confidence that standard training produces on long-tailed data.
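A sketch of label-aware smoothing, assuming the smoothing factor grows with class frequency (the paper's intuition is that over-confidence is worst on head classes; the linear schedule here is my simplification of their scheme):

```python
import torch

def label_aware_smoothing_targets(labels, class_counts, num_classes,
                                  eps_min=0.0, eps_max=0.1):
    """Per-class label smoothing: classes with more training samples get a
    larger smoothing factor. Returns soft targets of shape (batch, classes)."""
    counts = class_counts.float()
    frac = (counts - counts.min()) / (counts.max() - counts.min() + 1e-12)
    eps_per_class = eps_min + (eps_max - eps_min) * frac
    eps = eps_per_class[labels].unsqueeze(1)           # (batch, 1)
    targets = torch.zeros(labels.size(0), num_classes)
    targets += eps / num_classes                       # spread eps mass uniformly
    targets.scatter_add_(1, labels.unsqueeze(1), 1.0 - eps)
    return targets
```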
- SMOTEBoost for Regression: Improving the Prediction of Extreme Values by Nuno Moniz et al.
  - The SMOTE part is an approach for generating synthetic examples by interpolating between existing ones (see the sketch after this entry).
  - The Boost part is an ensemble method that converts a weak learning algorithm into one that achieves high accuracy by iteratively reweighting the training examples.
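A sketch of the SMOTE-for-regression interpolation step (the neighbor count and the unweighted target interpolation are simplifications; some variants weight the target interpolation by distance):

```python
import numpy as np

def smote_regression_sample(X, y, idx, k=5, rng=None):
    """Create one synthetic example by interpolating between sample `idx`
    and one of its k nearest neighbors, interpolating the target with the
    same random factor."""
    rng = rng or np.random.default_rng()
    dists = np.linalg.norm(X - X[idx], axis=1)
    neighbors = np.argsort(dists)[1:k + 1]  # skip the sample itself
    nn = rng.choice(neighbors)
    lam = rng.random()                      # interpolation factor in [0, 1)
    x_new = X[idx] + lam * (X[nn] - X[idx])
    y_new = y[idx] + lam * (y[nn] - y[idx])
    return x_new, y_new
```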
- Delving into Deep Imbalanced Regression by Yuzhe Yang et al.
  - All about smoothing: label distribution smoothing (LDS) smooths the empirical label density with a kernel to obtain reweighting factors, and feature distribution smoothing (FDS) calibrates feature statistics across nearby target bins. An LDS sketch follows below.
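A minimal LDS sketch (the bin count and kernel width are illustrative): smooth the label histogram with a Gaussian kernel, then weight each sample by the inverse of its bin's smoothed density.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def lds_weights(labels, num_bins=100, sigma=2.0):
    """Label distribution smoothing: inverse of the kernel-smoothed label
    density, normalized to mean 1."""
    hist, edges = np.histogram(labels, bins=num_bins)
    smoothed = gaussian_filter1d(hist.astype(float), sigma=sigma)
    bin_idx = np.clip(np.digitize(labels, edges[1:-1]), 0, num_bins - 1)
    weights = 1.0 / np.maximum(smoothed[bin_idx], 1e-12)
    return weights / weights.mean()
```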
- Density-based Weighting for Imbalanced Regression by Michael Steininger et al.
  - Proposes a sample weighting approach called DenseWeight, which they incorporate into a cost-sensitive learning approach called DenseLoss.
  - Both are meant to be used with imbalanced regression datasets.
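A DenseWeight-style sketch, assuming the max(1 - alpha * normalized_density, eps) form I recall from the paper (the KDE bandwidth is left at the library default):

```python
import numpy as np
from scipy.stats import gaussian_kde

def dense_weight(y, alpha=1.0, eps=1e-6):
    """Down-weight samples in dense (common) target regions; alpha controls
    the strength and alpha=0 gives uniform weights."""
    density = gaussian_kde(y)(y)  # KDE evaluated at each target value
    p = (density - density.min()) / (density.max() - density.min() + 1e-12)
    w = np.maximum(1.0 - alpha * p, eps)  # rare targets keep weight near 1
    return w / w.mean()                   # normalize to mean 1
```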
- RankSim: Ranking Similarity Regularization for Deep Imbalanced Regression by Yu Gong et al.
  - The problem space is imbalanced datasets with a regression target. Their idea: for a given sample, the sorted list of its neighbors in label space should match the sorted list of its neighbors in feature space.
  - It's worth noting that the regularizer does not itself address the imbalance, so it is applied on top of existing imbalanced-regression methods, including Focal-R, RRT, and square-root inverse-frequency reweighting (SQINV). A simplified sketch of the ranking objective follows below.
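A simplified illustration of the RankSim objective using hard (non-differentiable) ranks; the actual method uses a differentiable ranking operator so the penalty can be backpropagated:

```python
import torch

def ranksim_penalty(features, labels):
    """For each anchor, compare the neighbor ordering induced by label
    distance with the ordering induced by feature similarity. Normalize
    `features` beforehand if cosine similarity is desired."""
    feat_sim = features @ features.T
    label_sim = -(labels.view(-1, 1) - labels.view(1, -1)).abs()
    feat_rank = feat_sim.argsort(dim=1).argsort(dim=1).float()
    label_rank = label_sim.argsort(dim=1).argsort(dim=1).float()
    return ((feat_rank - label_rank) ** 2).mean()  # rank disagreement
```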
- Balanced MSE for Imbalanced Visual Regression by Jiawei Ren et al.
  - Introduces Balanced Mean Squared Error (Balanced MSE) as a replacement for the widely used Mean Squared Error (MSE) in imbalanced regression.
  - MSE is found to perform worse on rare target values because the imbalanced training distribution biases predictions toward common targets.
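A sketch of the batch-based Monte Carlo (BMC) form of Balanced MSE, as I understand it from the paper: each prediction is scored against every target in the batch, and a cross-entropy over those pairwise scores implicitly rebalances by the batch's target distribution (noise_var is a hyperparameter the paper can also learn):

```python
import torch
import torch.nn.functional as F

def balanced_mse_bmc(pred, target, noise_var=1.0):
    """pred and target have shape (batch, 1); each prediction should be
    closer to its own target than to the other targets in the batch."""
    logits = -(pred - target.T).pow(2) / (2 * noise_var)  # (batch, batch)
    labels = torch.arange(pred.shape[0], device=pred.device)
    return F.cross_entropy(logits, labels) * (2 * noise_var)
```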
- "Why Should I Trust You?" Explaining the Predictions of Any Classifier by Ribeiro et al.
  - They outline solutions for "trusting a prediction" and "trusting a model".
  - Local Interpretable Model-agnostic Explanations (LIME) is introduced to identify an interpretable model, over an interpretable representation, that is locally faithful to the classifier (a simplified sketch follows at the end of this entry).
  - For "trusting a model", they take a more global view by explaining a representative set of individual instances. To establish global importance I, they want features that are present in many instances to have higher importance.
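A LIME-style sketch for a tabular sample (the paper uses sparse linear models and task-specific interpretable representations; Gaussian perturbations and plain ridge regression are my simplifications):

```python
import numpy as np
from sklearn.linear_model import Ridge

def lime_style_explanation(predict_fn, x, num_samples=1000,
                           kernel_width=0.75, rng=None):
    """Perturb x, weight perturbations by proximity to x, and fit a
    weighted linear surrogate whose coefficients are the explanation."""
    rng = rng or np.random.default_rng()
    Z = x + rng.normal(scale=0.5, size=(num_samples, x.shape[0]))
    preds = predict_fn(Z)                                # black-box outputs
    dists = np.linalg.norm(Z - x, axis=1)
    weights = np.exp(-(dists ** 2) / kernel_width ** 2)  # locality kernel
    surrogate = Ridge(alpha=1.0).fit(Z, preds, sample_weight=weights)
    return surrogate.coef_                               # per-feature importance
```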