Robust Correction of Sampling Bias using Cumulative Distribution Functions

Abstract

Varying domains and biased datasets can lead to differences between the training and the target distributions, known as covariate shift. Current approaches for alleviating this often rely on estimating the ratio of the training and target probability density functions. These techniques require parameter tuning and can be unstable across different datasets. We present a new method for handling covariate shift that uses empirical cumulative distribution function estimates of the target distribution, obtained through a rigorous generalization of a recent idea proposed by Vapnik and Izmailov. We further show experimentally that our method is more robust in its predictions, does not rely on parameter tuning, and achieves classification performance comparable to current state-of-the-art techniques on synthetic and real datasets.
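
To make the setting concrete, the sketch below illustrates the baseline the abstract contrasts against: importance weighting with an estimated density ratio p_target(x) / p_train(x), here obtained with kernel density estimates, together with a coordinate-wise empirical CDF of the target sample, the quantity the proposed method builds on. This is a minimal illustration under these assumptions, not the paper's algorithm; all function names and the toy data are hypothetical.

```python
# Minimal sketch, not the paper's method: density-ratio importance weighting
# (the baseline approach referenced in the abstract) plus a coordinate-wise
# empirical CDF of the target sample. Names and toy data are illustrative.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

def density_ratio_weights(X_train, X_target, eps=1e-12):
    """Estimate w(x) = p_target(x) / p_train(x) with Gaussian KDEs."""
    p_train = gaussian_kde(X_train.T)    # gaussian_kde expects (d, n) data
    p_target = gaussian_kde(X_target.T)
    return p_target(X_train.T) / (p_train(X_train.T) + eps)

def empirical_cdf(sample, points):
    """Coordinate-wise empirical CDF of `sample`, evaluated at `points`."""
    # F_hat_j(t) = (1/n) * #{ x in sample : x_j <= t }
    return (sample[None, :, :] <= points[:, None, :]).mean(axis=1)

# Toy covariate shift: target covariates are shifted relative to training.
rng = np.random.default_rng(0)
X_train = rng.normal(0.0, 1.0, size=(500, 2))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)
X_target = rng.normal(0.5, 1.0, size=(400, 2))   # unlabeled, shifted sample

weights = density_ratio_weights(X_train, X_target)
clf = LogisticRegression().fit(X_train, y_train, sample_weight=weights)

F_target = empirical_cdf(X_target, X_train)      # shape (500, 2), values in [0, 1]
```

The KDE-based ratio above is exactly the kind of density estimate that can be bandwidth-sensitive and unstable across datasets, which is the motivation the abstract gives for working with empirical CDF estimates of the target distribution instead.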

Publication
In Advances in Neural Information Processing Systems 33
Bijan Mazaheri
Postdoctoral Associate

My interests include mixture models, high-level data fusion, and stability to distribution shift, usually through the lens of causality.