Quantifying Uncertainty in the Presence of Distribution Shifts
We propose a Bayesian framework for uncertainty estimation that explicitly accounts for covariate shifts. While conventional approaches rely on fixed priors, the key idea of our method is an adaptive prior conditioned on both the training covariates and the new covariates. This prior naturally increases uncertainty for inputs that lie far from the training distribution, in regions where predictive performance is likely to degrade. To efficiently approximate the resulting posterior predictive distribution, we employ amortized variational inference. For training, we simulate a range of plausible covariate shifts using only the original dataset, constructing synthetic environments by drawing small bootstrap samples from the training data.
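To make the two main ingredients concrete, here is a minimal Python/NumPy sketch: synthetic shift environments built from small bootstrap resamples of the training data, and a prior whose scale adapts to how far a new input lies from the training covariates. The function names (`make_bootstrap_environments`, `adaptive_prior_scale`) and the distance-based inflation rule are illustrative assumptions, not the paper's actual construction; in particular, the paper approximates the posterior predictive with amortized variational inference rather than a hand-set scale rule.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_bootstrap_environments(X, y, n_envs=20, env_size=64):
    """Simulate plausible covariate shifts using only the original dataset:
    each small bootstrap resample emphasizes a different region of
    covariate space and serves as one synthetic training environment."""
    envs = []
    for _ in range(n_envs):
        idx = rng.choice(len(X), size=env_size, replace=True)
        envs.append((X[idx], y[idx]))
    return envs

def adaptive_prior_scale(x_new, X_train, base_scale=1.0, rate=0.5):
    """Toy stand-in for an adaptive prior (assumption, not the paper's
    method): the prior scale, and hence predictive uncertainty, grows
    with the new input's distance to its nearest training point."""
    d = np.min(np.linalg.norm(X_train - x_new, axis=1))
    return base_scale * (1.0 + rate * d)

# Hypothetical usage: uncertainty inflates for an out-of-distribution input.
X_train = rng.normal(size=(200, 2))
y_train = X_train @ np.array([1.5, -0.7]) + 0.1 * rng.normal(size=200)

envs = make_bootstrap_environments(X_train, y_train)

x_in = np.array([0.1, 0.0])    # near the training distribution
x_out = np.array([6.0, -5.0])  # far from the training distribution
print(adaptive_prior_scale(x_in, X_train))   # close to the base scale
print(adaptive_prior_scale(x_out, X_train))  # substantially larger
```

In the paper's framework, the adaptive prior is learned jointly with the variational approximation across the bootstrap environments, so the uncertainty inflation is inferred from data rather than fixed by a distance rule as in this sketch.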
Slavutsky, Y., & Blei, D. M. (2025). "Quantifying Uncertainty in the Presence of Distribution Shifts." arXiv preprint.