Seminar: Graduate Seminar


Adversarially Robust Conformal Prediction

Date: July 3, 2022 Time: 12:30 - 13:30
Location: 1061, Meyer Building
Lecturer: Asaf Gendler
Affiliations: The Andrew and Erna Viterbi Faculty of Electrical & Computer Engineering

Conformal prediction is a model-agnostic tool for constructing prediction sets that are valid under the common i.i.d. assumption, and it has been applied to quantify the prediction uncertainty of deep neural network classifiers. In this talk, we generalize this framework to the case where adversaries exist at inference time, under which the i.i.d. assumption is grossly violated. By combining conformal prediction with randomized smoothing, our proposed method forms a prediction set with a finite-sample coverage guarantee that holds for any data distribution with ℓ2-norm-bounded adversarial noise, generated by any adversarial attack algorithm. The core idea is to bound the Lipschitz constant of the non-conformity score by smoothing it with Gaussian noise, and to leverage this bound to account for the effect of the unknown adversarial perturbation. We demonstrate the necessity of our method in the adversarial setting and the validity of our theoretical guarantee on three widely used benchmark data sets: CIFAR10, CIFAR100, and ImageNet.
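The core idea above can be sketched in a few lines: estimate a Gaussian-smoothed non-conformity score by Monte Carlo, calibrate a split-conformal threshold, and inflate that threshold by a margin tied to the ℓ2 perturbation budget. This is a minimal illustrative sketch, not the talk's exact procedure: the toy linear "classifier", the score s(x, y) = 1 - p_y(x), and the margin constant 1/sigma are all assumptions made here for concreteness; the talk derives the precise correction from the smoothed score's Lipschitz bound.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class "classifier": softmax over a linear score (stand-in for a deep net).
def model_probs(x):
    z = np.array([x.sum(), -x.sum()])
    e = np.exp(z - z.max())
    return e / e.sum()

def smoothed_score(x, y, sigma=0.25, n_mc=64):
    """Monte Carlo estimate of the Gaussian-smoothed non-conformity score
    s(x, y) = 1 - p_y(x), averaged over noise ~ N(0, sigma^2 I)."""
    noise = rng.normal(0.0, sigma, size=(n_mc,) + x.shape)
    probs = np.mean([model_probs(x + eps) for eps in noise], axis=0)
    return 1.0 - probs[y]

# Synthetic calibration data: class 0 has positive-mean features, class 1 negative.
n_cal, dim, alpha, epsilon = 200, 5, 0.1, 0.125
labels = rng.integers(0, 2, n_cal)
X = rng.normal(0.0, 1.0, (n_cal, dim)) + np.where(labels[:, None] == 0, 0.5, -0.5)

cal_scores = np.array([smoothed_score(X[i], labels[i]) for i in range(n_cal)])

# Split-conformal quantile at the finite-sample-corrected level ...
level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
q = np.quantile(cal_scores, min(level, 1.0))

# ... inflated by a margin accounting for an l2 perturbation of size epsilon.
# (The constant 1/sigma is an assumed Lipschitz-style bound, for illustration.)
margin = epsilon / 0.25

def prediction_set(x):
    """All labels whose smoothed score clears the inflated threshold."""
    return [y for y in range(2) if smoothed_score(x, y) <= q + margin]

x_test = rng.normal(0.0, 1.0, dim) + 0.5  # class-0-like test point
print(prediction_set(x_test))
```

Without the margin this reduces to ordinary split conformal prediction; the inflation is what keeps coverage valid when the test input has been adversarially perturbed within the ℓ2 ball.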


* M.Sc. student under the supervision of Professor Yaniv Romano.




