Document Type

Article

Publication Date

2022

Publication Source

Proceedings of the 2021 Undergraduate Mathematics Day

Volume

7

Inclusive pages

1-12

Abstract

There are many schools of statistical inference in use today: frequentist, Bayesian, fiducial, and others. Vovk introduced a newer framework known as conformal prediction, designed to weaken the assumptions of standard prediction methods. Instead of assuming all observations are drawn independently and identically distributed, we assume only exchangeability: all N! possible orderings of our N observations are equally likely. This makes the framework more applicable to fields such as machine learning, where stronger assumptions may not be easily satisfied. For binary classification, Vovk provided the nearest neighbors (NN) nonconformity measure, a ratio of in-class to out-of-class distance. Papadopoulos later introduced normalizing constants for NN in the regression case; we extend this work to the classification case. We provide an asymptotic guarantee confirming what is known empirically: normalized NN produces smaller confidence sets on average than standard NN. A small synthetic simulation is also presented to show viability in a non-asymptotic case.
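The NN-based conformal procedure described above can be illustrated with a minimal sketch. This is not the authors' implementation, only a plain (unnormalized) transductive conformal predictor: for each candidate label, the new example is provisionally given that label, every example's NN nonconformity score (nearest same-class distance over nearest other-class distance) is computed, and the label is kept in the prediction set when its p-value exceeds the significance level. The data and parameter names here are illustrative.

```python
import numpy as np

def nn_nonconformity(X, y, i):
    """NN nonconformity score for example i: distance to the nearest
    same-class point divided by distance to the nearest other-class
    point (a large score means i conforms poorly to its label)."""
    d = np.linalg.norm(X - X[i], axis=1)
    d[i] = np.inf  # exclude the point itself
    return d[y == y[i]].min() / d[y != y[i]].min()

def conformal_set(X_train, y_train, x_new, alpha=0.1):
    """Conformal prediction set for x_new at significance level alpha.

    Each candidate label is tested by augmenting the training set with
    (x_new, label), recomputing all nonconformity scores, and comparing
    the new example's score to the rest (a conformal p-value).
    """
    pred = []
    for lab in np.unique(y_train):
        X = np.vstack([X_train, x_new])
        y = np.append(y_train, lab)
        scores = np.array([nn_nonconformity(X, y, i) for i in range(len(y))])
        # p-value: fraction of scores at least as large as the new example's
        p = (scores >= scores[-1]).mean()
        if p > alpha:
            pred.append(lab)
    return pred
```

On a toy two-cluster dataset, a point near one cluster yields a prediction set containing only that cluster's label once the significance level exceeds the minimum attainable p-value, which is 1/(N+1) for N training points.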

Keywords

Efficiency, conformal predictions, set prediction, asymptotic, confidence sets

Disciplines

Mathematics

Comments

Presented at University of Dayton Undergraduate Mathematics Day Nov. 6, 2021.



