Three Easy Steps to Understand Conformal Prediction (CP), Conformity Score, Python Implementation
* Conformal prediction is a framework for quantifying uncertainty in the predictions made by arbitrary machine learning algorithms
* At its core, conformal prediction uses simple statistical principles, namely order statistics of conformity scores on held-out data, to produce prediction sets with a finite-sample coverage guarantee
* It does *not* rely on specific modeling assumptions, which makes it broadly applicable across machine learning models
* In this notebook, we discuss the three main ingredients of conformal prediction (CP): order statistics, the calibration set, and the conformity score (see the first code sketch after this list)
* We also discuss the difference between the i.i.d. assumption and exchangeability: i.i.d. data are automatically exchangeable, but CP only needs the weaker exchangeability condition to hold
* We provide a Python implementation for building prediction sets with softmax scores and for Jackknife+ prediction intervals, a full-conformal-style method that needs no separate calibration set (both sketched below)
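To make the three ingredients concrete, here is a minimal sketch of split conformal prediction for classification: the conformity score is one minus the softmax probability of the true class, the calibration set supplies those scores, and an order statistic (the ceil((n+1)(1-alpha))/n empirical quantile) sets the inclusion threshold. The synthetic dataset, the LogisticRegression model, and alpha = 0.1 below are illustrative assumptions, not the notebook's exact code.

```python
# Minimal sketch of split conformal prediction with softmax scores.
# Dataset, model, and alpha are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

alpha = 0.1  # target miscoverage: sets should cover the true label >= 90%

X, y = make_classification(n_samples=2000, n_classes=3, n_informative=6,
                           random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5,
                                                    random_state=0)
X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest,
                                                test_size=0.5, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Conformity score on the calibration set: 1 - softmax probability of the
# true class (larger score = worse fit to the model).
cal_probs = model.predict_proba(X_cal)
cal_scores = 1.0 - cal_probs[np.arange(len(y_cal)), y_cal]

# Order statistics: the ceil((n+1)(1-alpha))/n empirical quantile of the
# calibration scores becomes the threshold q_hat.
n = len(cal_scores)
q_level = np.ceil((n + 1) * (1 - alpha)) / n
q_hat = np.quantile(cal_scores, q_level, method="higher")

# Prediction set for each test point: all classes whose score is <= q_hat,
# i.e., whose softmax probability is >= 1 - q_hat.
test_probs = model.predict_proba(X_test)
prediction_sets = test_probs >= 1.0 - q_hat  # boolean (n_test, n_classes)

coverage = prediction_sets[np.arange(len(y_test)), y_test].mean()
print(f"Empirical coverage: {coverage:.3f} (target {1 - alpha:.2f})")
```

Because the threshold is just one quantile of the calibration scores, this wrapper works with any model that outputs class probabilities, with no retraining.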
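Jackknife+ (Barber et al., 2021) removes the held-out calibration set by refitting the model n times, once per leave-one-out fold, and building intervals from the leave-one-out residuals. A minimal sketch for regression, assuming a toy 1-D dataset and a Ridge model (both illustrative choices):

```python
# Minimal sketch of Jackknife+ prediction intervals for regression.
# Data, model, and alpha are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
alpha = 0.1

# Toy 1-D regression data (assumed for illustration).
n = 100
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + 0.3 * rng.standard_normal(n)
X_test = np.linspace(-3, 3, 5).reshape(-1, 1)

# Leave-one-out loop: refit the model n times, storing each held-out
# absolute residual and each LOO model's predictions on the test points.
loo_resid = np.empty(n)
loo_test_pred = np.empty((n, len(X_test)))
for i in range(n):
    mask = np.arange(n) != i
    model = Ridge().fit(X[mask], y[mask])
    loo_resid[i] = np.abs(y[i] - model.predict(X[i:i + 1])[0])
    loo_test_pred[i] = model.predict(X_test)

# Jackknife+ interval: order statistics of {pred_i - R_i} and {pred_i + R_i}
# across the n leave-one-out models.
lower = np.quantile(loo_test_pred - loo_resid[:, None],
                    np.floor(alpha * (n + 1)) / n, axis=0, method="lower")
upper = np.quantile(loo_test_pred + loo_resid[:, None],
                    np.ceil((1 - alpha) * (n + 1)) / n, axis=0, method="higher")

for x, lo, hi in zip(X_test[:, 0], lower, upper):
    print(f"x = {x:+.2f}: [{lo:.3f}, {hi:.3f}]")
```

The price of skipping a calibration split is n model refits, so Jackknife+ is best suited to models that are cheap to retrain.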
#conformalprediction #machinelearning #neuralnetworks
Link for the notebook: https://github.com/farhad-pourkamali/YouTube/blob/main/conformal_prediction.ipynb