All You Need Is Conformal Prediction
A powerful yet easy-to-use tool for uncertainty quantification that every data scientist should know.
We must know how certain a model is when it makes predictions, because wrong predictions carry risk. Without quantifying the model’s uncertainty, an accurate prediction and a wild guess look the same. A self-driving car, for example, must be certain that its planned path is free of obstacles. I have written about this in another article.
But how can we quantify the uncertainty of our model?
This is where Conformal Prediction comes into the picture. Conformal Prediction is a framework for uncertainty quantification that can turn any point prediction into a statistically valid prediction region. The region is a set of classes for classification problems or an interval for regression problems.
How do we turn a point forecast into a prediction region using Conformal Prediction?
Conformal Prediction uses past experience to determine the uncertainty of new predictions. To apply Conformal Prediction, we need a non-conformity score, a significance level alpha, and a calibration set.
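To make these three ingredients concrete, here is a minimal sketch of split conformal prediction for a classification problem. The "model probabilities" are simulated rather than produced by a real classifier, and the non-conformity score (one minus the probability of the true class) is just one common choice:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated setup: a 3-class "model" that outputs softmax-like probabilities
# which are usually highest for the true class (stand-in for a real classifier).
n_cal, n_classes = 1000, 3
cal_labels = rng.integers(0, n_classes, size=n_cal)
cal_probs = rng.dirichlet(alpha=np.ones(n_classes), size=n_cal)
boost = np.zeros_like(cal_probs)
boost[np.arange(n_cal), cal_labels] = 2.0  # make the true class likely
cal_probs = (cal_probs + boost) / (cal_probs + boost).sum(axis=1, keepdims=True)

# Significance level: we want prediction sets that cover the true class
# in roughly 90% of cases.
alpha = 0.1

# Non-conformity score on the calibration set:
# 1 minus the probability the model assigns to the true class.
scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Threshold: the (1 - alpha) quantile of the calibration scores,
# with a finite-sample correction of (n + 1) / n.
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
q_hat = np.quantile(scores, q_level, method="higher")

# Prediction set for a new example: every class whose score is below
# the threshold, i.e. every class the model deems sufficiently plausible.
new_probs = np.array([0.80, 0.15, 0.05])  # hypothetical model output
prediction_set = np.where(1.0 - new_probs <= q_hat)[0]
print(prediction_set)
```

The key idea visible in the sketch: the calibration set tells us how large the non-conformity scores typically get, and the threshold `q_hat` turns that past experience into a rule for including or excluding classes for a new prediction.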