Local Temperature Scaling
Deep segmentation networks typically produce their outputs via a softmax, so the label assignments appear to be probabilistic. However, it has been shown in the literature that interpreting these softmax outputs as label probabilities is not reliable and that they tend to be overly confident. A simple approach to counteract this phenomenon for classification problems is Temperature Scaling, which rescales the inputs to the softmax via a single global temperature constant so that the resulting outputs are consistent with the empirically observed labeling performance, i.e., the softmax outputs become calibrated probabilities. Our Local Temperature Scaling approach extends this principle to image segmentation. In particular, it allows for spatially localized probability calibration.
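The idea can be sketched in a few lines. Below is a minimal, illustrative example (not the repository's implementation): global temperature scaling divides all logits by one constant T, while a local variant assigns each spatial location its own temperature. The function names and the example values are hypothetical; a temperature T > 1 softens an overconfident softmax.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def temperature_scale(logits, T):
    # Global temperature scaling: divide all logits by a single constant T.
    return softmax([z / T for z in logits])

def local_temperature_scale(logit_map, temp_map):
    # Local variant: each spatial location i uses its own temperature T_i.
    # logit_map: per-pixel logit vectors; temp_map: per-pixel temperatures.
    return [temperature_scale(logits, T)
            for logits, T in zip(logit_map, temp_map)]

# Hypothetical overconfident logits for one pixel (3 classes):
logits = [4.0, 1.0, 0.5]
p_raw = softmax(logits)                  # sharply peaked distribution
p_cal = temperature_scale(logits, 2.0)   # T > 1 flattens the distribution
```

Here `max(p_cal)` is smaller than `max(p_raw)`, i.e., the calibrated prediction is less confident, while the predicted class (argmax) is unchanged because all logits at a pixel are divided by the same positive temperature.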
The Local Temperature Scaling repository can be found here: https://github.com/uncbiag/LTS