My model is too confident. How can I fix it?

I've got a classification model trained on ~50k examples, but when I look at the test results, it is too confident: almost every prediction comes out at 99% confidence, whether the prediction is right or wrong. Regularization techniques didn't help. How can I fix this? I can't find a usable decision threshold for production.

1 Answer

Try label smoothing. If your model is too confident, label smoothing lowers the prediction confidence by adding noise to the labels. Look at the loss function for your classification model: because you use one-hot targets, the loss rewards pushing the predicted probability of the true class all the way to 1, which is what produces those 99% outputs. Label smoothing mixes a little probability mass from a uniform distribution into each target, controlled by a hyperparameter.
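A minimal NumPy sketch of what label smoothing does to the targets (the smoothing factor `eps=0.1` and the 3-class example are illustrative, not tuned values):

```python
import numpy as np

def smooth_labels(y_onehot, eps=0.1):
    """Label smoothing: mix one-hot targets with the uniform distribution.

    Each target row becomes (1 - eps) * one_hot + eps / K, where K is the
    number of classes, so the model is penalized for pushing predicted
    probabilities all the way to 0 or 1.
    """
    k = y_onehot.shape[-1]                    # number of classes K
    return (1.0 - eps) * y_onehot + eps / k

# Example: two samples with labels 0 and 2 in a 3-class problem.
y = np.eye(3)[[0, 2]]
print(smooth_labels(y, eps=0.1))
# The true class gets 0.9 + 0.1/3 ≈ 0.933; each other class gets 0.1/3 ≈ 0.033,
# and every row still sums to 1.
```

In practice you usually don't smooth the labels by hand: most frameworks expose it as a loss option, e.g. PyTorch's `torch.nn.CrossEntropyLoss(label_smoothing=0.1)`.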
