Loss function when the output is a single probability

I have a regression problem where the output y is a single probability, i.e. a real number in the interval [0, 1].

While using L1 or L2 loss will very likely work well, I feel they are not the most appropriate options, given that the range [0, 1] is already well defined.

Is Binary Cross Entropy (`BCELoss` in PyTorch) the most appropriate loss in this case?

Topic pytorch loss-function regression

Category Data Science


Predicting probabilities can be framed as beta regression.

That is a separate issue from adding a regularization term (i.e., L1 or L2 regularization), which is not the same thing as using L1 or L2 loss.
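For example, one way to set this up in PyTorch (just a sketch; the two-parameter head and the `softplus` link are my own assumptions, not something from your setup) is to have the network output the Beta parameters and minimise the negative log-likelihood:

```python
import torch
import torch.nn as nn


class BetaNLLLoss(nn.Module):
    """Negative log-likelihood of a Beta distribution.

    The network is assumed to output two positive parameters (alpha, beta).
    `eps` clamps targets away from the boundaries, since the Beta log-density
    is undefined at exactly 0 or 1.
    """

    def __init__(self, eps: float = 1e-6):
        super().__init__()
        self.eps = eps

    def forward(self, alpha, beta, target):
        target = target.clamp(self.eps, 1 - self.eps)
        dist = torch.distributions.Beta(alpha, beta)
        return -dist.log_prob(target).mean()


# Hypothetical usage: a linear head producing (alpha, beta) via softplus
# head = nn.Linear(hidden_dim, 2)
# alpha, beta = nn.functional.softplus(head(features)).unbind(dim=-1)
# loss = BetaNLLLoss()(alpha, beta, y)
```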


At first I was going to say:

It doesn't make sense to use cross entropy loss in a regression problem!

See explanation here.

But then I realised that if you are really trying to do regression on probabilities, it could make some sense.

But still, why would you use it instead of L1 or L2? Maybe try it and let me know if it works better!
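If you do want to try BCE, note that it is a sensible choice for soft targets: BCE on a target p is minimised when the predicted probability equals p. PyTorch's `BCELoss` / `BCEWithLogitsLoss` accept continuous targets in [0, 1], so something like the following sketch (the toy model and shapes are made up for illustration) works directly:

```python
import torch
import torch.nn as nn

# The model emits raw scores (logits); BCEWithLogitsLoss applies the sigmoid
# internally and is numerically more stable than sigmoid followed by BCELoss.
model = nn.Linear(10, 1)          # toy model for illustration
criterion = nn.BCEWithLogitsLoss()

x = torch.randn(32, 10)
y = torch.rand(32, 1)             # soft targets in [0, 1]

logits = model(x)
loss = criterion(logits, y)       # BCE accepts continuous targets, not just 0/1
loss.backward()
```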
