Some Loss Functions and Their Intuitive Explanations
https://jdhao.github.io/2017/03/13/some_loss_and_explanations/

Loss functions are frequently used in supervised machine learning to minimize the difference between a model's predicted output and the ground-truth labels. In other words, a loss function measures how well the model predicts the true class of a sample.
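
To make the idea concrete, here is a minimal NumPy sketch of cross-entropy loss on a single sample (the function name and the numbers are made up for illustration, not taken from the linked article): the loss is small when the model assigns high probability to the true class and grows as that probability shrinks.

```python
import numpy as np

def cross_entropy(probs, true_class):
    """Cross-entropy loss for one sample: the negative log of the
    probability the model assigns to the ground-truth class."""
    return -np.log(probs[true_class])

# Hypothetical softmax output of a model over 3 classes.
probs = np.array([0.1, 0.7, 0.2])

print(cross_entropy(probs, true_class=1))  # ~0.357: correct class favoured -> small loss
print(cross_entropy(probs, true_class=0))  # ~2.303: wrong class favoured -> large loss
```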