
Understanding Categorical Cross-Entropy Loss, Binary Cross-Entropy Loss, Softmax Loss, Logistic Loss, Focal Loss and all those confusing names

2019. 7. 29. 14:25

https://gombru.github.io/2018/05/23/cross_entropy_loss/

 

Understanding Categorical Cross-Entropy Loss, Binary Cross-Entropy Loss, Softmax Loss, Logistic Loss, Focal Loss and all those confusing names

People like to use cool names which are often confusing. When I started playing with CNN beyond single label classification, I got confused with the different names and formulations people write in their papers, and even with the loss layer names of the deep learning frameworks.

gombru.github.io
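
As a quick reference alongside the link, here is a minimal PyTorch sketch (my own illustration, not code from the linked article) of how these names map to standard calls: categorical cross-entropy ("softmax loss") is softmax followed by negative log-likelihood, binary cross-entropy ("logistic loss") applies a sigmoid per output, and focal loss is binary cross-entropy down-weighted by (1 - p_t)^gamma. The tensor shapes and the gamma value are illustrative only.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Multi-class case: "softmax loss" / categorical cross-entropy.
logits = torch.randn(4, 3)            # raw scores for 4 samples, 3 classes
targets = torch.tensor([0, 2, 1, 2])  # integer class labels

ce = F.cross_entropy(logits, targets)  # softmax + negative log-likelihood in one call

# Same thing written out: log-softmax, then pick the log-probability of the target class.
log_probs = F.log_softmax(logits, dim=1)
ce_manual = -log_probs[torch.arange(4), targets].mean()
assert torch.allclose(ce, ce_manual)

# Binary / multi-label case: "logistic loss" / binary cross-entropy, one sigmoid per output.
bin_logits = torch.randn(4, 3)
bin_targets = torch.randint(0, 2, (4, 3)).float()
bce = F.binary_cross_entropy_with_logits(bin_logits, bin_targets)

# Focal loss: binary cross-entropy weighted by (1 - p_t)**gamma so easy examples
# contribute less; gamma = 2.0 is just the commonly used default, not a fixed rule.
gamma = 2.0
p = torch.sigmoid(bin_logits)
p_t = p * bin_targets + (1 - p) * (1 - bin_targets)
per_elem_bce = F.binary_cross_entropy_with_logits(bin_logits, bin_targets, reduction="none")
focal = ((1 - p_t) ** gamma * per_elem_bce).mean()
```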

 


Other posts in the 'Dic' category

Mahalanobis distance  (0) 2019.07.29
LOO  (0) 2019.07.29
Understanding Ranking Loss, Contrastive Loss, Margin Loss, Triplet Loss, Hinge Loss and all those confusing names  (0) 2019.07.26
Contrastive loss  (0) 2019.07.25
triplet loss  (0) 2019.07.24
