Learning to Learn


Understanding Ranking Loss, Contrastive Loss, Margin Loss, Triplet Loss, Hinge Loss and all those confusing names

2019. 7. 26. 13:37

https://gombru.github.io/2019/04/03/ranking_loss/

 

Understanding Ranking Loss, Contrastive Loss, Margin Loss, Triplet Loss, Hinge Loss and all those confusing names

After the success of my post Understanding Categorical Cross-Entropy Loss, Binary Cross-Entropy Loss, Softmax Loss, Logistic Loss, Focal Loss and all those confusing names, and after checking that Triplet Loss outperforms Cross-Entropy Loss in my main rese…

gombru.github.io
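The linked article covers these losses in detail; as a quick reference here, this is a minimal plain-Python sketch of the triplet loss it discusses, assuming Euclidean distance and a fixed margin (the article also covers other distances and triplet-mining strategies, which are omitted):

```python
import math

def euclidean(u, v):
    # Euclidean distance between two equal-length vectors
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Hinge on the distance gap: the loss is zero once the negative
    # is farther from the anchor than the positive by at least `margin`.
    return max(0.0, euclidean(anchor, positive) - euclidean(anchor, negative) + margin)

# Negative already far enough: loss is zero
print(triplet_loss([0.0, 0.0], [0.1, 0.0], [3.0, 0.0]))  # 0.0
# Negative too close to the anchor: positive loss
print(triplet_loss([0.0, 0.0], [1.0, 0.0], [1.5, 0.0]))  # 0.5
```

In practice one would use `torch.nn.TripletMarginLoss` over batches of embeddings rather than single vectors; the sketch above only shows the per-triplet formula.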

 


