Learning to Learn


Attention is all you need

2019. 9. 30. 20:42

https://mlexplained.com/2017/12/29/attention-is-all-you-need-explained/

 

Paper Dissected: “Attention is All You Need” Explained

“Attention is All You Need” is an influential paper with a catchy title that fundamentally changed the field of machine translation. Previously, RNNs were regarded as the go-to archite…

mlexplained.com

 

http://jalammar.github.io/illustrated-transformer/

 

The Illustrated Transformer

Discussions: Hacker News (65 points, 4 comments), Reddit r/MachineLearning (29 points, 3 comments) Translations: Chinese (Simplified), Korean Watch: MIT’s Deep Learning State of the Art lecture referencing this post In the previous post, we looked at Atten

jalammar.github.io
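The core operation both posts walk through is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal pure-Python sketch of that single formula (illustrative only; the paper's full model uses multiple heads, learned projections, and batched tensor math):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of row vectors (lists of floats);
    each query attends over every key/value pair.
    """
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query against every key, scaled by sqrt(d_k)
        # so dot products don't grow with dimension.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)          # attention weights sum to 1
        # Output is the attention-weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

For example, a query aligned with the first key pulls its output mostly from the first value vector, which is the "soft lookup" intuition the illustrated post develops.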

 


