https://medium.com/neuralmachine/knowledge-distillation-dc241d7c2322
Knowledge Distillation
Knowledge distillation is a model compression method in which a small model is trained to mimic a pretrained, larger model.
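To make the idea concrete, here is a minimal PyTorch-style sketch of the standard distillation loss: the student is trained on a mix of temperature-softened teacher outputs and the true hard labels. The function name and the hyperparameter values are illustrative assumptions, not taken from the linked article.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    # `temperature` and `alpha` are illustrative hyperparameters,
    # not values from the linked article.
    # Soften both distributions with the temperature, then match them.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # T^2 rescales the soft term so its gradient magnitude stays
    # comparable to the hard-label term as the temperature changes.
    soft_loss = F.kl_div(log_soft_student, soft_teacher,
                         reduction="batchmean") * temperature ** 2
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

In training, the teacher's logits are computed under `torch.no_grad()` and only the student's parameters receive gradients from this loss.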