All posts
- ‘model.eval()’ vs ‘with torch.no_grad()’ 2019.07.29
- Understanding Ranking Loss, Contrastive Loss, Margin Loss, Triplet Loss, Hinge Loss and all those confusing names 2019.07.26
- Docker 2019.07.25
- Contrastive loss 2019.07.25
- triplet loss 2019.07.24
- embedding 2019.07.24
- Distance Metric Learning 2019.07.24
- Cross entropy 2019.07.24
‘model.eval()’ vs ‘with torch.no_grad()’
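The two mechanisms are orthogonal and are usually combined at inference time: `model.eval()` switches modules such as `Dropout` and `BatchNorm` to their inference behavior, while `torch.no_grad()` turns off autograd tracking to save memory and compute. A minimal PyTorch sketch:

```python
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(4, 4),
    torch.nn.Dropout(p=0.5),
)
x = torch.ones(1, 4)

# model.eval(): Dropout becomes a no-op, but autograd still tracks the graph.
model.eval()
out = model(x)
print(out.requires_grad)  # True: gradients are still being tracked

# torch.no_grad(): autograd is off, but train-mode Dropout would still drop units.
model.train()
with torch.no_grad():
    out = model(x)
print(out.requires_grad)  # False

# Typical inference: use both.
model.eval()
with torch.no_grad():
    out = model(x)
```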
Understanding Ranking Loss, Contrastive Loss, Margin Loss, Triplet Loss, Hinge Loss and all those confusing names
https://gombru.github.io/2019/04/03/ranking_loss/
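All of these names are variants of one idea: instead of predicting a class, the model learns scores or distances such that a positive example beats a negative one by a margin. A minimal pure-Python sketch of a pairwise margin (hinge) ranking loss, with made-up scores:

```python
def margin_ranking_loss(pos_score, neg_score, margin=1.0):
    # Hinge on the score gap: zero once the positive outscores
    # the negative by at least `margin`, linear penalty otherwise.
    return max(0.0, margin - (pos_score - neg_score))

print(margin_ranking_loss(3.0, 0.5))  # gap 2.5 >= margin -> 0.0
print(margin_ranking_loss(1.0, 0.8))  # gap 0.2 < margin -> 0.8
```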
Other posts in the 'Dic' category
- LOO 2019.07.29
- Understanding Categorical Cross-Entropy Loss, Binary Cross-Entropy Loss, Softmax Loss, Logistic Loss, Focal Loss and all those confusing names 2019.07.29
- Contrastive loss 2019.07.25
- triplet loss 2019.07.24
- embedding 2019.07.24
Docker
List containers running on the server
- docker ps
List all containers, including stopped ones
- docker ps -a
List the images present on the server
- docker images
Detach from an attached container (leave it without stopping it)
- (Ctrl + P) then (Ctrl + Q)
Exit and stop the container
- exit
Start a stopped container
- docker container start container_name
Follow the logs of a detached container
- docker logs -f container_name
Attach to a detached container
- docker attach container_name
Restart a dead container
- docker restart container_name
Remove a container
- docker rm container_name
Options
- "-v local_dir:docker_dir" bind-mounts a local directory into the container
- "--name container_name" sets the container name
- "-it" runs the container interactively with a TTY
- "--gpus all" makes all host GPUs available inside the container
Typical invocation
- docker run -it -v local_dir:docker_dir docker_image command_to_run
(e.g. docker run -it --name docker_container -v /mnt/hdd0/temp/:/project/ jinhyeoplee/vclab-16.04-9.0-1.12-3.4.3:0.6 /bin/bash)
- docker run -it --shm-size=256m oracle11g /bin/bash
Other posts in the 'Docker' category
- How to build a Docker image from a Dockerfile 2021.10.14
- Commit and push the current Docker container 2021.05.07
- docker: insufficient shared memory (shm) error 2020.09.10
Contrastive loss
https://jdhao.github.io/2017/03/13/some_loss_and_explanations/
Some Loss Functions and Their Intuitive Explanations
Loss functions are frequently used in supervised machine learning to minimize the differences between the predicted output of the model and the ground truth labels.
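In the formulation of Hadsell, Chopra and LeCun, contrastive loss pulls similar pairs together and pushes dissimilar pairs apart until their distance exceeds a margin. A minimal pure-Python sketch:

```python
import math

def contrastive_loss(x1, x2, is_similar, margin=1.0):
    # Euclidean distance between the two embedding vectors.
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(x1, x2)))
    if is_similar:
        return 0.5 * d ** 2                 # pull similar pairs together
    return 0.5 * max(0.0, margin - d) ** 2  # push dissimilar pairs past the margin

print(contrastive_loss([0.0, 0.0], [3.0, 4.0], is_similar=False))  # d=5 > margin -> 0.0
print(contrastive_loss([0.0, 0.0], [0.0, 0.0], is_similar=True))   # identical -> 0.0
```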
triplet loss
https://wwiiiii.tistory.com/entry/Pairwise-Triplet-Loss
Pairwise / Triplet Loss
Pairwise ranking loss: imagine you have to recommend a movie to a friend. From the ratings and reviews of the movies the friend has already seen, you rank the many new candidate movies by how likely the friend is to watch them.
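Triplet loss extends the pairwise idea to (anchor, positive, negative) triples: the anchor should be closer to the positive than to the negative by at least a margin. A minimal pure-Python sketch:

```python
import math

def euclidean(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def triplet_loss(anchor, positive, negative, margin=0.2):
    # Zero once d(anchor, positive) + margin <= d(anchor, negative).
    return max(0.0, euclidean(anchor, positive) - euclidean(anchor, negative) + margin)

print(triplet_loss([0.0, 0.0], [0.0, 0.0], [1.0, 0.0]))  # constraint satisfied -> 0.0
```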
embedding
https://towardsdatascience.com/neural-network-embeddings-explained-4d028e6f0526
Neural Network Embeddings Explained
How deep learning can represent War and Peace as a vector
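An embedding is essentially a learned lookup table that maps discrete items (words, users, movies) to dense vectors; training nudges the vectors so that related items end up close together. A minimal sketch of the lookup idea, where the random values stand in for learned weights:

```python
import random

random.seed(42)
EMB_DIM = 4
vocab = ["war", "and", "peace"]

# item -> dense vector; in a real model these rows are trainable parameters
embedding_table = {w: [random.uniform(-1.0, 1.0) for _ in range(EMB_DIM)]
                   for w in vocab}

vec = embedding_table["peace"]
print(len(vec))  # 4: every item maps to a vector of the same dimension
```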
Distance Metric Learning
http://sanghyukchun.github.io/37/
Distance Metric Learning - README
In machine learning there are algorithms, such as KNN, whose results depend heavily on the distance metric chosen for the input data. Most of these methods use the Euclidean distance, but because this metric fundamentally captures only the relation between one data point and another, it is often a poor fit for data that actually comes from a distribution.
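Distance metric learning replaces the fixed Euclidean metric with a learned one, most commonly a Mahalanobis-style distance d_M(x, y)^2 = (x - y)^T M (x - y), where the positive semidefinite matrix M is fit to the data. A minimal pure-Python sketch (M here is hand-picked for illustration, not learned):

```python
def mahalanobis_sq(x, y, M):
    # (x - y)^T M (x - y); with M = identity this is squared Euclidean distance.
    diff = [a - b for a, b in zip(x, y)]
    n = len(diff)
    return sum(diff[i] * M[i][j] * diff[j] for i in range(n) for j in range(n))

identity = [[1.0, 0.0], [0.0, 1.0]]
print(mahalanobis_sq([0.0, 0.0], [3.0, 4.0], identity))  # 25.0, squared Euclidean

# Reweighting a coordinate changes what "close" means.
M = [[4.0, 0.0], [0.0, 1.0]]
print(mahalanobis_sq([0.0, 0.0], [3.0, 4.0], M))  # 4*9 + 16 = 52.0
```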
Cross entropy
https://curt-park.github.io/2018-09-19/loss-cross-entropy/
[Loss functions] Binary Cross Entropy
A look at binary cross-entropy from the perspectives of probability and information theory
https://3months.tistory.com/436
Understanding cross-entropy: its relation to information theory
In deep learning, classification models use cross-entropy (also called binary cross-entropy or log loss) as their loss function.
https://machinelearningmastery.com/cross-entropy-for-machine-learning/
A Gentle Introduction to Cross-Entropy for Machine Learning
Cross-entropy is commonly used in machine learning as a loss function. It is a measure from the field of information theory, building upon entropy and generally calculating the difference between two probability distributions.
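The links above all arrive at the same formula: for binary labels, cross-entropy is the average negative log-likelihood of the true labels under the predicted probabilities. A minimal pure-Python sketch:

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # -1/N * sum( t*log(p) + (1-t)*log(1-p) ); eps guards against log(0)
    return -sum(t * math.log(p + eps) + (1 - t) * math.log(1 - p + eps)
                for t, p in zip(y_true, y_pred)) / len(y_true)

print(round(binary_cross_entropy([1, 0], [0.9, 0.1]), 4))  # confident & correct -> 0.1054
print(round(binary_cross_entropy([1, 0], [0.1, 0.9]), 4))  # confident & wrong -> 2.3026
```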