Update README.md
README.md
CHANGED
@@ -104,7 +104,7 @@ The model was trained with the parameters:
  **2. distillation**
  - Teacher model: paraphrase-multilingual-mpnet-base-v2 (max_token_len: 128)
  - Corpus: news_talk_en_ko_train.tsv (English-Korean dialogue/news parallel corpus: 1.38M sentence pairs)
- - Param: **lr: 5e-5, epochs: 10, train_batch:
+ - Param: **lr: 5e-5, eps: 1e-8, epochs: 10, train_batch: 32, eval/test_batch: 64, max_token_len: 128 (matched to the teacher model's 128)**
  - See [here](https://github.com/kobongsoo/BERT/blob/master/sbert/sbert-distillaton.ipynb) for the training code

  **3. NLI**
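The distillation step above follows the usual multilingual knowledge-distillation recipe (as implemented in sentence-transformers via an MSE loss): the student is trained to reproduce the teacher's English sentence embedding for both sides of each English-Korean parallel pair. A minimal plain-Python sketch of that objective, with toy 4-dimensional vectors standing in for real 768-dimensional model outputs (the embeddings and names here are illustrative, not taken from the actual training notebook):

```python
# Sketch of the multilingual distillation objective: train the student so
# that student(en) ~ teacher(en) and student(ko) ~ teacher(en), i.e. MSE
# against the teacher's English embedding for both sides of a parallel pair.

def mse(a, b):
    """Mean squared error between two equal-length embedding vectors."""
    assert len(a) == len(b)
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def distillation_loss(teacher_en, student_en, student_ko):
    """Per-pair loss: student matches the teacher on both languages."""
    return mse(student_en, teacher_en) + mse(student_ko, teacher_en)

# Toy embeddings standing in for real model outputs.
teacher_en = [0.1, 0.2, 0.3, 0.4]
student_en = [0.1, 0.2, 0.3, 0.4]   # perfectly matched -> contributes 0
student_ko = [0.1, 0.2, 0.3, 0.0]   # off in the last dimension

loss = distillation_loss(teacher_en, student_en, student_ko)
print(round(loss, 4))  # 0.04 = (0.4 - 0.0)**2 / 4
```

Setting `max_token_len` to 128 on the student (as the parameters above do) keeps its input window aligned with the teacher's, so both models embed the same truncated text.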