Commit 57ae27c
Parent(s): 966ec27
Change BOS token from 0 to 2 as BOS token is equal to EOS for OPT. See: https://github.com/huggingface/transformers/issues/17431 (#2)
Original commit: 4fa49576a9456c8140a7c487892ed8cc23e7cfb8
Co-authored-by: Patrick von Platen <[email protected]>
config.json (+1 -1)
@@ -6,7 +6,7 @@
     "OPTForCausalLM"
   ],
   "attention_dropout": 0.0,
-  "bos_token_id": 0,
+  "bos_token_id": 2,
   "do_layer_norm_before": true,
   "dropout": 0.1,
   "eos_token_id": 2,
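For anyone carrying this fix to a locally downloaded copy of the config, the same edit can be sketched as a small stdlib-only Python patch. The function name and the idea of patching a local `config.json` path are illustrative, not part of the commit; the value `2` comes from the commit message (BOS must equal EOS for OPT, and `eos_token_id` is `2`).

```python
import json

def fix_bos_token(config_path: str) -> dict:
    """Set bos_token_id to 2 so it matches OPT's eos_token_id.

    config_path is a hypothetical path to a locally downloaded
    OPT config.json (illustrative helper, not from the commit).
    """
    with open(config_path) as f:
        config = json.load(f)
    # The commit changes bos_token_id from 0 to 2,
    # matching eos_token_id for OPT.
    config["bos_token_id"] = 2
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)
    return config
```

Note this only patches a local file; checkpoints pulled fresh from the Hub after this commit already contain the corrected value.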