Update README.md
README.md
You need to install OpenNMT-py; instructions are here: https://github.com/OpenNMT/OpenNMT-py

Running inference:

```
Create a text input file with prompts (e.g. "Show me some attractions in Boston")
then run:
onmt_translate --config mistral-inference.yaml --src input.txt --output output.txt
```
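
The contents of `mistral-inference.yaml` are not shown here; as a rough sketch only (option names follow the usual OpenNMT-py translate options, and every path and value below is a placeholder assumption):

```
# hypothetical mistral-inference.yaml sketch
model: mistral-7b-onmt.pt            # converted OpenNMT-py checkpoint (placeholder path)
transforms: [sentencepiece]
src_subword_model: tokenizer.model   # Mistral sentencepiece tokenizer (placeholder path)
gpu: 0                               # device id, -1 for CPU
batch_size: 8
beam_size: 1                         # greedy decoding
max_length: 256                      # max tokens to generate per prompt
seed: 42
```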

Running MMLU evaluation:

```
If you git clone the OpenNMT-py repo then you can run:
python eval_llm/MMLU/run_mmlu_opennmt.py --config mistral-inference.yaml
For this use case, make sure you set max_length=1 in the config file
```
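
MMLU is a multiple-choice benchmark, so the model only needs to emit a single answer token; overriding the generation length in the same inference config is enough (sketch, assuming the config above):

```
# in mistral-inference.yaml, for MMLU only
max_length: 1    # generate just the answer token (A/B/C/D)
```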

Finetuning:

```
Read this tutorial: https://forum.opennmt.net/t/finetuning-llama-7b-13b-or-mosaicml-mpt-7b-reproduce-vicuna-alpaca/5272/56
onmt_train --config mistral-finetuning.yaml
```
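
The tutorial above finetunes a converted checkpoint with LoRA; as a hedged sketch of the kind of options `mistral-finetuning.yaml` might contain (names taken from the OpenNMT-py LoRA/quantization options, all values are placeholders to adapt from the tutorial, not the actual file from this repo):

```
# hypothetical mistral-finetuning.yaml sketch
train_from: mistral-7b-onmt.pt     # converted base checkpoint (placeholder path)
save_model: mistral-7b-finetuned
# LoRA adapters on the attention projections
lora_layers: [linear_values, linear_query, linear_keys, final_linear]
lora_rank: 8
lora_alpha: 16
lora_dropout: 0.05
# optional weight quantization to fit on a single GPU
quant_layers: [linear_values, linear_query, linear_keys, final_linear]
quant_type: bnb_NF4
train_steps: 1000
save_checkpoint_steps: 500
```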