Update README.md

README.md CHANGED

@@ -36,7 +36,7 @@ More information needed
 
 ### How to use
 Here is how to use this model:
-
+``` python
 from transformers import AutoTokenizer
 from transformers import AutoModelForCausalLM
 import torch
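Taken on its own, the hunk above only shows the imports that the new fence wraps. Below is a self-contained sketch of how those pieces are typically wired up; the prompt wording, generation settings, and the extra tokenizer/model parameters are assumptions for illustration — only the `translate_es_inclusivo` name and the example sentence appear in this diff, and the checkpoint id is elided here.

```python
def build_prompt(exclusive_text: str) -> str:
    # Assumed instruction wording; the README's real prompt is not visible in this diff.
    return f"Reescribe el siguiente texto en español inclusivo: {exclusive_text}\n"


def load_model(model_id: str):
    # Imported lazily so the pure helper above works without the heavy deps installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.float16, device_map="auto"
    )
    return tokenizer, model


def translate_es_inclusivo(exclusive_text: str, tokenizer, model) -> str:
    # The README's version takes only the text; tokenizer and model are passed
    # in here to keep the sketch self-contained.
    inputs = tokenizer(build_prompt(exclusive_text), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


# Usage (needs the real checkpoint, whose id is elided in this excerpt):
# tokenizer, model = load_model("<model-id>")
# print(translate_es_inclusivo("Los alumnos atienden a sus profesores", tokenizer, model))
```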
@@ -86,11 +86,11 @@ def translate_es_inclusivo(exclusive_text):
 
 input_text = 'Los alumnos atienden a sus profesores'
 
 print(translate_es_inclusivo(input_text))
-
+```
 
 
 As it is a heavy model, you may want to use it in 4-bit:
-
+``` python
 from transformers import AutoTokenizer
 from transformers import AutoModelForCausalLM
 from transformers import BitsAndBytesConfig
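The 4-bit path this hunk fences in rests on `BitsAndBytesConfig`. A minimal sketch of the quantized loading step follows, assuming NF4 storage with float16 compute — the README's actual settings are not visible in this excerpt.

```python
def load_model_4bit(model_id: str):
    # Imported lazily so this sketch parses without the heavy deps installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    quant_config = BitsAndBytesConfig(
        load_in_4bit=True,                     # store weights in 4-bit precision
        bnb_4bit_quant_type="nf4",             # NF4 quantization (assumed choice)
        bnb_4bit_compute_dtype=torch.float16,  # dtype the matmuls are up-cast to
    )
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        quantization_config=quant_config,
        device_map="auto",
    )
    return tokenizer, model
```

Storing the weights in 4-bit roughly quarters their memory footprint relative to float16, which is why the README suggests it for this heavy model.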
@@ -153,7 +153,7 @@ def translate_es_inclusivo(exclusive_text):
 
 input_text = 'Los alumnos atienden a sus profesores'
 
 print(translate_es_inclusivo(input_text))
-
+```
 
 ## Training and evaluation data
 