Update README.md
README.md CHANGED
@@ -37,7 +37,7 @@ More information needed
 ### How to use
 Here is how to use this model:
 
-
+```python
 
 from transformers import AutoTokenizer
 from transformers import AutoModelForCausalLM
@@ -89,12 +89,12 @@ input_text = 'Los alumnos atienden a sus profesores'
 
 print(translate_es_inclusivo(input_text))
 
-
+```
 
 
 As it is a heavy model, you may want to use it in 4-bits:
 
-
+``` python
 
 from transformers import AutoTokenizer
 from transformers import AutoModelForCausalLM
@@ -159,7 +159,7 @@ input_text = 'Los alumnos atienden a sus profesores'
 
 print(translate_es_inclusivo(input_text))
 
-
+```
 
 ## Training and evaluation data
 
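For reference, the `python` fence added in the first hunk (and closed by the bare fence added in the second) wraps the README's basic usage example, whose body is elided by the diff context. A minimal sketch of such a usage block is given below; the checkpoint id and the body of translate_es_inclusivo are illustrative assumptions, not code taken from the README.

```python
from transformers import AutoTokenizer
from transformers import AutoModelForCausalLM

# Illustrative placeholder; the diff does not show the actual checkpoint id.
model_id = "path/to/es-inclusivo-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

def translate_es_inclusivo(text: str) -> str:
    # Hypothetical helper: the README defines its own version, not shown in the diff.
    # Prompt the causal LM to rewrite the sentence in inclusive Spanish and return
    # only the newly generated tokens.
    prompt = f"Reescribe en lenguaje inclusivo: {text}\n"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64)
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

input_text = 'Los alumnos atienden a sus profesores'
print(translate_es_inclusivo(input_text))
```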
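The second fence (opened as `python` in the second hunk, closed in the third) wraps the 4-bit loading variant introduced at line 95 ("As it is a heavy model, you may want to use it in 4-bits"). Its body is likewise elided; a common way to load a Transformers causal LM in 4-bit is through bitsandbytes and BitsAndBytesConfig, sketched below under that assumption (the checkpoint id is again a placeholder).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Illustrative placeholder checkpoint id, as above.
model_id = "path/to/es-inclusivo-model"

# 4-bit NF4 quantization with bfloat16 compute; requires the bitsandbytes package.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
```

Only the loading step changes in the 4-bit variant; the shared context lines in the second and third hunks suggest the rest of the example (the translate_es_inclusivo helper and the final print call) stays the same.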