Update README.md
README.md (CHANGED)
@@ -36,7 +36,9 @@ More information needed
 
 ### How to use
 Here is how to use this model:
+
 ``` python
+
 from transformers import AutoTokenizer
 from transformers import AutoModelForCausalLM
 import torch
@@ -86,11 +88,14 @@ def translate_es_inclusivo(exclusive_text):
 input_text = 'Los alumnos atienden a sus profesores'
 
 print(translate_es_inclusivo(input_text))
+
 ```
 
 
 As it is a heavy model, you may want to use it in 4-bits:
+
 ``` python
+
 from transformers import AutoTokenizer
 from transformers import AutoModelForCausalLM
 from transformers import BitsAndBytesConfig
@@ -153,6 +158,7 @@ def translate_es_inclusivo(exclusive_text):
 input_text = 'Los alumnos atienden a sus profesores'
 
 print(translate_es_inclusivo(input_text))
+
 ```
 
 ## Training and evaluation data
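The diff elides the model-loading code and the body of `translate_es_inclusivo`, but the 4-bit path implied by the `BitsAndBytesConfig` import can be sketched as below. This is a minimal sketch, not the model card's own code: the checkpoint id is a placeholder, and the NF4/bfloat16 settings are assumptions, not values taken from the README.

``` python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Placeholder: substitute this repository's actual model id.
MODEL_ID = "your-org/your-es-inclusivo-model"

# 4-bit NF4 quantization stores weights at roughly a quarter of their
# fp16 footprint while doing the matmuls in bfloat16.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",  # place layers across available GPUs / CPU
)
```

Loading this way requires the `bitsandbytes` package. Only the loading step differs from the full-precision example, so the same `translate_es_inclusivo` helper should work unchanged on the quantized model.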