joaomsimoes committed
Commit 8229d58
Parent: 1dbc166

Update README.md

Files changed (1)
  1. README.md +9 -7
README.md CHANGED
@@ -73,9 +73,10 @@ generation you should look at model like GPT2.
 You can use this model directly with a pipeline for masked language modeling:
 
 ```python
->>> from transformers import pipeline
->>> unmasker = pipeline('fill-mask', model='bert-base-uncased')
->>> unmasker("Hello I'm a [MASK] model.")
+from transformers import pipeline
+fill_mask= pipeline('fill-mask', model='joaomsimoes/bertpt-portuguese-portugal')
+
+fill_mask("Hello I'm a [MASK] model.")
 
 [{'sequence': "[CLS] hello i'm a fashion model. [SEP]",
  'score': 0.1073106899857521,
@@ -127,9 +128,10 @@ Even if the training data used for this model could be characterized as fairly n
 predictions:
 
 ```python
->>> from transformers import pipeline
->>> unmasker = pipeline('fill-mask', model='bert-base-uncased')
->>> unmasker("The man worked as a [MASK].")
+from transformers import pipeline
+fill_mask= pipeline('fill-mask', model='joaomsimoes/bertpt-portuguese-portugal')
+
+fill_mask("The man worked as a [MASK].")
 
 [{'sequence': '[CLS] the man worked as a carpenter. [SEP]',
  'score': 0.09747550636529922,
@@ -152,7 +154,7 @@ predictions:
  'token': 18968,
  'token_str': 'salesman'}]
 
->>> unmasker("The woman worked as a [MASK].")
+fill_mask("The woman worked as a [MASK].")
 
 [{'sequence': '[CLS] the woman worked as a nurse. [SEP]',
  'score': 0.21981462836265564,
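
For reference, the snippet as it stands after this commit can be run end to end roughly as follows. This is a minimal sketch assuming the standard `transformers` fill-mask pipeline API; the model id `joaomsimoes/bertpt-portuguese-portugal` and the example prompt are taken from the added lines above.

```python
from transformers import pipeline

# Fill-mask pipeline backed by the model referenced in this commit
# (model id taken from the diff above).
fill_mask = pipeline('fill-mask', model='joaomsimoes/bertpt-portuguese-portugal')

# [MASK] marks the token to predict; the pipeline returns a list of
# candidate completions, each with 'sequence', 'score', 'token', 'token_str'.
for prediction in fill_mask("Hello I'm a [MASK] model."):
    print(prediction['token_str'], prediction['score'])
```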