Tags: Fill-Mask · Transformers · PyTorch · xlm-roberta · Inference Endpoints
Commit 368133c (parent: 9a1c8c7) by fenchri: Update README.md

Files changed (1): README.md (+3 -1)
README.md CHANGED
```diff
@@ -100,7 +100,9 @@ We randomly choose 100 sentences from each language to serve as a validation set
 ## Usage
 
 The current model can be used for further fine-tuning on downstream tasks.
-In the paper, we focused on entity-related tasks, such as NER, Word Sense Disambiguation, Fact Retrieval and Slot Filling.
+In the paper, we focused on entity-related tasks, such as NER, Word Sense Disambiguation and Slot Filling.
+
+Alternatively, it can be used directly (no fine-tuning) for probing tasks, i.e. predict missing words, such as [X-FACTR](https://aclanthology.org/2020.emnlp-main.479/).
 
 ## How to Get Started with the Model
 
```
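The added paragraph says the model can be used directly, without fine-tuning, to predict masked words. A minimal sketch of that usage with the `transformers` fill-mask pipeline is below; the model id `xlm-roberta-base` is a stand-in (the base architecture named in the card's tags), so substitute this repo's own checkpoint id to probe the entity-aware weights.

```python
# Minimal fill-mask sketch. "xlm-roberta-base" is a placeholder for the
# base architecture; swap in this repo's checkpoint id for the actual model.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="xlm-roberta-base")

# XLM-R tokenizers use "<mask>" as the mask token.
predictions = fill_mask("Paris is the capital of <mask>.")

# Each prediction is a dict with the filled token and its score.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

The pipeline returns the top candidates for the masked position ranked by probability, which is exactly the setup probing benchmarks such as X-FACTR rely on.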