Fill-Mask
Transformers
PyTorch
xlm-roberta
Inference Endpoints
nljubesi committed on
Commit da955ed
1 Parent(s): 22d7b25

Update README.md

Files changed (1)
  1. README.md +8 -1
README.md CHANGED
@@ -106,4 +106,11 @@ The procedure is explained in greater detail in the dedicated [benchmarking repo

 The following paper has been submitted for review:

- Nikola Ljubešić, Vit Suchomel, Peter Rupnik, Taja Kuzman, Rik van Noord. Language Models on a Diet: Cost-Efficient Development of Encoders for Closely-Related Languages via Additional Pretraining. 2024. Submitted.
+ ```
+ @misc{ljubesic2024language,
+ author = "Nikola Ljube\v{s}i\'{c}, Vit Suchomel, Peter Rupnik, Taja Kuzman, Rik van Noord",
+ title = "Language Models on a Diet: Cost-Efficient Development of Encoders for Closely-Related Languages via Additional Pretraining",
+ howpublished = "Submitted for review",
+ year = "2024",
+ }
+ ```
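
For readers arriving at this commit page, the tags at the top (Fill-Mask, Transformers, PyTorch, xlm-roberta) indicate that the repository hosts an XLM-RoBERTa masked-language-model checkpoint. Below is a minimal usage sketch with the standard Transformers fill-mask pipeline; the model identifier `your-org/your-xlm-r-model` is a placeholder, not this repository's actual name.

```python
# Minimal sketch: querying a fill-mask XLM-RoBERTa checkpoint via the
# Transformers pipeline API. The model id below is a placeholder.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="your-org/your-xlm-r-model")

# XLM-RoBERTa tokenizers use "<mask>" as the mask token.
# Example sentence: "The capital of Croatia is <mask>."
for prediction in fill_mask("Glavni grad Hrvatske je <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```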