# NaturalRoBERTa

This is a pre-trained RoBERTa-type language model. NaturalRoBERTa was built on a dataset drawn from open sources: three news sub-corpora of Taiga (Lenta.ru, Interfax, N+1) and Russian Wikipedia texts.
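For reference, a minimal fill-mask sketch. It assumes the checkpoint ships a standard RoBERTa tokenizer and masked-LM head loadable through the `transformers` pipeline API; the example sentence is an illustration, not taken from the card:

```python
# Minimal fill-mask sketch; assumes the checkpoint exposes a standard
# RoBERTa masked-language-modeling head via the transformers API.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="tay-yozhik/NaturalRoBERTa")

# RoBERTa-style tokenizers use "<mask>" as the mask token.
# "Москва является столицей <mask>." = "Moscow is the capital of <mask>."
for prediction in fill_mask("Москва является столицей <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```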

## Evaluation

The model was evaluated on the RussianSuperGLUE benchmark tasks:

| Task | Result | Metrics |
|---------|---------------|----------------------------------|
| LiDiRus | 0.0 | Matthews Correlation Coefficient |
| RCB | 0.217 / 0.484 | F1 / Accuracy |
| PARus | 0.498 | Accuracy |
| TERRa | 0.487 | Accuracy |
| RUSSE | 0.587 | Accuracy |
| RWSD | 0.669 | Accuracy |
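
Below is a hedged sketch of scoring one of these tasks (TERRa, accuracy). The dataset id `RussianNLP/russian_super_glue` and the use of a fine-tuned classification head are assumptions, not stated in the card; as loaded here the classification head is freshly initialized, so the printed number is only meaningful after fine-tuning on the TERRa training split.

```python
# Sketch of computing TERRa validation accuracy. Assumptions not in the
# card: the RussianNLP/russian_super_glue dataset id and that published
# scores come from a task-specific fine-tuned classification head.
import torch
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "tay-yozhik/NaturalRoBERTa"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# This head is randomly initialized; fine-tune on the TERRa training
# split before the accuracy below is meaningful.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)
model.eval()

terra = load_dataset("RussianNLP/russian_super_glue", "terra", split="validation")

correct = 0
for ex in terra:
    inputs = tokenizer(ex["premise"], ex["hypothesis"],
                       truncation=True, return_tensors="pt")
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    correct += int(pred == ex["label"])

print(f"TERRa accuracy: {correct / len(terra):.3f}")
```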
