Finnish-NLP/roberta-large-finnish

Task: Fill-Mask
Frameworks: Transformers, PyTorch, JAX, TensorBoard
Datasets: Finnish-NLP/mc4_fi_cleaned, wikipedia
Language: Finnish
Tags: roberta, finnish, Inference Endpoints
Paper: arXiv:1907.11692
License: apache-2.0
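Since the repository is tagged for the Fill-Mask task with Transformers support, a minimal usage sketch with the `transformers` fill-mask pipeline would look like the following. The example Finnish sentence is illustrative only; loading the model downloads roughly 1.4 GB of weights on first use.

```python
# Minimal sketch: masked-token prediction with this repository's model.
# Assumes the transformers library is installed and the model uses the
# standard RoBERTa <mask> token (as its tokenizer files suggest).
from transformers import pipeline

unmasker = pipeline("fill-mask", model="Finnish-NLP/roberta-large-finnish")

# Example Finnish input ("Helsinki is the <mask> of Finland.") -- hypothetical.
predictions = unmasker("Helsinki on Suomen <mask>.")

# Each prediction is a dict with the filled token and a confidence score.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

The pipeline returns the top candidate tokens for the masked position, ranked by score; the same checkpoint can also be loaded directly via `AutoModelForMaskedLM` for fine-tuning.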
roberta-large-finnish · 2 contributors · History: 126 commits
Latest commit c21807e: "Update README.md" by aapot, almost 3 years ago
| File | Size | Last commit message | Last updated |
|---|---|---|---|
| .gitattributes | 737 Bytes | Add tokenizer | about 3 years ago |
| README.md | 8.23 kB | Update README.md | almost 3 years ago |
| config.json | 703 Bytes | Add pytorch model | about 3 years ago |
| events.out.tfevents.1629959691.t1v-n-1ae8dadb-w-0.364831.0.v2 | 37.8 MB (LFS) | Saving weights and logs of epoch 0 | about 3 years ago |
| events.out.tfevents.1630151615.t1v-n-1ae8dadb-w-0.8890.0.v2 | 75.9 MB (LFS) | Saving weights and logs of epoch 0 | about 3 years ago |
| events.out.tfevents.1630324517.t1v-n-1ae8dadb-w-0.551349.0.v2 | 40 Bytes (LFS) | Saving weights and logs of step 10000 | about 3 years ago |
| events.out.tfevents.1630325064.t1v-n-1ae8dadb-w-0.554071.0.v2 | 37.8 MB (LFS) | Saving weights and logs of epoch 0 | about 3 years ago |
| flax_model.msgpack | 1.42 GB (LFS) | Saving weights and logs of epoch 0 | about 3 years ago |
| flax_model_to_pytorch.py | 779 Bytes | Add pytorch model | about 3 years ago |
| merges.txt | 519 kB | Saving weights and logs of step 10000 | about 3 years ago |
| pytorch_model.bin | 1.42 GB (LFS) | Add latest pytorch model | about 3 years ago |
| run_mlm_flax.py | 33.1 kB | Saving weights and logs of step 10000 | about 3 years ago |
| special_tokens_map.json | 239 Bytes | Saving weights and logs of step 10000 | about 3 years ago |
| start_train.sh | 903 Bytes | Saving weights and logs of step 10000 | about 3 years ago |
| tokenizer.json | 1.48 MB | Saving weights and logs of step 10000 | about 3 years ago |
| tokenizer_config.json | 292 Bytes | Saving weights and logs of step 10000 | about 3 years ago |
| train_tokenizer.py | 995 Bytes | Add tokenizer | about 3 years ago |
| vocab.json | 861 kB | Saving weights and logs of step 10000 | about 3 years ago |

Note: pytorch_model.bin is a pickle file. Detected pickle imports (4): torch._utils._rebuild_tensor_v2, collections.OrderedDict, torch.FloatStorage, torch.LongStorage.