sundanese-roberta-base / flax_to_torch.py
from transformers import RobertaForMaskedLM, AutoTokenizer

# Load the Flax checkpoint from the current directory and convert it to PyTorch.
model = RobertaForMaskedLM.from_pretrained("./", from_flax=True)
# Write the converted PyTorch weights next to the existing Flax files.
model.save_pretrained("./")

# Re-save the tokenizer so the directory holds a complete PyTorch checkpoint.
tokenizer = AutoTokenizer.from_pretrained("./")
tokenizer.save_pretrained("./")
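
# A minimal sanity-check sketch, assuming the converted files now sit in "./":
# reload the checkpoint as plain PyTorch (no from_flax) and run one forward pass.
import torch

pt_model = RobertaForMaskedLM.from_pretrained("./")
inputs = tokenizer("sample text", return_tensors="pt")  # any short text will do
with torch.no_grad():
    outputs = pt_model(**inputs)
print(outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)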