Bert2Bert (Encoder-Decoder) on Liputan6 100k dataset

Dataset source: https://huggingface.co./datasets/fajrikoto/id_liputan6

Base model used for fine-tuning (as both encoder and decoder): https://huggingface.co./indolem/indobert-base-uncased

Trained on 1x RTX 3090 for 8 epochs (with an EarlyStopping callback).

Training logs, metrics, and hyperparameters:
- https://wandb.ai/willy030125/huggingface/runs/2qk3jtic
- https://www.comet.com/willy030125/huggingface/560ed6ccde1240c8b4401918fd27253a

Eval results and perplexity: see eval_results.json
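The perplexity reported alongside the eval results is just the exponential of the evaluation cross-entropy loss. A minimal sketch of that relationship (the loss value below is illustrative, not the model's actual score):

```python
import math

# Perplexity is derived from the eval cross-entropy loss: ppl = exp(loss).
# The loss value here is a made-up placeholder for illustration.
eval_loss = 2.0
perplexity = math.exp(eval_loss)
print(round(perplexity, 4))  # 7.3891
```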

Usage:

```python
from transformers import AutoTokenizer, EncoderDecoderModel

tokenizer = AutoTokenizer.from_pretrained("Willy030125/Bert2Bert_Liputan6_100k_10epoch_IndoBERT")
model = EncoderDecoderModel.from_pretrained("Willy030125/Bert2Bert_Liputan6_100k_10epoch_IndoBERT")
```
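Once loaded, a summary can be produced with `model.generate`. A minimal sketch, assuming the checkpoint above is available locally or downloadable; the article text and generation parameters (beam count, length limits) are illustrative choices, not values taken from the training setup:

```python
from transformers import AutoTokenizer, EncoderDecoderModel

model_id = "Willy030125/Bert2Bert_Liputan6_100k_10epoch_IndoBERT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = EncoderDecoderModel.from_pretrained(model_id)

# An Indonesian news paragraph (illustrative input, not from the dataset).
article = (
    "Jakarta - Sejumlah ruas jalan di ibu kota terendam banjir "
    "setelah hujan deras mengguyur sejak dini hari."
)

# Tokenize, then decode the beam-search output back to text.
inputs = tokenizer(article, truncation=True, max_length=512, return_tensors="pt")
summary_ids = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    max_length=128,
    num_beams=4,  # illustrative beam-search setting
)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```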