
# Indonesian GPT-2 fine-tuned on Indonesian poems

This is the Indonesian gpt2-small model fine-tuned on Indonesian poems. The dataset can be found here. All training was done in a Google Colab Jupyter Notebook (to be published soon).

The dataset is split into two subsets as follows:

| split      | count (examples) | percentage |
|------------|------------------|------------|
| train      | 7,358            | 80%        |
| validation | 1,890            | 20%        |
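The stated 80/20 split can be checked against the example counts (the exact ratio works out to roughly 79.6% / 20.4%):

```python
# Verify the train/validation split ratio from the counts in the table above.
train_count = 7358
valid_count = 1890
total = train_count + valid_count

train_pct = 100 * train_count / total
valid_pct = 100 * valid_count / total

print(f"train: {train_pct:.1f}%, validation: {valid_pct:.1f}%")
```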

## Evaluation results

The model evaluation results after 10 epochs are as follows:

| dataset | train/loss | eval/loss | eval perplexity |
|---------|------------|-----------|-----------------|
| id puisi | 3.324700  | 3.502665  | 33.20           |
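The reported perplexity is consistent with the evaluation loss: for a language model evaluated with cross-entropy loss, perplexity is simply the exponential of that loss. A quick check:

```python
import math

# eval/loss reported in the table above
eval_loss = 3.502665

# Perplexity of a language model is exp(cross-entropy loss)
perplexity = math.exp(eval_loss)

print(f"{perplexity:.2f}")  # matches the reported eval perplexity of 33.20
```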

The training logs can be found on the wandb page here or on TensorBoard here.
