Model description:
DistilBERT is created with knowledge distillation during the pre-training phase, which reduces the size of a BERT model by 40% while retaining 97% of its language-understanding capabilities. It is smaller and faster than BERT.
Distilbert-base-uncased fine-tuned on the fake news dataset with the following hyperparameters:
learning rate 5e-5,
batch size 32,
num_train_epochs 2
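A minimal fine-tuning sketch using the Hugging Face `transformers` Trainer with the hyperparameters above. This is illustrative, not the repository's exact script: the label mapping, the `train()` helper, and the output directory are assumptions, and the caller is expected to supply texts and labels loaded from the dataset's CSV files.

```python
# Hyperparameters stated in the model card above.
MODEL_NAME = "distilbert-base-uncased"
LEARNING_RATE = 5e-5
BATCH_SIZE = 32
NUM_TRAIN_EPOCHS = 2


def train(train_texts, train_labels):
    """Fine-tune DistilBERT for binary fake/real news classification.

    Imports are kept inside the function so the sketch can be read
    (and the constants reused) without transformers/torch installed.
    """
    from datasets import Dataset
    from transformers import (
        AutoModelForSequenceClassification,
        AutoTokenizer,
        Trainer,
        TrainingArguments,
    )

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSequenceClassification.from_pretrained(
        MODEL_NAME,
        num_labels=2,  # assumed mapping: 0 = fake, 1 = real
    )

    # Tokenize the raw texts into model inputs.
    ds = Dataset.from_dict({"text": train_texts, "label": train_labels})
    ds = ds.map(
        lambda batch: tokenizer(batch["text"], truncation=True),
        batched=True,
    )

    args = TrainingArguments(
        output_dir="distilbert-fakenews",  # illustrative path
        learning_rate=LEARNING_RATE,
        per_device_train_batch_size=BATCH_SIZE,
        num_train_epochs=NUM_TRAIN_EPOCHS,
    )
    Trainer(model=model, args=args, train_dataset=ds).train()
    return model, tokenizer
```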
Full code available @ DistilBert-FakeNews
Dataset available @ Fake News dataset (https://www.kaggle.com/datasets/clmentbisaillon/fake-and-real-news-dataset)