---
base_model: distilbert/distilbert-base-uncased
datasets:
- stanford-nlp/snli
language: en
library_name: transformers
license: apache-2.0
metrics:
- accuracy
pipeline_tag: text-classification
datasets_description:
- SNLI
model-index:
- name: distilbert-base-uncased-snli
  results:
  - task:
      type: natural-language-inference
    dataset:
      name: SNLI
      type: stanford-nlp/snli
    metrics:
    - type: accuracy
      value: 0.8979
---
# Model Card for distilbert-base-uncased-snli
## Model Details
### Model Description
A fine-tuned version of distilbert/distilbert-base-uncased trained on the stanford-nlp/snli dataset.
- Developed by: Karl Weinmeister
- Language(s) (NLP): en
- License: apache-2.0
- Finetuned from model: distilbert/distilbert-base-uncased
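A minimal inference sketch for a model like this one. Since the card doesn't state the fine-tuned repository id, the snippet below loads the base checkpoint with a freshly initialized 3-way classification head purely to illustrate the input format; substitute the fine-tuned model id to get meaningful predictions. The label order shown is the conventional SNLI ordering and is an assumption here — check the model's `id2label` mapping.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder: the base checkpoint, since the fine-tuned repo id isn't given
# in the card. Swap in the fine-tuned model id for real predictions.
model_id = "distilbert/distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=3)
model.eval()

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# NLI inputs are encoded as a single premise/hypothesis sequence pair.
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Assumed SNLI label order; verify against the model's config.id2label.
labels = ["entailment", "neutral", "contradiction"]
prediction = labels[logits.argmax(dim=-1).item()]
```

With the actual fine-tuned weights loaded, `prediction` for this pair should be one of the three SNLI classes; with the untrained head above it is arbitrary.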
## Training Hyperparameters
- Training regime: The model was trained for 5 epochs with batch size 128.