---
license: apache-2.0
datasets:
- databricks/databricks-dolly-15k
language:
- en
metrics:
- rouge
base_model:
- facebook/opt-6.7b
pipeline_tag: text-generation
---
# KD-OPT-6.7B
KD-OPT-6.7B is an OPT-6.7B model distilled from an OPT-13B teacher on databricks-dolly-15k with token-level forward KLD, i.e., standard word-level knowledge distillation. It serves as the KD baseline for MiniLLM.
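Concretely, token-level forward KLD trains the student to match the teacher's next-token distribution at every position of the response, minimizing KL(p_teacher ‖ q_student). Below is a minimal PyTorch sketch of this objective; it is an illustration, not the MiniLLM training code, and the tensor shapes and masking convention are assumptions.

```python
import torch
import torch.nn.functional as F

def token_level_forward_kld(teacher_logits: torch.Tensor,
                            student_logits: torch.Tensor,
                            mask: torch.Tensor) -> torch.Tensor:
    """Forward KLD KL(p_teacher || q_student), averaged over unmasked tokens.

    teacher_logits, student_logits: (batch, seq_len, vocab_size)
    mask: (batch, seq_len), 1.0 on response tokens, 0.0 on prompt/padding
    (shapes and masking are assumptions made for this sketch).
    """
    teacher_logp = F.log_softmax(teacher_logits, dim=-1)
    student_logp = F.log_softmax(student_logits, dim=-1)
    # Per-token KL divergence, summed over the vocabulary.
    kld = (teacher_logp.exp() * (teacher_logp - student_logp)).sum(dim=-1)
    # Average over response tokens only.
    return (kld * mask).sum() / mask.sum()
```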
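For inference, the model should load with the standard transformers causal-LM API. A minimal usage sketch follows; the hub id below is a guess from the model name, so substitute this repository's actual path.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MiniLLM/KD-OPT-6.7B"  # hypothetical id; use this repo's actual path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

prompt = "Explain knowledge distillation in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```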
## Other Baselines

The MiniLLM paper also compares against SFT without distillation and SeqKD (sequence-level knowledge distillation) at the same model sizes.
## Citation
```bibtex
@inproceedings{minillm,
  title={MiniLLM: Knowledge Distillation of Large Language Models},
  author={Gu, Yuxian and Dong, Li and Wei, Furu and Huang, Minlie},
  booktitle={Proceedings of ICLR},
  year={2024}
}
```