Quantization made by Richard Erkhov.
DOCTOR - bnb 4bits
- Model creator: https://huggingface.co./DLI-Lab/
- Original model: https://huggingface.co./DLI-Lab/DOCTOR/
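A minimal loading sketch with `transformers` and `bitsandbytes`, assuming the checkpoint exposes a causal-LM head; the repo id below is a placeholder for this quantized card, not confirmed by it, and should be replaced with the actual id:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Placeholder: substitute the actual repo id of this 4-bit quantized card.
repo_id = "RichardErkhov/DLI-Lab_-_DOCTOR-4bits"

# 4-bit weights via bitsandbytes; compute in fp16 to keep generation fast.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers on the available GPU(s)
)
```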
Original model description:
license: apache-2.0
datasets:
- DLI-Lab/DONUT
widget:
- text: "A: Hi, Viggo. How are you doing today?\nB: Hey, Yovani. I'm doing all right. Thanks for asking.\nA: No problem. I saw that you left your coffee mug on the counter this morning. Did you forget to take it with you?\nB: Yeah, I did. Thanks for grabbing it for me.\nA: No problem at all. I know how busy you are and I didn't want you to have to come back for it later.\nB: You're a lifesaver, Yovani. Seriously, thank you so much."
  example_title: "example 1"
DOCTOR is a dialogue commonsense reasoner that generates chain-of-thought knowledge in a multi-hop manner, given a dialogue history. It is trained on DONUT, which is also available on the Hugging Face Hub.
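The widget example in the metadata shows the expected input format: a dialogue history given as newline-separated `A:`/`B:` turns, which the model continues with multi-hop chain-of-thought knowledge. The sketch below is only an illustration under the same assumptions as the loading example above; the repo id is a placeholder, and the exact prompt template DOCTOR expects may differ, so check the repository for the canonical format.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

repo_id = "RichardErkhov/DLI-Lab_-_DOCTOR-4bits"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    quantization_config=BitsAndBytesConfig(
        load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16
    ),
    device_map="auto",
)

# Dialogue history in the A:/B: turn format shown in the widget example.
dialogue = (
    "A: Hi, Viggo. How are you doing today?\n"
    "B: Hey, Yovani. I'm doing all right. Thanks for asking.\n"
    "A: I saw that you left your coffee mug on the counter this morning."
)

inputs = tokenizer(dialogue, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Decode only the newly generated tokens: the chain-of-thought rationale.
rationale = tokenizer.decode(
    out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(rationale)
```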
Links for Reference
- Demo: https://dialoguecot.web.app/
- Repository: https://github.com/kyle8581/DialogueCoT
- Paper: https://arxiv.org/abs/2310.09343
- Point of Contact: [email protected]
For more details, please refer to our paper, Dialogue Chain-of-Thought Distillation for Commonsense-aware Conversational Agents. If you find this model helpful, please consider citing our paper!
BibTeX:
@misc{chae2023dialogue,
title={Dialogue Chain-of-Thought Distillation for Commonsense-aware Conversational Agents},
author={Hyungjoo Chae and Yongho Song and Kai Tzu-iunn Ong and Taeyoon Kwon and Minjin Kim and Youngjae Yu and Dongha Lee and Dongyeop Kang and Jinyoung Yeo},
year={2023},
eprint={2310.09343},
archivePrefix={arXiv},
primaryClass={cs.CL}
}