What exactly is SuperCOT-LoRA
#2
opened by FarziBuilder
Hey folks! A noob here.
Is this a dataset, a fine-tuned model, or a set of adapter weights?
What is this?
Yes.
It is a parameter-efficient fine-tuning of LLaMA using low-rank adapters (LoRA), trained on the SuperCOT dataset. For usage, refer to the model card: it is primarily designed to make LLaMA work better with LangChain by fine-tuning on chain-of-thought and code-interpretation data.
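If the "low-rank adapters" part is the confusing bit, here is a minimal numerical sketch of the idea (illustrative only, not the actual SuperCOT training code): a LoRA adapter leaves the pretrained weight matrix frozen and adds a trainable low-rank correction B·A on top of it, which is why the release can ship just the small adapter weights instead of a full model.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 16  # r << d_in: the low-rank bottleneck

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-init

def lora_forward(x):
    # base path plus scaled low-rank correction: (W + (alpha/r) * B @ A) @ x
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)

# With B zero-initialized, the adapter starts out as a no-op on the base model.
assert np.allclose(lora_forward(x), W @ x)

# Only A and B are trained, which is far fewer parameters than the full matrix.
print(A.size + B.size, "adapter params vs", W.size, "full params")
```

At LLaMA scale the savings are dramatic: the adapter matrices are a tiny fraction of the base weights, which is why LoRA files are a few hundred megabytes rather than tens of gigabytes.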