# Llama-2-13b SuperCOT LoRA checkpoints
These are my Llama-2-13b SuperCOT LoRA checkpoints, trained with QLoRA on the SuperCOT dataset.
## Architecture
- Model Architecture: Llama-2-13b
- Training Algorithm: QLoRA
## Training Details
- Dataset: SuperCOT Dataset
- Dataset type: alpaca (see the prompt sketch after this list)
- Training Parameters: See Here
- Training Environment: Axolotl
- sequence_len: 4096
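For reference, the alpaca dataset type means each example is rendered with the standard Alpaca prompt template before tokenization. Below is a minimal sketch of that template; the function name is illustrative and not part of Axolotl's API:

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Render one training example in the standard Alpaca prompt format."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(build_alpaca_prompt("Name three primary colors."))
```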
## Acknowledgments
Special thanks to the creators of the datasets included in SuperCOT, and to Kaiokendev for curating the SuperCOT dataset. Thanks also to the contributors of Axolotl.
The following was auto-generated by Axolotl:
library_name: peft
## Training procedure
The following bitsandbytes quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
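Expressed as code, this corresponds to a transformers BitsAndBytesConfig along these lines (a sketch, assuming recent transformers and bitsandbytes versions; the base model identifier is an assumption):

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# 4-bit NF4 quantization with double quantization and bfloat16 compute,
# matching the config listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# "meta-llama/Llama-2-13b-hf" is the assumed base model repo id.
base_model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-13b-hf",
    quantization_config=bnb_config,
    device_map="auto",
)
```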
## Framework versions
- PEFT 0.5.0.dev0
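To use these checkpoints, the LoRA adapter can be attached to the quantized base model with PEFT. A minimal sketch, continuing from the snippet above; the adapter path is a placeholder for this repo's checkpoint directory:

```python
from peft import PeftModel

# Attach the LoRA adapter to the 4-bit base model loaded earlier.
# "path/to/supercot-lora-checkpoint" is a placeholder, not a real path.
model = PeftModel.from_pretrained(base_model, "path/to/supercot-lora-checkpoint")
model.eval()
```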