GoldenLlama/krx_qwen2.5_7b_it_v2 (Hugging Face model repository)

Likes: 0
Task: Text Generation
Tags: PyTorch · Korean · qwen2 · krx · unsloth · trl · sft · conversational
Dataset: amphora/krx-sample-instructions
License: apache-2.0
Files and versions
Latest commit: 0aea6c7 (verified) by GoldenLlama -- "Trained with Unsloth" -- about 19 hours ago
1 contributor · History: 5 commits
File                                 Size        LFS   Last commit            Updated
.gitattributes                       1.52 kB     -     initial commit         about 20 hours ago
README.md                            332 Bytes   -     Trained with Unsloth   about 19 hours ago
added_tokens.json                    632 Bytes   -     Upload tokenizer       about 20 hours ago
config.json                          803 Bytes   -     Trained with Unsloth   about 19 hours ago
generation_config.json               266 Bytes   -     Trained with Unsloth   about 19 hours ago
merges.txt                           1.67 MB     -     Upload tokenizer       about 20 hours ago
pytorch_model-00001-of-00004.bin *   4.88 GB     LFS   Trained with Unsloth   about 19 hours ago
pytorch_model-00002-of-00004.bin *   4.93 GB     LFS   Trained with Unsloth   about 19 hours ago
pytorch_model-00003-of-00004.bin *   4.33 GB     LFS   Trained with Unsloth   about 19 hours ago
pytorch_model-00004-of-00004.bin *   1.09 GB     LFS   Trained with Unsloth   about 19 hours ago
pytorch_model.bin.index.json         27.8 kB     -     Trained with Unsloth   about 19 hours ago
special_tokens_map.json              613 Bytes   -     Upload tokenizer       about 20 hours ago
tokenizer.json                       7.03 MB     -     Upload tokenizer       about 20 hours ago
tokenizer_config.json                7.51 kB     -     Upload tokenizer       about 20 hours ago
vocab.json                           2.78 MB     -     Upload tokenizer       about 20 hours ago

All files are marked "Safe" by the Hub's security scanner. The four
pytorch_model-*.bin shards (*) are pickle files; the scanner detected three
pickle imports in each: torch._utils._rebuild_tensor_v2,
collections.OrderedDict, and torch.HalfStorage.