Felladrin / gguf-sharded-zephyr-220m-dpo-full

Tags: GGUF · Inference Endpoints · conversational
License: apache-2.0
Branch: main · 1 contributor · History: 3 commits

Latest commit by Felladrin: "Add sharded GGUF version of `zephyr-220m-dpo-full.Q8_0.gguf`" (f2725df, verified, 9 months ago)
| File | Safe | Size | LFS | Last commit | Updated |
| --- | --- | --- | --- | --- | --- |
| .gitattributes | Safe | 2.14 kB | — | Add sharded GGUF version of `zephyr-220m-dpo-full.Q8_0.gguf` | 9 months ago |
| README.md | Safe | 199 Bytes | — | Create README.md | 9 months ago |
| zephyr-220m-dpo-full.Q8_0.shard-00001-of-00007.gguf | Safe | 35.7 MB | LFS | Add sharded GGUF version of `zephyr-220m-dpo-full.Q8_0.gguf` | 9 months ago |
| zephyr-220m-dpo-full.Q8_0.shard-00002-of-00007.gguf | Safe | 35 MB | LFS | Add sharded GGUF version of `zephyr-220m-dpo-full.Q8_0.gguf` | 9 months ago |
| zephyr-220m-dpo-full.Q8_0.shard-00003-of-00007.gguf | Safe | 32.3 MB | LFS | Add sharded GGUF version of `zephyr-220m-dpo-full.Q8_0.gguf` | 9 months ago |
| zephyr-220m-dpo-full.Q8_0.shard-00004-of-00007.gguf | Safe | 32.3 MB | LFS | Add sharded GGUF version of `zephyr-220m-dpo-full.Q8_0.gguf` | 9 months ago |
| zephyr-220m-dpo-full.Q8_0.shard-00005-of-00007.gguf | Safe | 32.3 MB | LFS | Add sharded GGUF version of `zephyr-220m-dpo-full.Q8_0.gguf` | 9 months ago |
| zephyr-220m-dpo-full.Q8_0.shard-00006-of-00007.gguf | Safe | 32.3 MB | LFS | Add sharded GGUF version of `zephyr-220m-dpo-full.Q8_0.gguf` | 9 months ago |
| zephyr-220m-dpo-full.Q8_0.shard-00007-of-00007.gguf | Safe | 32.3 MB | LFS | Add sharded GGUF version of `zephyr-220m-dpo-full.Q8_0.gguf` | 9 months ago |
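
The seven shard files together make up the single `zephyr-220m-dpo-full.Q8_0.gguf` quantized model. Below is a minimal sketch of how one might fetch the repository and run the model locally with `huggingface_hub` and `llama-cpp-python`; it assumes the shards were produced with llama.cpp's gguf-split tool, so that pointing the loader at the first shard lets it resolve the remaining `*-of-00007.gguf` files, and the exact split-loading behavior may vary with the llama.cpp version.

```python
# Minimal sketch: download the sharded GGUF repo and load it with llama-cpp-python.
# Assumption: the shards are llama.cpp gguf-split files, so loading the first
# shard is enough for the remaining shards in the same directory to be picked up.
# Requires: pip install huggingface_hub llama-cpp-python
from huggingface_hub import snapshot_download
from llama_cpp import Llama

# Download every file in the repository (the seven shards plus README and .gitattributes).
local_dir = snapshot_download(repo_id="Felladrin/gguf-sharded-zephyr-220m-dpo-full")

# Point llama.cpp at the first shard; the other *-of-00007.gguf files sit alongside it.
llm = Llama(
    model_path=f"{local_dir}/zephyr-220m-dpo-full.Q8_0.shard-00001-of-00007.gguf",
    n_ctx=2048,
)

# Quick conversational smoke test using the chat-completion interface.
output = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello! Who are you?"}],
    max_tokens=64,
)
print(output["choices"][0]["message"]["content"])
```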