RichardErkhov / ibm-granite_-_granite-20b-functioncalling-gguf

Tags: GGUF · Inference Endpoints · conversational
arxiv: 2407.00121
1 contributor · History: 16 commits
Latest commit: uploaded model by RichardErkhov · d69b0fd (verified) · about 2 months ago
File                                        Size       Flags       Commit message     Last modified
.gitattributes                              2.65 kB    Safe        uploaded model     about 2 months ago
granite-20b-functioncalling.IQ3_M.gguf      9.59 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.IQ3_S.gguf      8.93 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.IQ3_XS.gguf     8.66 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.IQ4_NL.gguf     11.7 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.IQ4_XS.gguf     11.1 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.Q2_K.gguf       7.93 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.Q3_K.gguf       10.6 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.Q3_K_L.gguf     11.7 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.Q3_K_M.gguf     10.6 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.Q3_K_S.gguf     8.93 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.Q4_0.gguf       11.6 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.Q4_1.gguf       12.8 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.Q4_K.gguf       12.8 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.Q4_K_M.gguf     12.8 GB    Safe, LFS   uploaded model     about 2 months ago
granite-20b-functioncalling.Q4_K_S.gguf     11.7 GB    Safe, LFS   uploaded model     about 2 months ago
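
The repository name suggests these are GGUF quantizations of ibm-granite/granite-20b-functioncalling (arxiv:2407.00121). Below is a minimal sketch of fetching one quant and running it locally, assuming the repo id matches the page title above, that llama-cpp-python is used as the runtime, and that the upstream Granite model card defines the actual function-calling prompt format (not shown here).

from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Assumed repo id, taken from the page title; verify before use.
REPO_ID = "RichardErkhov/ibm-granite_-_granite-20b-functioncalling-gguf"
# Q4_K_M (12.8 GB) chosen here as a common size/quality trade-off; any file above works.
FILENAME = "granite-20b-functioncalling.Q4_K_M.gguf"

# Download into the local Hugging Face cache and get the resolved file path.
model_path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME)

# Load the GGUF file; adjust n_ctx (and n_gpu_layers, if offloading) to your hardware.
llm = Llama(model_path=model_path, n_ctx=4096)

# Plain completion call as a smoke test; real function-calling use should follow
# the prompt template from the upstream Granite model card.
out = llm("List three capital cities.", max_tokens=64)
print(out["choices"][0]["text"])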