smpanaro/pythia-160m-AutoGPTQ-4bit-128g
Text Generation · Transformers · wikitext · gpt_neox · Inference Endpoints · 4-bit precision · gptq · License: mit
Commit History
Create README.md · c2667aa · verified · smpanaro committed on Mar 12
Upload of AutoGPTQ quantized model · 86878a7 · verified · smpanaro committed on Mar 12
initial commit · 2ebace7 · verified · smpanaro committed on Mar 12