
Fits into 24 GB of VRAM at 24576 context with the Q4 KV cache.

Set rope_alpha to 3.75 for the extended context.
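
For reference, a minimal loading sketch using the exllamav2 Python library, showing where the settings above go. This is not part of the original card: the local model path is hypothetical, and attribute names such as `scale_alpha_value` reflect the current exllamav2 API, which may differ between versions.

```python
# Minimal sketch: load the exl2 quant at 24576 ctx, Q4 KV cache, rope alpha 3.75.
from exllamav2 import (
    ExLlamaV2,
    ExLlamaV2Config,
    ExLlamaV2Cache_Q4,
    ExLlamaV2Tokenizer,
)
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "/models/gemma-2-27b-it-SimPO-37K-5.5bpw-h6-exl2"  # hypothetical local path

config = ExLlamaV2Config(model_dir)
config.max_seq_len = 24576        # extended context from the note above
config.scale_alpha_value = 3.75   # rope_alpha (NTK rope scaling)

model = ExLlamaV2(config)
cache = ExLlamaV2Cache_Q4(model, lazy=True)  # Q4 KV cache keeps the footprint under 24 GB
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)

print(generator.generate(prompt="Explain SimPO in one sentence.", max_new_tokens=128))
```

The same settings map onto frontends like TabbyAPI or text-generation-webui (max sequence length 24576, Q4 cache mode, rope alpha 3.75).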


Model tree for waldie/gemma-2-27b-it-SimPO-37K-5.5bpw-h6-exl2

Base model: google/gemma-2-27b
Quantized: this model