Update README.md
#### Description
Optimize your engagement with [this project](https://huggingface.co/OEvortex/HelpingAI-Lite) by seamlessly integrating GGUF-format model files.

Please subscribe to my YouTube channel: [OEvortex](https://youtube.com/@OEvortex)

### GGUF Technical Specifications
Delve into the intricacies of GGUF, a meticulously crafted format that builds upon the robust foundation of the GGJT format. Tailored for heightened extensibility and user-centric functionality, GGUF introduces a suite of indispensable features.
The differentiator between GGJT and GGUF lies in the deliberate adoption of a key-value structure for hyperparameters (now termed metadata). Bid farewell to untyped lists, and embrace a structured approach that seamlessly accommodates new metadata without compromising compatibility with existing models. Augment your model with supplementary information for enhanced inference and model identification.
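To make the key-value layout concrete, here is a minimal sketch of writing and parsing a GGUF-style file header. The field order (4-byte magic, `uint32` version, `uint64` tensor count, `uint64` metadata key-value count) follows the published GGUF layout, but this is an illustrative toy, not the official reader, and `write_header`/`read_header` are hypothetical helper names:

```python
import struct

GGUF_MAGIC = b"GGUF"  # 4-byte magic at the start of every GGUF file


def write_header(version: int, n_tensors: int, n_kv: int) -> bytes:
    """Pack a GGUF-style header: magic, version, tensor count, metadata KV count."""
    return GGUF_MAGIC + struct.pack("<IQQ", version, n_tensors, n_kv)


def read_header(buf: bytes) -> tuple[int, int, int]:
    """Parse the header back. The KV count tells a reader how many typed
    key-value metadata entries follow (this sketch stops at the header)."""
    if buf[:4] != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    return struct.unpack_from("<IQQ", buf, 4)  # (version, n_tensors, n_kv)
```

Because each metadata entry is a typed key-value pair rather than a positional slot in an untyped list, a reader can skip keys it does not recognize, which is what lets new metadata be added without breaking older models.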
### Quantization
**QUANTIZATION_METHODS:**

| Method | Quantization | Advantages | Trade-offs |
|---|---|---|---|
| q2_k | 2-bit integers | Largest model size reduction | Greatest accuracy loss |
| q3_k_l | 3-bit integers | Balance between model size reduction and accuracy preservation | Moderate impact on accuracy |
| q3_k_m | 3-bit integers | Enhanced accuracy with mixed precision | Increased computational complexity |
| q3_k_s | 3-bit integers | Improved model efficiency with structured pruning | Reduced accuracy |
| q4_0 | 4-bit integers | Significant model size reduction | Moderate impact on accuracy |
| q4_1 | 4-bit integers | Enhanced accuracy with mixed precision | Increased computational complexity |
| q4_k_m | 4-bit integers | Optimized model size and accuracy with mixed precision and structured pruning | Reduced accuracy |
| q4_k_s | 4-bit integers | Improved model efficiency with structured pruning | Reduced accuracy |
| q5_0 | 5-bit integers | Balance between model size reduction and accuracy preservation | Moderate impact on accuracy |
| q5_1 | 5-bit integers | Enhanced accuracy with mixed precision | Increased computational complexity |
| q5_k_m | 5-bit integers | Optimized model size and accuracy with mixed precision and structured pruning | Reduced accuracy |
| q5_k_s | 5-bit integers | Improved model efficiency with structured pruning | Reduced accuracy |
| q6_k | 6-bit integers | Balance between model size reduction and accuracy preservation | Moderate impact on accuracy |
| q8_0 | 8-bit integers | Significant model size reduction | Minimal impact on accuracy |
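The size-versus-accuracy tension the table describes can be illustrated with a toy block quantizer. This is a deliberately simplified symmetric scheme (one float scale per block plus one small signed integer code per value), not the actual k-quant algorithms used by llama.cpp:

```python
def quantize_block(xs: list[float], bits: int = 4) -> tuple[float, list[int]]:
    """Toy symmetric quantization of one block: store a single float scale
    plus one signed integer code per value."""
    qmax = 2 ** (bits - 1) - 1           # e.g. 7 for 4-bit signed codes
    scale = max(abs(x) for x in xs) / qmax or 1.0
    codes = [round(x / scale) for x in xs]
    return scale, codes


def dequantize_block(scale: float, codes: list[int]) -> list[float]:
    """Reconstruct approximate values from the stored scale and codes."""
    return [c * scale for c in codes]
```

Fewer bits mean fewer representable levels per block, so reconstruction error grows as `bits` shrinks — the same trade-off the table's last column summarizes, which the real schemes soften with mixed precision and per-block tuning.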