---
base_model: facebook/llm-compiler-13b
extra_gated_button_content: I Accept Meta LLM Compiler License and AUP
extra_gated_description: The information you provide will be collected, stored, processed
and shared in accordance with the [Meta Privacy Policy](https://www.facebook.com/privacy/policy/).
extra_gated_fields:
Affiliation: text
? By clicking Submit below I accept the terms of the license and acknowledge that
the information I provide will be collected stored processed and shared in accordance
with the Meta Privacy Policy
: checkbox
Country: country
Date of birth: date_picker
First Name: text
I accept the terms and conditions: checkbox
Last Name: text
geo: ip_location
extra_gated_prompt: "**Meta Large Language Model Compiler (LLM Compiler) LICENSE AGREEMENT**\n\
Version Release Date: 27th June 2024\n\u201C**Agreement**\u201D means the terms\
\ and conditions for use, reproduction, distribution and modification of the LLM\
\ Compiler Materials set forth herein.\n\u201C**Documentation**\u201D means the\
\ specifications, manuals and documentation accompanying the LLM Compiler distributed\
\ by Meta at:\n* [https://huggingface.co./facebook/llm-compiler-7b](https://huggingface.co./facebook/llm-compiler-7b)\
\ * [https://huggingface.co./facebook/llm-compiler-7b-ftd](https://huggingface.co./facebook/llm-compiler-7b-ftd)\
\ * [https://huggingface.co./facebook/llm-compiler-13b](https://huggingface.co./facebook/llm-compiler-13b)\
\ * [https://huggingface.co./facebook/llm-compiler-13b-ftd](https://huggingface.co./facebook/llm-compiler-13b-ftd)\n\
\u201C**Licensee**\u201D or \u201C**you**\u201D means you, or your employer or any\
\ other person or entity (if you are entering into this Agreement on such person\
\ or entity\u2019s behalf), of the age required under applicable laws, rules or\
\ regulations to provide legal consent and that has legal authority to bind your\
\ employer or such other person or entity if you are entering in this Agreement\
\ on their behalf.\n\u201C**Meta Large Language Model Compiler\u201D and \u201C\
LLM Compiler**\u201D mean the foundational large language models and software and\
\ algorithms, including machine-learning model code, trained model weights, inference-enabling\
\ code, training-enabling code, fine-tuning enabling code and other elements of\
\ the foregoing distributed by Meta at:\n* [https://huggingface.co./facebook/llm-compiler-7b](https://huggingface.co./facebook/llm-compiler-7b)\
\ * [https://huggingface.co./facebook/llm-compiler-7b-ftd](https://huggingface.co./facebook/llm-compiler-7b-ftd)\
\ * [https://huggingface.co./facebook/llm-compiler-13b](https://huggingface.co./facebook/llm-compiler-13b)\
\ * [https://huggingface.co./facebook/llm-compiler-13b-ftd](https://huggingface.co./facebook/llm-compiler-13b-ftd)\n\
\u201C**LLM Compiler Materials**\u201D means, collectively, Meta\u2019s proprietary\
\ LLM Compiler and Documentation (and any portion thereof) made available under\
\ this Agreement.\n\u201C**Meta**\u201D or \u201C**we**\u201D means Meta Platforms\
\ Ireland Limited (if you are located in or, if you are an entity, your principal\
\ place of business is in the EEA or Switzerland) and Meta Platforms, Inc. (if you\
\ are located outside of the EEA or Switzerland). \nBy clicking \u201CI Accept\u201D\
\ below or by using or distributing any portion or element of the LLM Compiler Materials,\
\ you agree to be bound by this Agreement.\n1. **License Rights and Redistribution**.\
\ \\\n\n a. Grant of Rights.\
\ You are granted a non-exclusive, worldwide, non-transferable and royalty-free\
\ limited license under Meta\u2019s intellectual property or other rights owned\
\ by Meta embodied in the LLM Compiler Materials to use, reproduce, distribute,\
\ copy, create derivative works of, and make modifications to the LLM Compiler Materials.\
\ \n\n b. Redistribution and Use.\
\ \n\n i. If you distribute or make available the LLM Compiler Materials (or\
\ any derivative works thereof), or a product or service that uses any of them,\
\ including another AI model, you shall (A) provide a copy of this Agreement with\
\ any such LLM Compiler Materials; and (B) prominently display \u201CBuilt with\
\ LLM Compiler\u201D on a related website, user interface, blogpost, about page,\
\ or product documentation. If you use the LLM Compiler Materials to create, train,\
\ fine tune, or otherwise improve an AI model, which is distributed or made available,\
\ you shall also include \u201CLLM Compiler\u201D at the beginning of any such AI\
\ model name.\n\n ii. If you receive LLM Compiler Materials, or any derivative\
\ works thereof, from a Licensee as part of an integrated end user product, then\
\ Section 2 of this Agreement will not apply to you. \n\n iii. You must retain\
\ in all copies of the LLM Compiler Materials that you distribute the following\
\ attribution notice within a \u201CNotice\u201D text file distributed as a part\
\ of such copies: \u201CLLM Compiler is licensed under the LLM Compiler License,\
\ Copyright \xA9 Meta Platforms, Inc. All Rights Reserved.\u201D\n\n iv. Your\
\ use of the LLM Compiler Materials must comply with applicable laws and regulations\
\ (including trade compliance laws and regulations) and adhere to the Acceptable\
\ Use Policy for Llama Materials (available at https://llama.meta.com/llama3/use-policy),\
\ which is hereby incorporated by reference into this Agreement.\n\n v. You will\
\ not use the LLM Compiler Materials or any output or results of the LLM Compiler\
\ Materials to improve any other large language model. \n\n2. **Additional Commercial\
\ Terms**. If, on the LLM Compiler release date, the monthly active users of the\
\ products or services made available by or for Licensee, or Licensee\u2019s affiliates,\
\ is greater than 700 million monthly active users in the preceding calendar month,\
\ you must request a license from Meta, which Meta may grant to you in its sole\
\ discretion, and you are not authorized to exercise any of the rights under this\
\ Agreement unless or until Meta otherwise expressly grants you such rights. \n\
3**. Disclaimer of Warranty**. UNLESS REQUIRED BY APPLICABLE LAW, THE LLM COMPILER\
\ MATERIALS AND ANY OUTPUT AND RESULTS THEREFROM ARE PROVIDED ON AN \u201CAS IS\u201D\
\ BASIS, WITHOUT WARRANTIES OF ANY KIND, AND META DISCLAIMS ALL WARRANTIES OF ANY\
\ KIND, BOTH EXPRESS AND IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES\
\ OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE.\
\ YOU ARE SOLELY RESPONSIBLE FOR DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING\
\ THE LLM COMPILER MATERIALS AND ASSUME ANY RISKS ASSOCIATED WITH YOUR USE OF THE\
\ LLM COMPILER MATERIALS AND ANY OUTPUT AND RESULTS.\n4. **Limitation of Liability**.\
\ IN NO EVENT WILL META OR ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY,\
\ WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING\
\ OUT OF THIS AGREEMENT, FOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL,\
\ INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN IF META OR ITS AFFILIATES HAVE\
\ BEEN ADVISED OF THE POSSIBILITY OF ANY OF THE FOREGOING.\n5. **Intellectual Property**.\n\
\n a. No trademark licenses are granted under this Agreement, and in connection\
\ with the LLM Compiler Materials, neither Meta nor Licensee may use any name or\
\ mark owned by or associated with the other or any of its affiliates, except as\
\ required for reasonable and customary use in describing and redistributing the\
\ LLM Compiler Materials or as set forth in this Section 5(a). Meta hereby grants\
\ you a license to use LLM Compiler (the \u201CMark\u201D) solely as required to\
\ comply with the last sentence of Section 1.b.i. You will comply with Meta\u2019\
s brand guidelines (currently accessible at[ https://about.meta.com/brand/resources/meta/company-brand/)](https://about.meta.com/brand/resources/meta/company-brand/).\
\ All goodwill arising out of your use of the Mark will inure to the benefit of\
\ Meta. \n\n b. Subject to Meta\u2019s ownership of LLM Compiler Materials and\
\ derivatives made by or for Meta, with respect to any derivative works and modifications\
\ of the LLM Compiler Materials that are made by you, as between you and Meta, you\
\ are and will be the owner of such derivative works and modifications.\n\n c.\
\ If you institute litigation or other proceedings against Meta or any entity (including\
\ a cross-claim or counterclaim in a lawsuit) alleging that the LLM Compiler Materials\
\ or LLM Compiler outputs or results, or any portion of any of the foregoing, constitutes\
\ infringement of intellectual property or other rights owned or licensable by you,\
\ then any licenses granted to you under this Agreement shall terminate as of the\
\ date such litigation or claim is filed or instituted. You will indemnify and hold\
\ harmless Meta from and against any claim by any third party arising out of or\
\ related to your use or distribution of the LLM Compiler Materials.\n\n6. **Term\
\ and Termination**. The term of this Agreement will commence upon your acceptance\
\ of this Agreement or access to the LLM Compiler Materials and will continue in\
\ full force and effect until terminated in accordance with the terms and conditions\
\ herein. Meta may terminate this Agreement if you are in breach of any term or\
\ condition of this Agreement. Upon termination of this Agreement, you shall delete\
\ and cease use of the LLM Compiler Materials. Sections 3, 4 and 7 shall survive\
\ the termination of this Agreement. \n7. **Governing Law and Jurisdiction**. This\
\ Agreement will be governed and construed under the laws of the State of California\
\ without regard to choice of law principles, and the UN Convention on Contracts\
\ for the International Sale of Goods does not apply to this Agreement. The courts\
\ of California shall have exclusive jurisdiction of any dispute arising out of\
\ this Agreement. "
inference: false
library_name: gguf
license: other
pipeline_tag: text-generation
quantized_by: legraphista
tags:
- quantized
- GGUF
- quantization
- imat
- imatrix
- static
- 16bit
- 8bit
- 6bit
- 5bit
- 4bit
- 3bit
- 2bit
- 1bit
---
# llm-compiler-13b-IMat-GGUF
_Llama.cpp imatrix quantization of facebook/llm-compiler-13b_
Original Model: [facebook/llm-compiler-13b](https://huggingface.co./facebook/llm-compiler-13b)
Original dtype: `BF16` (`bfloat16`)
Quantized with: llama.cpp [b3256](https://github.com/ggerganov/llama.cpp/releases/tag/b3256)
IMatrix dataset: [here](https://gist.githubusercontent.com/bartowski1182/eb213dccb3571f863da82e99418f81e8/raw/b2869d80f5c16fd7082594248e80144677736635/calibration_datav3.txt)
- [Files](#files)
- [IMatrix](#imatrix)
- [Common Quants](#common-quants)
- [All Quants](#all-quants)
- [Downloading using huggingface-cli](#downloading-using-huggingface-cli)
- [Inference](#inference)
- [Llama.cpp](#llama-cpp)
- [FAQ](#faq)
- [Why is the IMatrix not applied everywhere?](#why-is-the-imatrix-not-applied-everywhere)
- [How do I merge a split GGUF?](#how-do-i-merge-a-split-gguf)
---
## Files
### IMatrix
Status: ✅ Available
Link: [here](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/imatrix.dat)
### Common Quants
| Filename | Quant type | File Size | Status | Uses IMatrix | Is Split |
| -------- | ---------- | --------- | ------ | ------------ | -------- |
| [llm-compiler-13b.Q8_0.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q8_0.gguf) | Q8_0 | 13.83GB | ✅ Available | ⚪ Static | 📦 No
| [llm-compiler-13b.Q6_K.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q6_K.gguf) | Q6_K | 10.68GB | ✅ Available | ⚪ Static | 📦 No
| [llm-compiler-13b.Q4_K.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q4_K.gguf) | Q4_K | 7.87GB | ✅ Available | 🟢 IMatrix | 📦 No
| [llm-compiler-13b.Q3_K.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q3_K.gguf) | Q3_K | 6.34GB | ✅ Available | 🟢 IMatrix | 📦 No
| [llm-compiler-13b.Q2_K.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q2_K.gguf) | Q2_K | 4.85GB | ✅ Available | 🟢 IMatrix | 📦 No
### All Quants
| Filename | Quant type | File Size | Status | Uses IMatrix | Is Split |
| -------- | ---------- | --------- | ------ | ------------ | -------- |
| [llm-compiler-13b.BF16.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.BF16.gguf) | BF16 | 26.03GB | ✅ Available | ⚪ Static | 📦 No
| [llm-compiler-13b.FP16.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.FP16.gguf) | F16 | 26.03GB | ✅ Available | ⚪ Static | 📦 No
| [llm-compiler-13b.Q8_0.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q8_0.gguf) | Q8_0 | 13.83GB | ✅ Available | ⚪ Static | 📦 No
| [llm-compiler-13b.Q6_K.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q6_K.gguf) | Q6_K | 10.68GB | ✅ Available | ⚪ Static | 📦 No
| [llm-compiler-13b.Q5_K.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q5_K.gguf) | Q5_K | 9.23GB | ✅ Available | ⚪ Static | 📦 No
| [llm-compiler-13b.Q5_K_S.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q5_K_S.gguf) | Q5_K_S | 8.97GB | ✅ Available | ⚪ Static | 📦 No
| [llm-compiler-13b.Q4_K.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q4_K.gguf) | Q4_K | 7.87GB | ✅ Available | 🟢 IMatrix | 📦 No
| [llm-compiler-13b.Q4_K_S.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q4_K_S.gguf) | Q4_K_S | 7.42GB | ✅ Available | 🟢 IMatrix | 📦 No
| llm-compiler-13b.IQ4_NL | IQ4_NL | - | ⏳ Processing | 🟢 IMatrix | -
| llm-compiler-13b.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | -
| [llm-compiler-13b.Q3_K.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q3_K.gguf) | Q3_K | 6.34GB | ✅ Available | 🟢 IMatrix | 📦 No
| llm-compiler-13b.Q3_K_L | Q3_K_L | - | ⏳ Processing | 🟢 IMatrix | -
| [llm-compiler-13b.Q3_K_S.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q3_K_S.gguf) | Q3_K_S | 5.66GB | ✅ Available | 🟢 IMatrix | 📦 No
| llm-compiler-13b.IQ3_M | IQ3_M | - | ⏳ Processing | 🟢 IMatrix | -
| llm-compiler-13b.IQ3_S | IQ3_S | - | ⏳ Processing | 🟢 IMatrix | -
| llm-compiler-13b.IQ3_XS | IQ3_XS | - | ⏳ Processing | 🟢 IMatrix | -
| llm-compiler-13b.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟢 IMatrix | -
| [llm-compiler-13b.Q2_K.gguf](https://huggingface.co./legraphista/llm-compiler-13b-IMat-GGUF/blob/main/llm-compiler-13b.Q2_K.gguf) | Q2_K | 4.85GB | ✅ Available | 🟢 IMatrix | 📦 No
| llm-compiler-13b.Q2_K_S | Q2_K_S | - | ⏳ Processing | 🟢 IMatrix | -
| llm-compiler-13b.IQ2_M | IQ2_M | - | ⏳ Processing | 🟢 IMatrix | -
| llm-compiler-13b.IQ2_S | IQ2_S | - | ⏳ Processing | 🟢 IMatrix | -
| llm-compiler-13b.IQ2_XS | IQ2_XS | - | ⏳ Processing | 🟢 IMatrix | -
| llm-compiler-13b.IQ2_XXS | IQ2_XXS | - | ⏳ Processing | 🟢 IMatrix | -
| llm-compiler-13b.IQ1_M | IQ1_M | - | ⏳ Processing | 🟢 IMatrix | -
| llm-compiler-13b.IQ1_S | IQ1_S | - | ⏳ Processing | 🟢 IMatrix | -
## Downloading using huggingface-cli
If you do not have `huggingface-cli` installed:
```
pip install -U "huggingface_hub[cli]"
```
Download the specific file you want:
```
huggingface-cli download legraphista/llm-compiler-13b-IMat-GGUF --include "llm-compiler-13b.Q8_0.gguf" --local-dir ./
```
If the model file is large, it has been split into multiple chunks. To download all of them into a local folder, run:
```
huggingface-cli download legraphista/llm-compiler-13b-IMat-GGUF --include "llm-compiler-13b.Q8_0/*" --local-dir ./
# see FAQ for merging GGUFs
```
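Downloads can also be scripted. A minimal sketch using the `huggingface_hub` Python API (the repo and file names match the tables above; the actual call is commented out since it fetches a ~14 GB file):

```python
from huggingface_hub import hf_hub_download

REPO_ID = "legraphista/llm-compiler-13b-IMat-GGUF"
FILENAME = "llm-compiler-13b.Q8_0.gguf"

# Downloads the file into the current directory and returns the local path:
# path = hf_hub_download(repo_id=REPO_ID, filename=FILENAME, local_dir=".")
```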
---
## Inference
### Llama.cpp
```
llama.cpp/main -m llm-compiler-13b.Q8_0.gguf --color -i -p "prompt here"
```
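If you prefer Python over the CLI, a minimal sketch using the `llama-cpp-python` bindings (an assumption on my part — any GGUF-capable runtime works; the model-loading call is commented out since it requires the downloaded file):

```python
MODEL_PATH = "llm-compiler-13b.Q8_0.gguf"  # any quant from the tables above
PROMPT = "prompt here"

# Requires: pip install llama-cpp-python
# from llama_cpp import Llama
# llm = Llama(model_path=MODEL_PATH, n_ctx=4096)
# print(llm(PROMPT, max_tokens=128)["choices"][0]["text"])
```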
---
## FAQ
### Why is the IMatrix not applied everywhere?
According to [this investigation](https://www.reddit.com/r/LocalLLaMA/comments/1993iro/ggufs_quants_can_punch_above_their_weights_now/), only the lower quantizations appear to benefit from the imatrix input (as measured by hellaswag results), so it is applied only to those.
### How do I merge a split GGUF?
1. Make sure you have `gguf-split` available
- To get hold of `gguf-split`, navigate to https://github.com/ggerganov/llama.cpp/releases
- Download the appropriate zip for your system from the latest release
- Unzip the archive and you should be able to find `gguf-split`
2. Locate your GGUF chunks folder (ex: `llm-compiler-13b.Q8_0`)
3. Run `gguf-split --merge llm-compiler-13b.Q8_0/llm-compiler-13b.Q8_0-00001-of-XXXXX.gguf llm-compiler-13b.Q8_0.gguf`
- Make sure to point `gguf-split` to the first chunk of the split.
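The steps above can be scripted. A hedged sketch that locates the first chunk automatically (it assumes the chunk folder layout shown in step 2 — adjust the folder name to your quant):

```python
import glob
import subprocess

QUANT_DIR = "llm-compiler-13b.Q8_0"  # folder containing the split chunks

# Chunk names follow the pattern <name>-00001-of-XXXXX.gguf, so sorting
# lexicographically puts the first chunk first.
chunks = sorted(glob.glob(f"{QUANT_DIR}/*-of-*.gguf"))
if chunks:
    # gguf-split finds the sibling chunks itself when pointed at the first one.
    subprocess.run(
        ["gguf-split", "--merge", chunks[0], f"{QUANT_DIR}.gguf"],
        check=True,
    )
```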
---
Got a suggestion? Ping me [@legraphista](https://x.com/legraphista)!