TheBloke committed
Commit d995f17
1 Parent(s): 4623c21

Upload README.md

Files changed (1): README.md +5 -5
README.md CHANGED
@@ -101,7 +101,7 @@ All recent GPTQ files are made with AutoGPTQ, and all files in non-main branches
 
 To download from the `main` branch, enter `TheBloke/Xwin-LM-7B-V0.1-GPTQ` in the "Download model" box.
 
-To download from another branch, add `:branchname` to the end of the download name, eg `TheBloke/Xwin-LM-7B-V0.1-GPTQ:main`
+To download from another branch, add `:branchname` to the end of the download name, eg `TheBloke/Xwin-LM-7B-V0.1-GPTQ:gptq-4bit-32g-actorder_True`
 
 ### From the command line
 
@@ -122,7 +122,7 @@ To download from a different branch, add the `--revision` parameter:
 
 ```shell
 mkdir Xwin-LM-7B-V0.1-GPTQ
-huggingface-cli download TheBloke/Xwin-LM-7B-V0.1-GPTQ --revision main --local-dir Xwin-LM-7B-V0.1-GPTQ --local-dir-use-symlinks False
+huggingface-cli download TheBloke/Xwin-LM-7B-V0.1-GPTQ --revision gptq-4bit-32g-actorder_True --local-dir Xwin-LM-7B-V0.1-GPTQ --local-dir-use-symlinks False
 ```
 
 <details>
@@ -155,7 +155,7 @@ Windows Command Line users: You can set the environment variable by running `set
 To clone a specific branch with `git`, use a command like this:
 
 ```shell
-git clone --single-branch --branch main https://huggingface.co/TheBloke/Xwin-LM-7B-V0.1-GPTQ
+git clone --single-branch --branch gptq-4bit-32g-actorder_True https://huggingface.co/TheBloke/Xwin-LM-7B-V0.1-GPTQ
 ```
 
 Note that using Git with HF repos is strongly discouraged. It will be much slower than using `huggingface-hub`, and will use twice as much disk space as it has to store the model files twice (it stores every byte both in the intended target folder, and again in the `.git` folder as a blob.)
@@ -170,7 +170,7 @@ It is strongly recommended to use the text-generation-webui one-click-installers
 
 1. Click the **Model tab**.
 2. Under **Download custom model or LoRA**, enter `TheBloke/Xwin-LM-7B-V0.1-GPTQ`.
-  - To download from a specific branch, enter for example `TheBloke/Xwin-LM-7B-V0.1-GPTQ:main`
+  - To download from a specific branch, enter for example `TheBloke/Xwin-LM-7B-V0.1-GPTQ:gptq-4bit-32g-actorder_True`
   - see Provided Files above for the list of branches for each option.
 3. Click **Download**.
 4. The model will start downloading. Once it's finished it will say "Done".
@@ -211,7 +211,7 @@ from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
 
 model_name_or_path = "TheBloke/Xwin-LM-7B-V0.1-GPTQ"
 # To use a different branch, change revision
-# For example: revision="main"
+# For example: revision="gptq-4bit-32g-actorder_True"
 model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
                                              device_map="auto",
                                              trust_remote_code=False,
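
For context on that last hunk: the `from_pretrained` call it touches is cut off by the hunk boundary. A minimal runnable sketch of the full load, assuming the GPTQ-capable `transformers` setup this README describes elsewhere and filling in only the `revision` argument the commit is about, might look like this:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name_or_path = "TheBloke/Xwin-LM-7B-V0.1-GPTQ"

# Pass `revision` to load from a non-main branch; the branch name below is
# the one this commit substitutes for "main". Sketch only: the trailing
# arguments mirror the snippet shown in the hunk.
model = AutoModelForCausalLM.from_pretrained(
    model_name_or_path,
    device_map="auto",
    trust_remote_code=False,
    revision="gptq-4bit-32g-actorder_True",  # or "main" for the default branch
)
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
```

The same branch name works everywhere the diff touches: as the `:branchname` suffix in text-generation-webui's download box, with `huggingface-cli download --revision`, and with `git clone --branch`.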