Space status: Runtime error

Commit 8d5955c by lixin4ever
1 Parent(s): 9a3865f

Update app.py

app.py CHANGED
@@ -15,7 +15,7 @@ DESCRIPTION = """\
 
 This Space demonstrates model [CLEX-7B-Chat-16K](https://huggingface.co/DAMO-NLP-SG/CLEX-7B-Chat-16K), a Llama-2-7B model fine-tuned using our [CLEX](https://arxiv.org/abs/2310.16450) method. Feel free to play with it, or duplicate to run generations without a queue! If you want to run your own service, you can also [deploy the model on Inference Endpoints](https://huggingface.co/inference-endpoints).
 
-The web demo supports the maximum input sequence length of 10k now (
+The web demo supports the maximum input sequence length of 10k now due to the limit of GPU memory, running the demo locally (with larger GPU memory) is highly recommended.

 This support of PDF input is tentative.
 
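Since the updated description recommends running the demo locally on a larger GPU, here is a minimal sketch of what loading the checkpoint could look like with `transformers`. It is not part of the commit or of `app.py`: the model id comes from the description above, `trust_remote_code=True` is assumed because CLEX ships custom modeling code, and the prompt, dtype, and generation settings are illustrative only (the real chat model would normally also want Llama-2 chat formatting).

```python
# Hypothetical local-run sketch, not the Space's actual app.py.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "DAMO-NLP-SG/CLEX-7B-Chat-16K"  # model linked in the Space description

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,   # fp16 so a 7B model fits on a single large GPU
    device_map="auto",           # let accelerate place layers on available devices
    trust_remote_code=True,      # assumed: CLEX uses custom position-extrapolation code
)

prompt = "Summarize the CLEX method in two sentences."  # illustrative prompt only
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)

# Print only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```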