MohamedRashad committed
Commit 0eb73fc · 1 Parent(s): c63aa42

Update requirements to remove extra index URL and modify flash-attn installation

Files changed (1)
  1. requirements.txt +1 -3
requirements.txt CHANGED
@@ -1,5 +1,3 @@
- --extra-index-url https://download.pytorch.org/whl/cu121
-
  accelerate
  sentencepiece
  diffusers
@@ -24,6 +22,6 @@ xformers==0.0.27.post2
  spconv-cu120
  transformers
  gradio_litmodel3d
- https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.0.post2/flash_attn-2.7.0.post2+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+ flash-attn --no-build-isolation
  https://huggingface.co/spaces/JeffreyXiang/TRELLIS/resolve/main/wheels/diff_gaussian_rasterization-0.0.0-cp310-cp310-linux_x86_64.whl?download=true
  https://huggingface.co/spaces/JeffreyXiang/TRELLIS/resolve/main/wheels/nvdiffrast-0.3.3-cp310-cp310-linux_x86_64.whl?download=true
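For context, a minimal sketch of what the new line is aiming at: `--no-build-isolation` is a flag of `pip install` (flash-attn's build compiles against the torch version already in the environment, so its README recommends disabling pip's isolated build environment). Whether the flag is honored when placed on a line inside `requirements.txt` depends on the installer; invoked directly, the equivalent steps would look like this (the exact torch/flash-attn versions here are illustrative, not taken from this commit):

```shell
# Install torch first: flash-attn imports torch in its setup step,
# so it must be present in the (non-isolated) build environment.
pip install torch

# Build/install flash-attn against that torch, reusing the current
# environment instead of pip's isolated build sandbox.
pip install flash-attn --no-build-isolation
```

The earlier approach pinned a prebuilt wheel for one exact CUDA/torch/Python combination (cu12, torch 2.4, cp310); switching to a source install trades a longer build for independence from that pinned combination.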