torch==2.6.0
torchvision
# Optional: FlashAttention prebuilt wheel (CUDA 12, torch 2.6, Python 3.10, Linux x86_64); uncomment to install
# flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
deepspeed
accelerate
transformers
moviepy
decord
pillow
opencv-python
filetype
natsort
gradio
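# Typical usage, assuming this file is saved as requirements.txt:
#   pip install -r requirements.txt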