runtime error
Exit code: 1. Reason: A new version of the following files was downloaded from https://huggingface.co./microsoft/Phi-3.5-vision-instruct:
- configuration_phi3_v.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
A new version of the following files was downloaded from https://huggingface.co./microsoft/Phi-3.5-vision-instruct:
- modeling_phi3_v.py
. Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision.
Downloading shards:   0%|          | 0/2 [00:00<?, ?it/s]
Downloading shards:  50%|█████     | 1/2 [00:11<00:11, 11.53s/it]
Downloading shards: 100%|██████████| 2/2 [00:20<00:00,  9.83s/it]
Downloading shards: 100%|██████████| 2/2 [00:20<00:00, 10.09s/it]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 8, in <module>
    "microsoft/Phi-3.5-vision-instruct": AutoModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 559, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3769, in from_pretrained
    config = cls._autoset_attn_implementation(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1519, in _autoset_attn_implementation
    cls._check_and_enable_flash_attn_2(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1621, in _check_and_enable_flash_attn_2
    raise ImportError(f"{preface} the package flash_attn seems to be not installed. {install_message}")
ImportError: FlashAttention2 has been toggled on, but it cannot be used due to the following error: the package flash_attn seems to be not installed. Please refer to the documentation of https://huggingface.co./docs/transformers/perf_infer_gpu_one#flashattention-2 to install Flash Attention 2.
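The crash happens because the from_pretrained call requests FlashAttention2 on a container where the flash_attn package is not installed. Below is a minimal sketch of the workaround the log itself points at: load the model with eager attention so flash_attn is never needed, and optionally pin a revision so the remote code files (configuration_phi3_v.py, modeling_phi3_v.py) are not re-downloaded. This assumes a transformers version recent enough to accept the attn_implementation keyword (the traceback's _autoset_attn_implementation path implies this); the commented-out commit hash is a hypothetical placeholder, not a real revision.

```python
# Sketch under assumptions: transformers accepts attn_implementation, and the
# host has no flash_attn package installed.
from transformers import AutoModelForCausalLM

model_id = "microsoft/Phi-3.5-vision-instruct"

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,          # repo ships configuration_phi3_v.py / modeling_phi3_v.py as remote code
    torch_dtype="auto",
    attn_implementation="eager",     # fall back to eager attention instead of FlashAttention2
    # revision="<pinned-commit-sha>",  # hypothetical placeholder: pin the repo revision so new remote code isn't pulled
)
```

If you do want FlashAttention2, the other route the ImportError suggests is installing flash-attn (e.g. via the Space's requirements.txt on CUDA hardware) following the linked transformers documentation.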