Runtime error

The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.
0it [00:00, ?it/s]
0it [00:00, ?it/s]
/home/user/.pyenv/versions/3.8.9/lib/python3.8/site-packages/diffusers/utils/outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
  torch.utils._pytree._register_pytree_node(
Traceback (most recent call last):
  File "app.py", line 2, in <module>
    from magic_mix import magic_mix
  File "/home/user/app/magic_mix.py", line 16, in <module>
    tokenizer = CLIPTokenizer.from_pretrained(
  File "/home/user/.pyenv/versions/3.8.9/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2013, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'openai/clip-vit-large-patch14'. If you were trying to load it from 'https://huggingface.co./models', make sure you don't have a local directory with the same name. Otherwise, make sure 'openai/clip-vit-large-patch14' is the correct path to a directory containing all relevant files for a CLIPTokenizer tokenizer.
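For context, below is a minimal sketch of the call that fails at magic_mix.py line 16, assuming default `from_pretrained` arguments; the error handling and messages are illustrative additions, not part of the Space's code. As the OSError text suggests, the load fails either because a local directory named `openai/clip-vit-large-patch14` shadows the Hub repo id, or because the container cannot reach the Hub to download the tokenizer files.

```python
# Minimal reproduction sketch (assumption: no custom cache_dir or auth token is used).
from transformers import CLIPTokenizer

try:
    # The repo id must resolve to the Hub model "openai/clip-vit-large-patch14"
    # or to a local directory containing the tokenizer files
    # (tokenizer_config.json, vocab.json, merges.txt, ...).
    tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
except OSError as err:
    # Typical causes: a same-named local folder shadowing the Hub repo,
    # or no network access from the container at build/startup time.
    print(f"Tokenizer load failed: {err}")
```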
