runtime error
Exit code: 1. Reason:
Loading checkpoint shards: 100%|██████████| 2/2 [00:00<00:00, 3.76it/s]
Using a slow image processor as `use_fast` is unset and a slow processor was saved with this model. `use_fast=True` will be the default behavior in v4.48, even if the model was saved with a slow processor. This will result in minor differences in outputs. You'll still be able to use a slow processor with `use_fast=False`.
Downloading shards: 100%|██████████| 4/4 [00:46<00:00, 11.52s/it]
Loading checkpoint shards: 100%|██████████| 4/4 [00:00<00:00, 4.60it/s]
Some parameters are on the meta device because they were offloaded to the cpu and disk.
Traceback (most recent call last):
  File "/home/user/app/app.py", line 4, in <module>
    demo = create_interface()
  File "/home/user/app/ui_components.py", line 23, in create_interface
    llm_node = LLMInferenceNode()
  File "/home/user/app/huggingface_inference_node.py", line 16, in __init__
    self.huggingface_client = OpenAI(
  File "/usr/local/lib/python3.10/site-packages/openai/_client.py", line 123, in __init__
    super().__init__(
  File "/usr/local/lib/python3.10/site-packages/openai/_base_client.py", line 846, in __init__
    self._client = http_client or SyncHttpxClientWrapper(
  File "/usr/local/lib/python3.10/site-packages/openai/_base_client.py", line 744, in __init__
    super().__init__(**kwargs)
TypeError: Client.__init__() got an unexpected keyword argument 'proxies'
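The `TypeError` at the bottom of the traceback is the actual failure: httpx 0.28 removed the long-deprecated `proxies` keyword argument, and older `openai` client versions (reportedly those before 1.55.3) still pass it when constructing their internal HTTP client, so `OpenAI(...)` crashes at import time. A likely fix, assuming this Space declares its dependencies in a `requirements.txt`, is to upgrade `openai`, or alternatively pin `httpx` below 0.28:

```
# requirements.txt — either change should resolve the 'proxies' TypeError
openai>=1.55.3

# ...or, if openai cannot be upgraded, keep the old httpx instead:
# httpx<0.28
```

After editing the requirements, restart (factory-rebuild) the Space so the new versions are installed; the exact fixed `openai` version is an assumption, so checking the release notes of the installed SDK is worthwhile.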