runtime error

Exit code: 1. Reason:
libclang-18.1.1 markdown-3.7 ml-dtypes-0.4.0 namex-0.0.8 numpy-1.26.4 opt-einsum-3.3.0 optree-0.12.1 tensorboard-2.17.1 tensorboard-data-server-0.7.2 tensorflow-2.17.0 tensorflow-io-gcs-filesystem-0.37.1 termcolor-2.4.0 werkzeug-3.0.4 wrapt-1.16.0
[notice] A new release of pip available: 22.3.1 -> 24.2
[notice] To update, run: pip install --upgrade pip
Collecting soundfile
  Downloading soundfile-0.12.1-py2.py3-none-manylinux_2_31_x86_64.whl (1.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 39.9 MB/s eta 0:00:00
Requirement already satisfied: cffi>=1.0 in /usr/local/lib/python3.10/site-packages (from soundfile) (1.17.1)
Requirement already satisfied: pycparser in /usr/local/lib/python3.10/site-packages (from cffi>=1.0->soundfile) (2.22)
Installing collected packages: soundfile
Successfully installed soundfile-0.12.1
[notice] A new release of pip available: 22.3.1 -> 24.2
[notice] To update, run: pip install --upgrade pip
/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at roberta-large and are newly initialized: ['classifier.dense.bias', 'classifier.dense.weight', 'classifier.out_proj.bias', 'classifier.out_proj.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Traceback (most recent call last):
  File "/home/user/app/app.py", line 61, in <module>
    aspects_model.load_state_dict(torch.load('./Aspects_Extraction_Model_updated.pth', map_location=torch.device('cpu')), strict=False)
NameError: name 'torch' is not defined
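The actual failure is at the end of the log: app.py calls torch.load and torch.device at line 61 without ever importing torch, hence the NameError. Since the RobertaForSequenceClassification checkpoint loads earlier in the log, PyTorch is clearly installed in the container; the missing piece is just the import in app.py. Below is a minimal sketch of the relevant part of app.py, assuming the model around line 61 is the roberta-large sequence-classification model mentioned in the warning (the exact model construction is an assumption; only the checkpoint name, the .pth filename, and the load_state_dict call come from the log):

```python
# Minimal sketch of the relevant part of app.py (assumed structure).
import torch  # the missing import that causes "NameError: name 'torch' is not defined"
from transformers import RobertaForSequenceClassification

# Checkpoint name taken from the warning in the log; num_labels etc. are assumptions.
aspects_model = RobertaForSequenceClassification.from_pretrained("roberta-large")

# Load the fine-tuned weights on CPU; torch must be imported for
# torch.load and torch.device to be defined.
state_dict = torch.load(
    "./Aspects_Extraction_Model_updated.pth",
    map_location=torch.device("cpu"),
)
aspects_model.load_state_dict(state_dict, strict=False)
aspects_model.eval()  # inference mode for the Space
```

Note that strict=False silently skips any keys that do not match between the checkpoint and the model; if predictions look random after the import fix, inspect the missing_keys and unexpected_keys returned by load_state_dict to confirm the classifier weights were actually loaded.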
