runtime error
Exit code: 1. Reason: , n_embd_v_gqa = 1024
llama_kv_cache_init: layer 30: n_embd_k_gqa = 1024, n_embd_v_gqa = 1024
llama_kv_cache_init: layer 31: n_embd_k_gqa = 1024, n_embd_v_gqa = 1024
llama_kv_cache_init: CPU KV buffer size = 64.00 MiB
llama_init_from_model: KV self size = 64.00 MiB, K (f16): 32.00 MiB, V (f16): 32.00 MiB
llama_init_from_model: CPU output buffer size = 0.12 MiB
llama_init_from_model: CPU compute buffer size = 81.01 MiB
llama_init_from_model: graph nodes = 1030
llama_init_from_model: graph splits = 1
CPU : SSE3 = 1 | SSSE3 = 1 | AVX = 1 | AVX2 = 1 | F16C = 1 | FMA = 1 | AVX512 = 1 | AVX512_VBMI = 1 | AVX512_VNNI = 1 | LLAMAFILE = 1 | OPENMP = 1 | AARCH64_REPACK = 1 |
Model metadata: {'tokenizer.ggml.padding_token_id': '2', 'tokenizer.ggml.unknown_token_id': '0', 'tokenizer.ggml.eos_token_id': '2', 'general.architecture': 'llama', 'llama.rope.freq_base': '10000.000000', 'llama.context_length': '32768', 'general.name': 'huggingfaceh4_zephyr-7b-beta', 'llama.embedding_length': '4096', 'llama.feed_forward_length': '14336', 'llama.attention.layer_norm_rms_epsilon': '0.000010', 'llama.rope.dimension_count': '128', 'tokenizer.ggml.bos_token_id': '1', 'llama.attention.head_count': '32', 'llama.block_count': '32', 'llama.attention.head_count_kv': '8', 'general.quantization_version': '2', 'tokenizer.ggml.model': 'llama', 'general.file_type': '15'}
Using fallback chat format: llama-2
Traceback (most recent call last):
  File "/home/user/app/app.py", line 130, in <module>
    main()
  File "/home/user/app/app.py", line 122, in main
    initialize_settings(model_path)  # Pass model_path to the initialize_settings function
  File "/home/user/app/app.py", line 52, in initialize_settings
    Settings.llm = Llama(
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/settings.py", line 46, in llm
    self._llm = resolve_llm(llm)
  File "/usr/local/lib/python3.10/site-packages/llama_index/core/llms/utils.py", line 102, in resolve_llm
    assert isinstance(llm, LLM)
AssertionError
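The traceback points at the root cause: `Settings.llm` in llama-index only accepts an object implementing its own `LLM` base class, but `initialize_settings` assigns a raw `llama_cpp.Llama` instance, so `resolve_llm` trips the `assert isinstance(llm, LLM)` check. A minimal fix sketch, assuming the `llama-index-llms-llama-cpp` integration package is installed; the parameter values below are illustrative assumptions, not taken from the original app.py:

```python
# Sketch: wrap llama.cpp in llama-index's LlamaCPP adapter instead of
# assigning a raw llama_cpp.Llama object to Settings.llm.
from llama_index.core import Settings
from llama_index.llms.llama_cpp import LlamaCPP  # pip install llama-index-llms-llama-cpp


def initialize_settings(model_path: str) -> None:
    # LlamaCPP subclasses llama-index's LLM base class, so the
    # `assert isinstance(llm, LLM)` check in resolve_llm passes.
    Settings.llm = LlamaCPP(
        model_path=model_path,             # local GGUF file, e.g. the zephyr-7b-beta model from the log
        temperature=0.1,                   # illustrative value
        max_new_tokens=256,                # illustrative value
        context_window=4096,               # must fit within llama.context_length (32768)
        model_kwargs={"n_gpu_layers": 0},  # CPU-only, matching the log above
        verbose=True,
    )
```

Alternatively, keep using `llama_cpp.Llama` directly and skip the `Settings.llm` assignment entirely; the assertion exists precisely to reject objects that do not implement llama-index's LLM interface.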