Issue Loading zpm/Llama-3.1-PersianQA Model – Missing pytorch_model.bin File

#1
by datalover23 - opened

Hello everyone,

I've been trying to use the zpm/Llama-3.1-PersianQA model for a question-answering task with the following code:

from transformers import pipeline

# Load the model
qa_pipeline = pipeline("question-answering", model="zpm/Llama-3.1-PersianQA")

# Example usage
# context (in Persian): "Mobarakeh Steel Company of Isfahan is the largest private
# industrial unit in Iran and the largest steel production complex in the Middle East."
context = "شرکت فولاد مبارکۀ اصفهان، بزرگ‌ترین واحد صنعتی خصوصی در ایران و بزرگ‌ترین مجتمع تولید فولاد در خاورمیانه است."
# question (in Persian): "Where is Mobarakeh Steel Company located?"
question = "شرکت فولاد مبارکه در کجا واقع شده است؟"

result = qa_pipeline(question=question, context=context)
print(result)

However, I encountered the following error:

OSError: zpm/Llama-3.1-PersianQA does not appear to have a file named pytorch_model.bin, model.safetensors, tf_model.h5, model.ckpt or flax_model.msgpack.

I checked the model repository, and it includes files such as adapter_model.safetensors, config.json, and some .gguf files, but not pytorch_model.bin or other full model files.
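Since the repo ships .gguf files, one alternative is to run a quantized file directly with llama-cpp-python instead of transformers. This is only a sketch: the exact .gguf filenames in the repo are unknown, so the wildcard pattern below is an assumption — replace it with the actual file you want.

```python
def load_gguf(repo_id: str = "zpm/Llama-3.1-PersianQA", filename: str = "*.gguf"):
    """Download and load one of the repo's quantized .gguf files.

    Requires a separate install: pip install llama-cpp-python
    `filename` is a glob pattern (an assumption here); pick a concrete
    quantization file from the repo's file list if the pattern matches
    more than one file.
    """
    # Imported lazily so this sketch can be defined without llama-cpp-python installed.
    from llama_cpp import Llama

    # Llama.from_pretrained fetches the matching file from the Hugging Face Hub.
    return Llama.from_pretrained(repo_id=repo_id, filename=filename)
```

With a loaded model you would then prompt it as a chat/completion model, not through the transformers question-answering pipeline.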

I'm wondering if this is an adapter (e.g. LoRA) that needs to be loaded on top of a specific base model, or whether there's another way to use the available adapter_model.safetensors file. I'd appreciate any guidance on how to load and use this model properly, or on which base model I should pair it with.
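An adapter_model.safetensors file is indeed what PEFT saves for a LoRA-style adapter, and it is loaded on top of a full base model rather than on its own. A minimal sketch with the peft library, assuming the adapter's adapter_config.json names the base it was trained on (the "base_model_name_or_path" field — check the repo rather than guessing):

```python
def load_with_adapter(base_model_id: str,
                      adapter_id: str = "zpm/Llama-3.1-PersianQA"):
    """Attach the repo's LoRA adapter to a full base model.

    base_model_id must match the base the adapter was trained on; read it
    from the adapter repo's adapter_config.json ("base_model_name_or_path").
    Requires: pip install transformers peft
    """
    # Imported lazily so this sketch can be defined without the libraries installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained(base_model_id)
    base = AutoModelForCausalLM.from_pretrained(base_model_id)
    # PeftModel.from_pretrained reads adapter_model.safetensors from the adapter repo.
    model = PeftModel.from_pretrained(base, adapter_id)
    return model, tokenizer
```

Note also that a Llama model is a causal language model, so even once loaded it would be used with the "text-generation" pipeline (prompting it with the context and question), not the extractive "question-answering" pipeline from the original snippet.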

Thanks in advance for any help or suggestions!
