Meta-Llama-3-8B-Instruct does not appear to have a file named config.json

I downloaded the model using:

huggingface-cli download meta-llama/Meta-Llama-3-8B-Instruct --include "original/*" --local-dir Meta-Llama-3-8B-Instruct

and I am getting the error below:

Meta-Llama-3-8B-Instruct does not appear to have a file named config.json

What am I missing?
same problem
Hi! I faced the same issue: after downloading the model, I got the same error. I suggest looking carefully at the name of the file. For some reason the file appeared on my machine as config(1).json, and that "(1)" caused some issues. I deleted the "(1)" and that solved the problem. (Pay attention not to leave any space between "config" and the ".".) Let me know...
import transformers
import torch

model_path = "./"  # replace with the actual path to the model directory
model_id = "Meta-Llama-3-8B-Instruct-Q4_K_M"

# Load the model from the local path
model = transformers.AutoModelForCausalLM.from_pretrained(model_path)

# Create the pipeline
pipeline = transformers.pipeline(
    "text-generation", model=model, model_kwargs={"torch_dtype": torch.bfloat16}, device_map="auto"
)

# Test the pipeline
output = pipeline("hi")
print(output)

This gives the following error after the config is loaded:

OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory ./.
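That error usually means the local folder holds a GGUF/quantized file (like the Q4_K_M file named above) rather than Transformers-format weights, which plain from_pretrained cannot load. A minimal sketch for checking the folder before loading — the weight file names come from the error message plus the newer safetensors names, and `is_transformers_dir` is a hypothetical helper, not a library function:

```python
from pathlib import Path

# Weight file names from the error message, plus the newer safetensors names.
WEIGHT_FILES = (
    "model.safetensors",
    "model.safetensors.index.json",
    "pytorch_model.bin",
    "pytorch_model.bin.index.json",
    "tf_model.h5",
    "model.ckpt.index",
    "flax_model.msgpack",
)

def is_transformers_dir(path):
    """Return True if `path` looks like a Transformers-format model folder."""
    p = Path(path)
    return (p / "config.json").is_file() and any(
        (p / name).is_file() for name in WEIGHT_FILES
    )
```

If this returns False for your download directory, re-downloading the Transformers-format files from the repo (not only original/* or a GGUF file) should fix it.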
I downloaded this model:

import transformers
import torch

model_id = "meta-llama/Meta-Llama-3-8B"
pipeline = transformers.pipeline(
    "text-generation", model=model_id, model_kwargs={"torch_dtype": torch.bfloat16}, device_map="auto"
)
pipeline("hi")

Why does it crash and not give a response? I run it on Colab.
https://huggingface.co./meta-llama/Meta-Llama-3-8B-Instruct/tree/main — this is the right link; go there and download all the files one by one.
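The likely root cause here is the --include "original/*" filter in the download command: it fetches only Meta's original consolidated checkpoint, which has no config.json, so transformers cannot load the folder. Instead of downloading files one by one, snapshot_download can grab the whole repo — a sketch, assuming you are logged in and have accepted the model license (the wrapper function is just for illustration):

```python
from huggingface_hub import snapshot_download

def download_full_repo(repo_id="meta-llama/Meta-Llama-3-8B-Instruct",
                       local_dir="Meta-Llama-3-8B-Instruct"):
    # Downloads every file in the repo, including config.json and the
    # safetensors shards that transformers expects. The repo is gated,
    # so this needs license acceptance and a logged-in token.
    return snapshot_download(repo_id=repo_id, local_dir=local_dir)
```

Be aware this pulls both the Transformers-format shards and the original/* checkpoint, so it is a large download.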
Hi, I verified this is not the issue for me. Thanks.
I am also facing the same issue... can anyone help me resolve it? I've tried everything!
Hi. Resolved:
https://colab.research.google.com/drive/1BDglz53-s9Cs9rBR4SarJ0TTZeXnahdb
--------------------------------- ~/.profile
export HF_TOKEN=''
export HUGGING_FACE_HUB_TOKEN=''
# both set to the same write token (the one created after being granted access)
-------------------Shell
source ~/.profile
huggingface-cli login
huggingface-cli download meta-llama/Meta-Llama-3-8B-Instruct --include "original/*" --local-dir Meta-Llama-3-8B-Instruct
------------------------------- test.py
import os

import torch
import transformers
from huggingface_hub import login

HF_TOKEN = os.getenv('HF_TOKEN')
login(token=HF_TOKEN)

model_id = 'meta-llama/Meta-Llama-3-8B-Instruct'
pipeline = transformers.pipeline(
    "text-generation", model=model_id, token=HF_TOKEN, model_kwargs={"torch_dtype": torch.bfloat16}, device_map="auto"
)
pipeline("Hey how are you doing today?")