Error Loading the original model file consolidated.00.pth from local

#60
by chanduvkp - opened

Hello,

I have downloaded the model "meta-llama/Meta-Llama-3.1-8B-Instruct" using huggingface-cli (huggingface-cli download meta-llama/Meta-Llama-3.1-8B-Instruct --include "original/*" --local-dir Meta-Llama-3.1-8B-Instruct). Once downloaded, I have the following 3 files under the "./Meta-Llama-3.1-8B-Instruct/original" directory:

(.venv) ➜ original ll
total 31375544
-rw-r--r--@ 1 cvenigalla staff 15G Jul 29 07:54 consolidated.00.pth
-rw-r--r--@ 1 cvenigalla staff 199B Jul 29 07:20 params.json
-rw-r--r--@ 1 cvenigalla staff 2.1M Jul 29 07:20 tokenizer.model

I am loading the model from the local path with the pipeline code below, and I am getting the error shown after it. Your help is much appreciated.

Code: The code below uses the model path "/Users/xx/Meta-Llama-3.1-8B-Instruct/original/"

import torch
import transformers

pipeline = transformers.pipeline(
    "text-generation",
    model="/Users/xx/Meta-Llama-3.1-8B-Instruct/original/",
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)
Error:
OSError: /Users/xx/Meta-Llama-3.1-8B-Instruct/original/ does not appear to have a file named config.json. Checkout 'https://huggingface.co.//Users/xx/Meta-Llama-3.1-8B-Instruct/original//tree/None' for available files.
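(For context: the original/ subfolder holds Meta's native checkpoint files — consolidated.00.pth, params.json, tokenizer.model — while transformers.pipeline expects the Hugging Face-format files, config.json plus the weight shards, which sit at the download root. A quick check, reusing the placeholder path from the snippet above:)

import os

# The pipeline needs config.json; the original/ subfolder does not contain it.
print(os.listdir("/Users/xx/Meta-Llama-3.1-8B-Instruct/original/"))
# ['consolidated.00.pth', 'params.json', 'tokenizer.model']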

Code: In the code below, I have changed the model path to the relative "./Meta-Llama-3.1-8B-Instruct/original/"

pipeline = transformers.pipeline(
    "text-generation",
    model="./Meta-Llama-3.1-8B-Instruct/original/",
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

Error:
OSError: Incorrect path_or_model_id: '.Meta-Llama-3.1-8B-Instruct/original/'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
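(Worth noting: the error echoes '.Meta-Llama-3.1-8B-Instruct/original/' without the leading './', so the string transformers actually received may not match the snippet above. transformers only treats the argument as a local path when the directory exists; otherwise it falls back to interpreting it as a Hub repo_id, which produces exactly this error. A quick check, using the relative path from the snippet:)

import os

# If this prints False, transformers will treat the string as a Hub repo_id
# rather than a local folder, yielding the path_or_model_id error above.
print(os.path.isdir("./Meta-Llama-3.1-8B-Instruct/original/"))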

Best regards,
Chandu


I have the same problem. Has it been solved?

Not yet. I have downloaded without --include (huggingface-cli download meta-llama/Meta-Llama-3.1-8B-Instruct --local-dir Meta-Llama-3.1-8B-Instruct) and pointed to the directory where it was downloaded ("/Meta-Llama-3.1-8B-Instruct/"), but now I am getting a different error: "TypeError: MPS BFloat16 is only supported on MacOS 14 or newer"

This is on a MacBook Pro with an M1 Max. It looks like torch_dtype=torch.bfloat16 is not supported on Apple silicon, at least before macOS 14.
What would be the right dtype for Apple silicon?
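A minimal sketch of one workaround, assuming macOS 13 on Apple silicon: fall back to float16, which the MPS backend supports, since bfloat16 on MPS requires macOS 14 or newer (directory name as in the download step above):

import torch
import transformers

# float16 instead of bfloat16: MPS supports bfloat16 only on macOS 14+.
# Point at the download root (it contains config.json), not original/.
pipeline = transformers.pipeline(
    "text-generation",
    model="./Meta-Llama-3.1-8B-Instruct",
    model_kwargs={"torch_dtype": torch.float16},
    device_map="auto",
)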
