Problem loading model

#1 opened by cnmoro

from transformers import AutoTokenizer, LlamaForCausalLM
model_id = "tangledgroup/tangled-llama-33m-32k-instruct-v0.1"
model = LlamaForCausalLM.from_pretrained(model_id)

I get:

AttributeError Traceback (most recent call last)
File /mnt/e/tests/test.py:3
1 from transformers import AutoTokenizer, LlamaForCausalLM
2 model_id = "tangledgroup/tangled-llama-33m-32k-instruct-v0.1"
----> 3 model = LlamaForCausalLM.from_pretrained(model_id)

File ~/miniconda3/envs/llms/lib/python3.11/site-packages/transformers/modeling_utils.py:3738, in PreTrainedModel.from_pretrained(cls, pretrained_model_name_or_path, config, cache_dir, ignore_mismatched_sizes, force_download, local_files_only, token, revision, use_safetensors, *model_args, **kwargs)
   3735     with safe_open(resolved_archive_file, framework="pt") as f:
   3736         metadata = f.metadata()
-> 3738     if metadata.get("format") == "pt":
   3739         pass
   3740     elif metadata.get("format") == "tf":
AttributeError: 'NoneType' object has no attribute 'get'

Any idea why? I tried transformers version 4.44.2, which is the same version listed in the config.json file.
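
For what it's worth, one way to confirm that the safetensors header metadata is actually missing (just a guess based on where the traceback fails; the weight file name "model.safetensors" is an assumption about the repo layout):

from huggingface_hub import hf_hub_download
from safetensors import safe_open

# download only the weight file; "model.safetensors" is assumed to be its name in the repo
path = hf_hub_download("tangledgroup/tangled-llama-33m-32k-instruct-v0.1", "model.safetensors")

with safe_open(path, framework="pt") as f:
    print(f.metadata())  # None here means the header carries no "format" entry

If that prints None, transformers has nothing to call .get("format") on, which is exactly the AttributeError above.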

Well, I fixed it manually and uploaded it to

https://huggingface.co./cnmoro/tangled-llama-33m-32k-instruct-v0.1-fix
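
For reference, a minimal sketch of what such a manual fix can look like, assuming the root cause is that the weight file was saved without the {"format": "pt"} entry in its safetensors header (the local path below is a placeholder for wherever the repo was cloned):

from safetensors.torch import load_file, save_file

# local copy of the checkpoint, e.g. from git clone or huggingface_hub.snapshot_download;
# the file name "model.safetensors" is an assumption about the repo layout
weights_path = "tangled-llama-33m-32k-instruct-v0.1/model.safetensors"

tensors = load_file(weights_path)  # dict of tensor name -> torch.Tensor
save_file(tensors, weights_path, metadata={"format": "pt"})  # rewrite in place with the header transformers checks for

After re-saving, from_pretrained on the local folder (or the re-uploaded repo) should get past the metadata check.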

cnmoro changed discussion status to closed
