Text Generation · Transformers · English · alpaca · bloom · LLM

Getting a size mismatch error while loading the PEFT model

#7
by deepak-banka - opened

RuntimeError: Error(s) in loading state_dict for PeftModelForCausalLM:
size mismatch for base_model.model.transformer.h.0.self_attention.query_key_value.lora_A.default.weight: copying a param with shape torch.Size([32, 4096]) from checkpoint, the shape in current model is torch.Size([16, 4096]).
size mismatch for base_model.model.transformer.h.0.self_attention.query_key_value.lora_B.default.weight: copying a param with shape torch.Size([8192, 16, 1]) from checkpoint, the shape in current model is torch.Size([12288, 16]).
size mismatch for base_model.model.transformer.h.1.self_attention.query_key_value.lora_A.default.weight: copying a param with shape torch.Size([32, 4096]) from checkpoint, the shape in current model is torch.Size([16, 4096]).
size mismatch for base_model.model.transformer.h.1.self_attention.query_key_value.lora_B.default.weight: copying a param with shape torch.Size([8192, 16, 1]) from checkpoint, the shape in current model is torch.Size([12288, 16]).
size mismatch for base_model.model.transformer.h.2.self_attention.query_key_value.lora_A.default.weight: copying a param
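A mismatch like this usually means the adapter checkpoint was trained with different LoRA dimensions than the adapter being created at load time: here the checkpoint's lora_A has rank 32 while the current model expects rank 16, and the lora_B output sizes (8192 vs. 12288) also disagree, which points to a different rank and/or a different base model at training time. A minimal sketch of one common remedy, letting PeftModel.from_pretrained read the saved adapter_config.json instead of re-declaring a LoraConfig by hand (the base-model name and adapter path below are placeholders, not taken from this thread):

```python
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Placeholders: use the same base checkpoint the adapter was actually trained on.
BASE_MODEL = "bigscience/bloom-7b1"
ADAPTER_DIR = "path/to/lora-adapter"  # must contain adapter_config.json + adapter weights

base = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL,
    torch_dtype=torch.float16,
    device_map="auto",
)

# PeftModel.from_pretrained reads adapter_config.json, so r, lora_alpha, and
# target_modules match the checkpoint; re-creating a LoraConfig by hand with a
# different r (e.g. 16 instead of 32) reproduces exactly this size mismatch.
model = PeftModel.from_pretrained(base, ADAPTER_DIR)
model.eval()
```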

deepak-banka changed discussion status to closed
