ValueError: ## Could not find model.layers.0.mlp.down_proj.* in model

#1
by MrHillsss - opened

Getting an error while loading

Hi, I am experiencing a similar error while loading a model: ValueError: Unrecognized layer: model.layers.0.block_sparse_moe.experts.0.w1.bias

Same here. I tried group sizes of 32, 128, etc., and got the same error.
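If it helps with debugging: Mixtral layers don't contain an mlp.down_proj at all. Each decoder layer carries a block_sparse_moe module with a gate plus experts.N.w1/w2/w3, so any loader that pattern-matches the Llama-style MLP names finds nothing. A small illustration (the tensor names below are typed out for the example, not read from a real checkpoint):

```python
# Why a Llama-style loader fails on Mixtral: it searches for
# "mlp.down_proj" keys, but Mixtral replaces the dense MLP with a
# sparse MoE block (block_sparse_moe.gate + experts.N.w1/w2/w3).
import fnmatch

# Illustrative tensor names in the Mixtral layout (hypothetical sample,
# not loaded from an actual state dict).
mixtral_keys = [
    "model.layers.0.block_sparse_moe.gate.weight",
    "model.layers.0.block_sparse_moe.experts.0.w1.weight",
    "model.layers.0.block_sparse_moe.experts.0.w2.weight",
    "model.layers.0.block_sparse_moe.experts.0.w3.weight",
]

def find(pattern, keys):
    """Glob-match tensor names the way a loader might."""
    return fnmatch.filter(keys, pattern)

# The Llama-style pattern matches nothing, hence the ValueError.
print(find("model.layers.0.mlp.down_proj.*", mixtral_keys))            # []
# The weights actually live under the MoE expert names.
print(find("model.layers.0.block_sparse_moe.experts.*", mixtral_keys))
```

So the fix is on the loader side (it needs Mixtral-aware layer mapping), not in the group size.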

Hmm, I tried loading it with AutoGPTQ; it reported that it loaded into my 24 GB of VRAM, but on the first response I got this error:
RuntimeError: cannot reshape tensor of 0 elements into shape [-1, 1, 0] because the unspecified dimension size -1 can be any value and is ambiguous

I don't know what I'm doing wrong.

Traceback (most recent call last):
  File "D:\booga\text-generation-webui\modules\callbacks.py", line 57, in gentask
    ret = self.mfunc(callback=_callback, *args, **self.kwargs)
  File "D:\booga\text-generation-webui\modules\text_generation.py", line 351, in generate_with_callback
    shared.model.generate(**kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\transformers\generation\utils.py", line 1764, in generate
    return self.sample(
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\transformers\generation\utils.py", line 2861, in sample
    outputs = self(
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\transformers\models\mixtral\modeling_mixtral.py", line 1213, in forward
    outputs = self.model(
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\transformers\models\mixtral\modeling_mixtral.py", line 1081, in forward
    layer_outputs = decoder_layer(
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\transformers\models\mixtral\modeling_mixtral.py", line 810, in forward
    hidden_states, router_logits = self.block_sparse_moe(hidden_states)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\transformers\models\mixtral\modeling_mixtral.py", line 708, in forward
    router_logits = self.gate(hidden_states)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\booga\text-generation-webui\installer_files\env\lib\site-packages\auto_gptq\nn_modules\qlinear\qlinear_cuda_old.py", line 239, in forward
    zeros = zeros.reshape(-1, 1, zeros.shape[1] * zeros.shape[2])
RuntimeError: cannot reshape tensor of 0 elements into shape [-1, 1, 0] because the unspecified dimension size -1 can be any value and is ambiguous
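The reshape failure itself is mechanical: the traceback shows the quantized wrapper around the router gate reshaping its zeros buffer, and that buffer has 0 elements, so `reshape(-1, 1, 0)` cannot infer the `-1` dimension (0 == k * 0 holds for every k). My guess, which you'd want to verify against your checkpoint, is that the Mixtral router gate was left unquantized at quantization time but still got wrapped in a QuantLinear at load time, leaving it with an empty qzeros tensor. A minimal plain-Python sketch of the `-1` inference rule (`infer_reshape` is a hypothetical helper mimicking the behavior, not torch's actual code):

```python
# Sketch of reshape's "-1" inference: the unknown dimension is
# numel // product(known dims). When the known dims multiply to 0
# (as in [-1, 1, 0]) and numel is 0, any value satisfies the equation,
# so the dimension is ambiguous -- the same failure as in the traceback.
from math import prod

def infer_reshape(numel, shape):
    """Return the concrete shape, inferring at most one -1 dimension."""
    known = prod(d for d in shape if d != -1)
    if -1 not in shape:
        if known != numel:
            raise ValueError("shape incompatible with number of elements")
        return tuple(shape)
    if known == 0:
        # numel (0) == k * 0 for every k, so -1 cannot be determined
        raise ValueError(
            f"cannot reshape tensor of {numel} elements into shape "
            f"{list(shape)}: the unspecified dimension is ambiguous"
        )
    if numel % known:
        raise ValueError("shape incompatible with number of elements")
    return tuple(numel // known if d == -1 else d for d in shape)

print(infer_reshape(6, (-1, 1, 2)))  # a normal case infers fine
try:
    infer_reshape(0, (-1, 1, 0))     # the empty-zeros case from the traceback
except ValueError as e:
    print("raises:", e)
```

In other words, the message is a symptom of an empty quantization buffer reaching the kernel, not of anything in the prompt or generation settings.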
