How to solve this error

#64
by vinayakarsh - opened

Unsupported: call_method UserDefinedObjectVariable(Params4bit) t [] {}

from user code:
File "/usr/local/lib/python3.11/dist-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
return func(*args, **kwargs)
File "/usr/local/lib/python3.11/dist-packages/transformers/models/gemma2/modeling_gemma2.py", line 887, in forward
outputs = self.model(
File "/usr/local/lib/python3.11/dist-packages/transformers/models/gemma2/modeling_gemma2.py", line 667, in forward
layer_outputs = decoder_layer(
File "/usr/local/lib/python3.11/dist-packages/transformers/models/gemma2/modeling_gemma2.py", line 321, in forward
hidden_states, self_attn_weights = self.self_attn(
File "/usr/local/lib/python3.11/dist-packages/transformers/models/gemma2/modeling_gemma2.py", line 216, in forward
query_states = self.q_proj(hidden_states).view(hidden_shape).transpose(1, 2)
File "/usr/local/lib/python3.11/dist-packages/bitsandbytes/nn/modules.py", line 484, in forward
return bnb.matmul_4bit(x, self.weight.t(), bias=bias, quant_state=self.weight.quant_state).to(inp_dtype)

Set TORCH_LOGS="+dynamo" and TORCHDYNAMO_VERBOSE=1 for more information

I'm using bitsandbytes quantisation:

import torch
from transformers import BitsAndBytesConfig

quantization_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16
)
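
For context, a minimal sketch of how this config is typically passed to from_pretrained (the model id below is a placeholder, not taken from this thread):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2-9b-it"  # placeholder; substitute the actual Gemma 2 checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quantization_config,  # the BitsAndBytesConfig defined above
    device_map="auto",
)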

Google org

Hi @vinayakarsh ,

You're getting this error because TorchDynamo is trying to trace and optimize the computation graph, but the bitsandbytes 4-bit quantized layers (Params4bit) are not fully supported by TorchDynamo. To resolve it, please disable TorchDynamo using the following code.

  import torch._dynamo

  # Fall back to eager execution instead of raising when Dynamo hits unsupported code
  torch._dynamo.config.suppress_errors = True

  # torch._dynamo.disable can also be applied as a decorator to keep specific functions out of Dynamo
  torch._dynamo.disable()
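
For reference, a minimal sketch of how these lines might sit around generation (the decorator usage, the model/tokenizer objects, and the generate arguments are assumptions, not part of the original answer):

  import torch
  import torch._dynamo

  # Fall back to eager execution when Dynamo hits unsupported code such as Params4bit
  torch._dynamo.config.suppress_errors = True

  @torch._dynamo.disable  # keep this whole call path out of Dynamo
  def generate_text(model, tokenizer, prompt):
      inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
      with torch.no_grad():
          output_ids = model.generate(**inputs, max_new_tokens=64)
      return tokenizer.decode(output_ids[0], skip_special_tokens=True)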

With this workaround in place, the code executed successfully in Google Colab on a T4 GPU runtime. You can check the details in the linked gist, where I have also listed the library versions used.

Thank you.

Thanks for the help... I tried disabling TorchDynamo using the code above, but it still returns the error below:
"""
Unsupported: call_method UserDefinedObjectVariable(Params4bit) t [] {}

from user code:
File "/usr/local/lib/python3.11/dist-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
return func(*args, **kwargs)
File "/usr/local/lib/python3.11/dist-packages/transformers/models/gemma2/modeling_gemma2.py", line 887, in forward
outputs = self.model(
File "/usr/local/lib/python3.11/dist-packages/transformers/models/gemma2/modeling_gemma2.py", line 667, in forward
layer_outputs = decoder_layer(
File "/usr/local/lib/python3.11/dist-packages/transformers/models/gemma2/modeling_gemma2.py", line 321, in forward
hidden_states, self_attn_weights = self.self_attn(
File "/usr/local/lib/python3.11/dist-packages/transformers/models/gemma2/modeling_gemma2.py", line 216, in forward
query_states = self.q_proj(hidden_states).view(hidden_shape).transpose(1, 2)
File "/usr/local/lib/python3.11/dist-packages/bitsandbytes/nn/modules.py", line 484, in forward
return bnb.matmul_4bit(x, self.weight.t(), bias=bias, quant_state=self.weight.quant_state).to(inp_dtype)

Set TORCH_LOGS="+dynamo" and TORCHDYNAMO_VERBOSE=1 for more information
"""

I had set those environment variables using the code below:
"""
import os
os.environ["TORCH_LOGS"] = "+dynamo"
os.environ["TORCHDYNAMO_VERBOSE"] = "1"
"""

It still returns the same error...
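
(Side note: these two variables only make the Dynamo failure more verbose; they do not work around it, and they are generally read when torch and Dynamo are first imported. A minimal sketch of setting them early enough, assuming a fresh Python process:)

import os

# Set the logging variables before torch is imported so Dynamo picks them up
os.environ["TORCH_LOGS"] = "+dynamo"
os.environ["TORCHDYNAMO_VERBOSE"] = "1"

import torch  # import torch only after the environment is configured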

I'm facing the same problem... how can it be solved?

Install gcc and point the CC environment variable at it:

apt install gcc

Then add export CC=/usr/bin/gcc to your .bashrc.
