AutoModelForQuestionAnswering: ValueError: too many values to unpack (expected 2)
#2
by BlueRey
When I initialize AutoModelForQuestionAnswering from the pretrained indobenchmark/indobert-base-p1 checkpoint, it doesn't produce any outputs and raises a ValueError. Below is the code:
from transformers import AutoModelForQuestionAnswering, AutoTokenizer
model = AutoModelForQuestionAnswering.from_pretrained('indobenchmark/indobert-base-p1')
tokenizer = AutoTokenizer.from_pretrained('indobenchmark/indobert-base-p1')
context = 'Sindrom Bazex: acrokeratosis paraneoplastica.'
question = 'Nama sinonim dari Acrokeratosis paraneoplastica.'
inputs = tokenizer(question, context, return_tensors='pt')
outputs = model(**inputs)
inputs, outputs
The error shown is:
ValueError                                Traceback (most recent call last)
/tmp/ipykernel_28/2363476019.py in <module>
      3
      4 inputs = tokenizer(question, context, return_tensors='pt')
----> 5 outputs = model(**inputs)
      6
      7 inputs, outputs

/opt/conda/lib/python3.7/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
   1128         if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1129                 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1130             return forward_call(*input, **kwargs)
   1131         # Do not call functions when jit is used
   1132         full_backward_hooks, non_full_backward_hooks = [], []

/opt/conda/lib/python3.7/site-packages/transformers/models/bert/modeling_bert.py in forward(self, input_ids, attention_mask, token_type_ids, position_ids, head_mask, inputs_embeds, start_positions, end_positions, output_attentions, output_hidden_states, return_dict)
   1860
   1861         logits = self.qa_outputs(sequence_output)
-> 1862         start_logits, end_logits = logits.split(1, dim=-1)
   1863         start_logits = start_logits.squeeze(-1).contiguous()
   1864         end_logits = end_logits.squeeze(-1).contiguous()

ValueError: too many values to unpack (expected 2)
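Looking at the traceback, the failing line unpacks the output of qa_outputs into exactly two tensors (the start and end logits), and in modeling_bert.py qa_outputs is a Linear layer whose output width comes from config.num_labels. So my guess is that this checkpoint's config sets num_labels to something other than 2. A minimal sketch to check, just inspecting the config:

from transformers import AutoConfig

# qa_outputs is nn.Linear(hidden_size, config.num_labels), and its output
# can only be unpacked into start/end logits when num_labels == 2
config = AutoConfig.from_pretrained('indobenchmark/indobert-base-p1')
print(config.num_labels)  # anything other than 2 would explain the error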
But when I initialize the model from a different repo, i.e. indolem/indobert-base-uncased, it runs with no problem. I only changed the from_pretrained value.
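If the config really is the cause, overriding the label count at load time might work around it; this is an untested guess on my part, not a confirmed fix:

from transformers import AutoModelForQuestionAnswering

# Hypothetical workaround: force the QA head down to two labels
model = AutoModelForQuestionAnswering.from_pretrained(
    'indobenchmark/indobert-base-p1', num_labels=2)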
Has this ever happened to anyone else? Is it a problem with the model I'm trying to load, or is there a specific version of the Hugging Face libraries that I need to use? Any help would be appreciated, thanks!