Change max_position_embeddings to 512
#21
by vkehfdl1 · opened
When embedding in a CUDA-enabled environment, a device-side assertion error occurs when I try to encode an input longer than 512 positions. It looks like the config.json is wrong; the real max_position_embeddings is 512.
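For anyone hitting the same assert, a minimal workaround sketch (assuming the plain transformers API and the 512-token cap reported above) is to truncate at tokenization time:

```python
from transformers import AutoModel, AutoTokenizer

model_id = "sentence-transformers/all-mpnet-base-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Truncating to 512 tokens keeps position ids inside the embedding
# table, which avoids the device-side assertion on CUDA.
inputs = tokenizer(
    "some very long text ...",
    truncation=True,
    max_length=512,
    return_tensors="pt",
)
outputs = model(**inputs)
```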
I think you might be right; 512 seems more reasonable. However, you should limit the sequence length to 384, as defined here: https://huggingface.co./sentence-transformers/all-mpnet-base-v2/blob/main/sentence_bert_config.json#L2
This is what the model was trained for, and what it should perform best with.
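If you load the model through SentenceTransformers, that 384-token limit is applied automatically; a quick sketch to confirm it (and set it explicitly, if you want to be safe):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

# Loaded from sentence_bert_config.json; should print 384.
print(model.max_seq_length)

# Longer inputs are truncated to max_seq_length before encoding,
# so no CUDA-side assert is triggered.
embedding = model.encode("a very long document ...")
print(embedding.shape)  # (768,)
```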