To resolve the warning from sentence_transformers, add four config files.

#3
by pe65374 - opened

Four config files are needed, including a config.json under the 1_Pooling directory, as referenced in modules.json.
With those four files placed under sbert-base-chinese-nli/, the warning "WARNING - No sentence-transformers model found with name /Users/peter42/.cache/torch/sentence_transformers/uer_sbert-base-chinese-nli. Creating a new one with MEAN pooling." goes away.
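For reference, here is a minimal sketch of what two of those files typically contain in the standard sentence-transformers layout. The exact values (the 768 embedding dimension, mean pooling, module paths) are assumptions matching a BERT-base model with MEAN pooling; check them against an existing sentence-transformers repo before copying.

modules.json in the model root:

```json
[
  {"idx": 0, "name": "0", "path": "",          "type": "sentence_transformers.models.Transformer"},
  {"idx": 1, "name": "1", "path": "1_Pooling", "type": "sentence_transformers.models.Pooling"}
]
```

1_Pooling/config.json:

```json
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false
}
```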
The cause of the warning is visible in the source code at https://github.com/UKPLab/sentence-transformers

If modules.json exists, the standard sentence_transformers load path is used. Otherwise the model is loaded as a plain Hugging Face AutoModel/AutoTokenizer with mean pooling, and the warning is emitted.
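The dispatch logic described above can be sketched like this (a simplified stand-in for what the library does, not its actual code; the function name `load_strategy` is made up for illustration):

```python
import os


def load_strategy(model_dir: str) -> str:
    """Mimic sentence-transformers' load dispatch: a directory that
    contains modules.json is loaded as a native SentenceTransformer
    model; anything else falls back to a plain Hugging Face
    AutoModel/AutoTokenizer with MEAN pooling, which is when the
    'No sentence-transformers model found' warning appears."""
    if os.path.exists(os.path.join(model_dir, "modules.json")):
        return "sentence_transformers native load"
    return "AutoModel + mean pooling (warning emitted)"
```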

I tried your method, but the model does not load; I get a new error: RuntimeError: Error(s) in loading state_dict for BertModel:
size mismatch for embeddings.word_embeddings.weight: copying a param with shape torch.Size([21128, 768]) from checkpoint, the shape in current model is torch.Size([30522, 768]).
You may consider adding ignore_mismatched_sizes=True in the model from_pretrained method.

What I did:
1. I added three JSON files under sbert-base-chinese-nli/, pasting their content from your "Files changed".
2. I changed config.json, replacing its 22 lines with the 6 lines I also pasted from "Files changed".

Did I do anything wrong? Thanks.


It turns out you need to keep the original config.json in the model's root directory and add a separate 1_Pooling directory containing the modified config.json. (Overwriting the root config.json drops fields such as vocab_size, so transformers falls back to the default 30522-token English vocabulary instead of the 21128-token Chinese one, which causes the size-mismatch error above.) Take a look at this example: https://huggingface.co./sentence-transformers/multi-qa-distilbert-cos-v1/tree/main
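As a quick sanity check, the layout described above can be verified with a small helper. This is a sketch, not part of sentence-transformers; `layout_ok` is a made-up name, and the file list assumes the standard layout (the extra sentence_bert_config.json is typical but not strictly required by every model):

```python
import os

# Files expected in a standard sentence-transformers model directory.
EXPECTED_FILES = (
    "config.json",                # original HF transformer config, kept in the root
    "modules.json",               # tells sentence-transformers how to assemble the model
    "sentence_bert_config.json",  # max_seq_length etc.
    os.path.join("1_Pooling", "config.json"),  # pooling config lives here, not in the root
)


def layout_ok(model_dir: str) -> bool:
    """Return True if every expected config file is present."""
    return all(os.path.exists(os.path.join(model_dir, f)) for f in EXPECTED_FILES)
```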
