Disable trust_remote_code
Hey,
Thanks for the awesome model. We would like to take it for a spin without trust_remote_code. I have downloaded both the actual model I am trying out (Alibaba-NLP/gte-large-en-v1.5) and the modelling files from Alibaba-NLP/new-impl to local storage. Is there a way to update the config in the gte-large model to avoid trust_remote_code=True, since the network call it triggers is blocked inside our company network?
Thanks.
Hi, you can move the modeling.py and configuration.py into the Alibaba-NLP/gte-large-en-v1.5 folder, and substitute the following lines in config.json
"auto_map": {
"AutoConfig": "Alibaba-NLP/new-impl--configuration.NewConfig",
"AutoModel": "Alibaba-NLP/new-impl--modeling.NewModel",
"AutoModelForMaskedLM": "Alibaba-NLP/new-impl--modeling.NewForMaskedLM",
"AutoModelForMultipleChoice": "Alibaba-NLP/new-impl--modeling.NewForMultipleChoice",
"AutoModelForQuestionAnswering": "Alibaba-NLP/new-impl--modeling.NewForQuestionAnswering",
"AutoModelForSequenceClassification": "Alibaba-NLP/new-impl--modeling.NewForSequenceClassification",
"AutoModelForTokenClassification": "Alibaba-NLP/new-impl--modeling.NewForTokenClassification"
},
with
"auto_map": {
"AutoConfig": "configuration.NewConfig",
"AutoModel": "modeling.NewModel",
"AutoModelForMaskedLM": "modeling.NewForMaskedLM",
"AutoModelForMultipleChoice": "modeling.NewForMultipleChoice",
"AutoModelForQuestionAnswering": "modeling.NewForQuestionAnswering",
"AutoModelForSequenceClassification": "modeling.NewForSequenceClassification",
"AutoModelForTokenClassification": "modeling.NewForTokenClassification"
},
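If you prefer to script this edit, here is a minimal sketch (the helper name `localize_auto_map` is my own, and it assumes config.json sits in your local model folder) that strips the `Alibaba-NLP/new-impl--` repo prefix from every auto_map entry:

```python
import json

def localize_auto_map(config_path: str) -> None:
    """Rewrite auto_map so it points at local modeling/configuration files.

    Strips the 'Alibaba-NLP/new-impl--' repo prefix from each entry, so
    AutoConfig/AutoModel resolve modeling.py and configuration.py in the
    model folder instead of fetching them from the Hub.
    """
    with open(config_path) as f:
        cfg = json.load(f)
    # "Alibaba-NLP/new-impl--configuration.NewConfig" -> "configuration.NewConfig"
    cfg["auto_map"] = {
        key: ref.split("--")[-1] for key, ref in cfg["auto_map"].items()
    }
    with open(config_path, "w") as f:
        json.dump(cfg, f, indent=2)
```

Run it once against ./Alibaba-NLP/gte-large-en-v1.5/config.json after copying the two .py files into that folder.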
Excuse me, I tried the method mentioned above and set trust_remote_code to False, but I still couldn't run the sentence-transformers example code successfully. Is any additional adjustment needed to avoid the ValueError? (I have downloaded the model to './Alibaba-NLP/gte-large-en-v1.5'.)
Code:
# Requires sentence_transformers>=2.7.0
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim
sentences = ['That is a happy person', 'That is a very happy person']
model = SentenceTransformer('Alibaba-NLP/gte-large-en-v1.5', trust_remote_code=False)
embeddings = model.encode(sentences)
print(cos_sim(embeddings[0], embeddings[1]))
Error message:
Traceback (most recent call last):
File "/home/jiangxuehaokeai/test_embedding_model/test.py", line 8, in <module>
model = SentenceTransformer('Alibaba-NLP/gte-large-en-v1.5', trust_remote_code=False)
File "/home/jiangxuehaokeai/.local/lib/python3.10/site-packages/sentence_transformers/SentenceTransformer.py", line 197, in __init__
modules = self._load_sbert_model(
File "/home/jiangxuehaokeai/.local/lib/python3.10/site-packages/sentence_transformers/SentenceTransformer.py", line 1296, in _load_sbert_model
module = Transformer(model_name_or_path, cache_dir=cache_folder, **kwargs)
File "/home/jiangxuehaokeai/.local/lib/python3.10/site-packages/sentence_transformers/models/Transformer.py", line 35, in __init__
config = AutoConfig.from_pretrained(model_name_or_path, **model_args, cache_dir=cache_dir)
File "/home/jiangxuehaokeai/.local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1114, in from_pretrained
trust_remote_code = resolve_trust_remote_code(
File "/home/jiangxuehaokeai/.local/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 621, in resolve_trust_remote_code
raise ValueError(
ValueError: Loading Alibaba-NLP/gte-large-en-v1.5 requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option `trust_remote_code=True` to remove this error.
@jiangxuehaokeai you could try passing a local path to SentenceTransformer, e.g. model = SentenceTransformer('./Alibaba-NLP/gte-large-en-v1.5', trust_remote_code=False).
Thank you for your prompt response, but unfortunately, the same issue still persists. QQ
We have the same issue. Please let us know in case you found a solution, @jiangxuehaokeai. We would be very happy to use the model without needing to connect to Hugging Face, which is likewise not possible inside our network.
@jiangxuehaokeai I didn't see you using trust_remote_code=False. Even after applying the changes from Alibaba-NLP/new-impl/discussions/2#comment-2, it still requires setting trust_remote_code=True. I have re-executed these steps and successfully loaded the model.
ping @phizdbc
@izhx Thank you for the reminder. After I reverted 'trust_remote_code' back to True, I encountered the following error:
File "/home/jiangxuehaokeai/.local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1120, in from_pretrained
config_class = get_class_from_dynamic_module(
File "/home/jiangxuehaokeai/.local/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 501, in get_class_from_dynamic_module
return get_class_in_module(class_name, final_module.replace(".py", ""))
File "/home/jiangxuehaokeai/.local/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 201, in get_class_in_module
module = importlib.import_module(module_path)
File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 992, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 992, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1004, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'transformers_modules.gte-large-en-v1'
Then, following the code in the library, I found that when handling the model's path, the program was cutting off the characters after the '.'. This caused the generated module name to become transformers_modules.gte-large-en-v1 instead of transformers_modules.gte-large-en-v1.5. After I renamed the model's folder to ./Alibaba-NLP/gte-large-en-v1-5, I was able to use the model locally even when disconnected from the internet.
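A small sketch of why the '.' in the folder name trips up the dynamic-module import, and of the rename workaround. The string handling below is a simplified stand-in for what transformers' dynamic_module_utils does, not the exact library code:

```python
# transformers builds a dynamic module name from the model folder name,
# roughly "transformers_modules.<folder>". Python's import machinery
# treats every "." as a package separator, so a folder named
# "gte-large-en-v1.5" produces a module path whose final component is
# just "5" under a non-existent package "...gte-large-en-v1".
folder = "gte-large-en-v1.5"
module_name = f"transformers_modules.{folder}"
components = module_name.split(".")
broken_tail = components[-1]  # importlib would try to import submodule "5"

# Workaround: rename the local folder so it contains no dots.
safe_folder = folder.replace(".", "-")  # "gte-large-en-v1-5"
safe_module = f"transformers_modules.{safe_folder}"
```

This is why renaming the directory to gte-large-en-v1-5 (or anything dot-free) makes the import succeed.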
@phizdbc you can check whether you run into the same error as I did.
It seems there is a transformers package version issue here: my colleague ran into the same error, and another one followed ('NewConfig' object has no attribute '_attn_implementation'), but upgrading to transformers>=4.41.0 solved both errors.
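If you want to guard against this in your own setup code, here is a minimal version-gate sketch. The helper name and the naive dotted-number comparison are my own; for anything beyond plain 'X.Y.Z' strings (e.g. '4.41.0.dev0'), use packaging.version.parse instead:

```python
def meets_min_version(installed: str, required: str) -> bool:
    """Naive check that a dotted version string is at least `required`.

    Only handles purely numeric 'X.Y.Z' versions; pre-release suffixes
    would raise ValueError here and need packaging.version instead.
    """
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(required)

# Example: gate the model load on the transformers fix discussed above.
# import transformers
# assert meets_min_version(transformers.__version__, "4.41.0")
```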
Thanks @izhx and @jiangxuehaokeai.
How/where can we get configuration.py and modeling.py? We are unable to see them in the downloaded model files.
@rohtashbeniwal555 In this repo:
https://huggingface.co./Alibaba-NLP/new-impl/blob/main/modeling.py
https://huggingface.co./Alibaba-NLP/new-impl/blob/main/configuration.py
I ran into some problems when using new-impl:
https://huggingface.co./Alibaba-NLP/gte-large-en-v1.5/discussions/17#66a3a719f33ff23e1c003ef1
Need help~
Any updates?