Can't use with transformers
#3
by LeMoussel - opened
```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Wissam42/sentence-croissant-llm-base")
model = AutoModel.from_pretrained("Wissam42/sentence-croissant-llm-base")
```
I got this error:
```
Exception has occurred: RuntimeError (note: full exception trace is shown but execution is paused at: _run_module_as_main)
Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
/home/dev/.local/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops5zeros4callEN3c108ArrayRefINS2_6SymIntEEENS2_8optionalINS2_10ScalarTypeEEENS6_INS2_6LayoutEEENS6_INS2_6DeviceEEENS6_IbEE
```
I'm using transformers v4.39.1.
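An "undefined symbol" error from `flash_attn_2_cuda...so` usually means the installed flash-attn wheel was compiled against a different PyTorch version than the one currently in the environment, so its compiled extension can't resolve torch's C++ symbols at load time. Because transformers imports `flash_attn` while importing `modeling_llama` when the package is present, the failure surfaces even if you never asked for flash attention. Typical remedies are `pip uninstall flash-attn` (the model loads fine without it) or rebuilding it against the installed torch with `pip install flash-attn --no-build-isolation --force-reinstall`. As a quick diagnostic, here is a minimal sketch (the helper name `flash_attn_importable` is my own, not from transformers) that distinguishes "not installed" from "installed but ABI-broken":

```python
import importlib
import importlib.util

def flash_attn_importable() -> bool:
    """Return True only if flash_attn is installed AND actually imports.

    find_spec() only checks that the package exists on disk; an
    ABI-mismatched wheel (built against a different torch) fails later,
    when its .so is loaded, with the "undefined symbol" ImportError
    seen in the traceback above.
    """
    if importlib.util.find_spec("flash_attn") is None:
        return False  # not installed at all
    try:
        importlib.import_module("flash_attn")
        return True   # installed and binary-compatible
    except ImportError:
        return False  # installed but built against a different torch

print(flash_attn_importable())
```

If this prints `False` while `pip show flash-attn` says the package is installed, the wheel/torch mismatch is confirmed and reinstalling (or removing) flash-attn should let the model load.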