Error when running in 4bit with bitsandbytes
#207
by Lue-C - opened
Hi there,
I am running the code from the model card to use the model in 4-bit. Beforehand, I downloaded the model to the folder "model_path".
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained(model_path, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(model_path, local_files_only=True, load_in_4bit=True)
but I get the following error:
The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
The `load_in_4bit` and `load_in_8bit` arguments are deprecated and will be removed in the future versions. Please, pass a `BitsAndBytesConfig` object in
`quantization_config` argument instead.
Traceback (most recent call last):
File "C:\Users\mlu\Repositories\mixtral\mixtral_full.py", line 126, in <module>
model = AutoModelForCausalLM.from_pretrained(model_path, local_files_only=True, load_in_4bit=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mlu\Repositories\mixtral\.mixtral_venv\Lib\site-packages\transformers\models\auto\auto_factory.py", line 563, in from_pretrained
return model_class.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mlu\Repositories\mixtral\.mixtral_venv\Lib\site-packages\transformers\modeling_utils.py", line 3165, in from_pretrained
hf_quantizer.validate_environment(
File "C:\Users\mlu\Repositories\mixtral\.mixtral_venv\Lib\site-packages\transformers\quantizers\quantizer_bnb_4bit.py", line 62, in validate_environment
raise ImportError(
ImportError: Using `bitsandbytes` 8-bit quantization requires Accelerate: `pip install accelerate` and the latest version of bitsandbytes: `pip install -i https://pypi.org/simple/ bitsandbytes`
The versions I am using are:
accelerate==0.29.3
bitsandbytes==0.43.1
transformers==4.40.1
How can I get this running?
Regards
Lue-C
changed discussion status to closed
Lue-C changed discussion status to open
Hi ybelkada,
yes, that was the problem. Thanks!
Lue-C changed discussion status to closed