Hi, when I load this model with the following code:
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = config.checkpoint,
    max_seq_length = config.max_length,
    dtype = dtype,
    load_in_4bit = load_in_4bit,
)
an error occurs.
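For reference, a minimal self-contained version of the call would look like the sketch below; the model name and parameter values are placeholders (my actual config values are not shown here), with dtype = None and load_in_4bit = True matching the defaults used in the Unsloth examples:

from unsloth import FastLanguageModel

# Placeholder values standing in for config.checkpoint / config.max_length above
checkpoint = "unsloth/llama-3-8b-bnb-4bit"   # hypothetical model name
max_seq_length = 2048                        # placeholder sequence length
dtype = None          # None lets Unsloth auto-detect float16 / bfloat16
load_in_4bit = True   # load the weights in 4-bit to reduce memory use

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = checkpoint,
    max_seq_length = max_seq_length,
    dtype = dtype,
    load_in_4bit = load_in_4bit,
)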