16-bit version of the weights from PharMolix/BioMedGPT-LM-7B, for easier download / finetuning / model-merging.
## Code
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the original weights directly in half precision (float16)
model = AutoModelForCausalLM.from_pretrained(
    "PharMolix/BioMedGPT-LM-7B",
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("PharMolix/BioMedGPT-LM-7B")
```
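Casting weights to float16 halves the on-disk and in-memory footprint relative to float32, which is what makes the 16-bit copy easier to download and merge. A minimal sketch of that effect, using a small randomly initialized GPT-2-style model as a stand-in (the layer/embedding sizes below are hypothetical, chosen only so the script runs without downloading the 7B checkpoint):

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Tiny randomly initialized model standing in for BioMedGPT-LM-7B
# (hypothetical sizes, so this runs without a multi-GB download).
config = GPT2Config(n_layer=2, n_head=2, n_embd=64, vocab_size=1000)
model = GPT2LMHeadModel(config)

def param_bytes(m):
    # Total bytes occupied by all parameters of the model
    return sum(p.numel() * p.element_size() for p in m.parameters())

fp32_bytes = param_bytes(model)   # default dtype is float32 (4 bytes/param)
model = model.half()              # cast every weight to torch.float16
fp16_bytes = param_bytes(model)   # now 2 bytes per parameter

print(fp32_bytes, fp16_bytes)
```

The same `half()` cast is what `torch_dtype=torch.float16` performs at load time in the snippet above, just applied as the checkpoint shards are read.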