Vicuna-7B
Vicuna-7B is produced by applying the Vicuna delta weights (vicuna-7b-delta-v1.1) on top of the LLaMA-7B base weights (llama-7b-hf); this repository hosts the merged result.
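If you want to reproduce the merge yourself instead of downloading the pre-merged weights, FastChat ships an apply_delta tool. The command below is a minimal sketch; it assumes you already have the LLaMA-7B base weights locally, and the /path/to/... arguments are placeholders.

# Apply the Vicuna v1.1 delta to the LLaMA-7B base weights (paths are placeholders)
python3 -m fastchat.model.apply_delta --base /path/to/llama-7b-hf --target /path/to/Vicuna-7B --delta lmsys/vicuna-7b-delta-v1.1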
How to use
# Install FastChat and the latest transformers from source
pip3 install fschat
pip3 install git+https://github.com/huggingface/transformers
# Install git and Git LFS, then enable LFS so the large weight files are fetched
sudo apt install git git-lfs
git lfs install
# Download the merged Vicuna-7B weights
git clone https://huggingface.co./myaniu/Vicuna-7B
# Chat with the model (point --model-path at the cloned Vicuna-7B directory)
python3 -m fastchat.serve.cli --model-path /path/to/Vicuna-7B