Add conversational tag
1
#11 opened 4 months ago by celinah
llama.cpp doesn't support this model; how can I convert the safetensors model to a GGUF (bin) file and load it in Ollama?
#10 opened 5 months ago by shuminzhou26803586
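Re: #10 — for reference, the usual route to Ollama is converting the checkpoint to GGUF with llama.cpp's conversion script, which only works once llama.cpp supports the model's architecture. A minimal sketch, with placeholder paths and a local llama.cpp checkout assumed:

```python
# Sketch only: this works once llama.cpp supports the architecture.
# All paths, names, and the output type below are placeholders.
import subprocess

# 1. Convert the safetensors checkpoint to GGUF with llama.cpp's script.
subprocess.run(
    [
        "python", "llama.cpp/convert_hf_to_gguf.py",
        "path/to/local/model",              # directory with config.json + *.safetensors
        "--outfile", "model-f16.gguf",
        "--outtype", "f16",
    ],
    check=True,
)

# 2. Write a minimal Ollama Modelfile pointing at the converted weights.
with open("Modelfile", "w") as f:
    f.write("FROM ./model-f16.gguf\n")

# 3. Register the model with Ollama (same as `ollama create my-model -f Modelfile`).
subprocess.run(["ollama", "create", "my-model", "-f", "Modelfile"], check=True)
```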
Update chat_template.json to incorporate `generation` tag
#9 opened 6 months ago by zjysteven
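Re: #9 — for context, a `{% generation %}` tag in the chat template lets transformers return a mask over assistant tokens, which is useful for restricting the training loss to assistant turns. A minimal sketch using the tokenizer-level API; the model id and messages are placeholders:

```python
# Sketch assuming the chat template wraps assistant turns in
# {% generation %} ... {% endgeneration %}; the model id is a placeholder.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("model-id")
messages = [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi, how can I help?"},
]
out = tokenizer.apply_chat_template(
    messages,
    tokenize=True,
    return_dict=True,
    return_assistant_tokens_mask=True,  # requires a `generation` tag in the template
)
print(out["assistant_masks"])  # 1 for tokens inside assistant turns, 0 elsewhere
```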
RuntimeError: Could not infer dtype of numpy.float32 when converting to PyTorch tensor
1
#8 opened 6 months ago by Koshti10
Shape mismatch error during inference with fine-tuned model
6
#7 opened 8 months ago by mdmev
Why is there no chat template, like the non-chat version has?
#5 opened 9 months ago by pseudotensor
How to merge an adapter into the base model
1
#4 opened 9 months ago by alielfilali01
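Re: #4 — a common approach, assuming the adapter was trained with PEFT (LoRA); the model class, ids, and paths below are placeholders:

```python
# Sketch: merge a PEFT (LoRA) adapter into its base model and save a
# standalone checkpoint. Ids, paths, and the Auto class are placeholders.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("base-model-id", torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, "path/to/adapter")
merged = model.merge_and_unload()        # folds the LoRA deltas into the base weights
merged.save_pretrained("merged-model")   # can be loaded later without the peft dependency
```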
How to deploy on Inference Endpoints?
2
#3 opened 9 months ago by brianjking
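Re: #3 — one way to do this programmatically is via `huggingface_hub`; a minimal sketch where the endpoint name, model id, and hardware settings are placeholders to adapt to your account:

```python
# Sketch: create an Inference Endpoint from a model repo.
# The name, repository, and hardware values below are placeholders.
from huggingface_hub import create_inference_endpoint

endpoint = create_inference_endpoint(
    "my-endpoint",
    repository="org/model-id",
    framework="pytorch",
    task="text-generation",
    accelerator="gpu",
    vendor="aws",
    region="us-east-1",
    instance_size="x1",
    instance_type="nvidia-a10g",
)
endpoint.wait()        # block until the endpoint reports "running"
print(endpoint.url)
```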
Update README.md
#2 opened 9 months ago by Alexander70
[Question] Question about a hyperparameter
1
#1 opened 9 months ago by Lala-chick