Could someone please make a GGUF version of this model? πŸ™πŸ‘

#1
by Dihelson - opened

Hello, friends, here is an interesting model that would be great to test as a GGUF version, please!

@mradermacher
@bartowski

Thanks in advance for any interest.
πŸ™πŸ‘β€οΈ

You could have searched for it and found it :) https://huggingface.co./mradermacher/L3-Umbral-Mind-RP-v3.0-8B-GGUF

Imatrix quants will follow soon.

I appreciate your excitement to get GGUFs made for this model, but please don't bother the quant devs. They usually have specific places where they take model requests like mradermacher for example.


Haha, thanks, I appreciate you always making quants of my models in a timely fashion, sorry about the @s

That's very considerate :) I'm fine with receiving model requests by being @-mentioned (and I think bartowski is as well). My concern would, however, be not to bother the model creators on their repos with this, which is why I have the model_requests repo. But not everybody is the same, of course. I guess what I am trying to say is that huggingface is a pretty relaxed place.

Ahh, gotcha, okay. I'm still relatively new here so I was unaware, thanks for letting me know!

I'm sorry for bothering the devs with my enthusiasm for this model, and for tagging them too. I'm really new on HF and learning a lot about how things should be done. Yes, there are the dev request pages, and I should have posted there, but still, I'd like to thank mradermacher for his kind answer and prompt delivery of the quantized model. I'm so grateful for that. Thank you guys very much. πŸ™πŸ‘β€οΈ

You haven't bothered anybody, Dihelson. You just triggered a cascade of careful "uh, I hope nobody was annoyed" posts :) It's great to see somebody enjoy a model!
