A big request

#1
by BoscoTheDog

I totally dig your work!

I'm currently adding support for multiple languages to a completely browser-based non-profit project. It's based on Wllama, so inference can happen fully client-side, guaranteeing privacy.

One limitation of Wllama and similar WASM-based tools is a 4GB memory limit. That's why I was wondering if you'd be interested in generating smaller AI models, for example based on Phi 3 Mini. It runs really well in the browser (and was essentially designed for it) while still being very capable.
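For context, here's a minimal sketch of what that client-side setup looks like with Wllama. The WASM paths and model URL are placeholders for whatever your bundler serves and whichever GGUF you pick, so treat the details as assumptions rather than a drop-in snippet:

```ts
import { Wllama } from '@wllama/wllama';

// Paths to the WASM binaries shipped with @wllama/wllama
// (placeholders: point these at wherever your bundler serves them).
const CONFIG_PATHS = {
  'single-thread/wllama.wasm': '/wllama/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm': '/wllama/multi-thread/wllama.wasm',
};

const wllama = new Wllama(CONFIG_PATHS);

// Placeholder URL: any quantized GGUF that, together with its
// KV cache, stays under the 4GB WASM memory ceiling.
await wllama.loadModelFromUrl(
  'https://huggingface.co/your-org/your-model-GGUF/resolve/main/model-Q4_K_M.gguf'
);

// Run a completion entirely in the browser; no data leaves the client.
const output = await wllama.createCompletion('Translate to Dutch: Hello!', {
  nPredict: 64,
  sampling: { temp: 0.7, top_p: 0.9 },
});
console.log(output);
```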

So far I've found various language finetunes for Phi 3 Mini:

But for the other (European) languages there aren't many good small options. Sometimes there is a TinyLlama finetune (Polish, German, Greek), but that model isn't very good. Sometimes there are Gemma 2B versions (German, Sauerkraut), but those aren't great either. For Ukrainian and Hungarian I've had to resort to a very large model, which leaves very little room for context.

So I was of course wondering: perhaps you would be willing to help fill in the gaps by creating translation finetunes of Phi 3 Mini?

Owner

I am in the process of making Falcon-5.5B-Dutch and will post an update when it is finished. It will be a pre-trained model, with a fine-tuned version to follow later. Quantized, it is ~2.5GB, which leaves room for 1.5GB of context / attention within the 4GB limit.
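To make that budget concrete, here is a rough back-of-the-envelope sketch of how many context tokens fit in the remaining ~1.5GB. The architecture numbers (layers, KV heads, head dimension) are hypothetical placeholders for illustration, not the actual Falcon-5.5B-Dutch configuration:

```ts
// Back-of-the-envelope KV-cache budget for a browser (WASM) deployment.
// All model dimensions below are hypothetical placeholders.
const WASM_LIMIT = 4 * 1024 ** 3;            // 4GB 32-bit WASM ceiling
const MODEL_BYTES = 2.5 * 1024 ** 3;         // ~2.5GB of quantized weights
const CTX_BUDGET = WASM_LIMIT - MODEL_BYTES; // ~1.5GB left for the KV cache

const N_LAYERS = 32;      // hypothetical
const N_KV_HEADS = 8;     // hypothetical (grouped-query attention)
const HEAD_DIM = 128;     // hypothetical
const BYTES_PER_ELEM = 2; // f16 cache entries

// Each token stores one K and one V vector per layer.
const bytesPerToken = 2 * N_LAYERS * N_KV_HEADS * HEAD_DIM * BYTES_PER_ELEM;
const maxCtxTokens = Math.floor(CTX_BUDGET / bytesPerToken);

console.log(`${bytesPerToken / 1024} KiB per token -> ~${maxCtxTokens} tokens of context`);
// With these placeholder numbers: 128 KiB/token -> ~12288 tokens.
```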
