Great work on MoE!
Your work on MoE is absolutely wonderful!
It appears that you have leveraged the Solar 10.7B model, which is fantastic!
If this is true, we should definitely promote your amazing work! :D
Hi, thank you for your interest in my work!
Unfortunately, Upstage's Solar 10.7B was not used in this model, but I am planning to use it in an upcoming one.
Since Solar currently exists only as a Llama-architecture checkpoint, it was difficult to merge with the other Mistral-based models.
Thanks again for your interest in the model 🙏
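For context on the architecture point above, here is a minimal sketch (mine, not the author's) of how one could check whether two checkpoints even share an architecture before attempting a merge; the repo IDs are assumptions for illustration:

```python
# Minimal sketch: merge/MoE tooling generally needs every source model to share
# the same model_type and tensor shapes. The repo IDs below are assumptions.
from transformers import AutoConfig

candidates = [
    "upstage/SOLAR-10.7B-v1.0",             # Solar: reports model_type "llama"
    "maywell/PiVoT-10.7B-Mistral-v0.2-RP",  # PiVoT: reports model_type "mistral"
]

for repo in candidates:
    cfg = AutoConfig.from_pretrained(repo)
    print(repo, cfg.model_type, cfg.num_hidden_layers, cfg.hidden_size)
```

A mismatch in `model_type` (llama vs. mistral) is exactly the kind of incompatibility that blocks dropping Solar into a Mistral-based MoE.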
Oh I see, congratulations on your great work!
May I ask which weights were used to initialize the 10.7B experts? I wasn't aware of other pretrained/fine-tuned models with 10.7B parameters.
The experts are based on my PiVoT-10.7B-Mistral-v0.2-RP, together with other 10.7B RP-finetuned models from Hugging Face.
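The thread doesn't say exactly how the experts were assembled, but as a rough illustration of the idea (dense Mistral-architecture fine-tunes dropped into the expert slots of a Mixtral-style MoE), here is a hedged sketch. The repo IDs, the two-expert layout, and the untrained router are all assumptions on my part; in practice a tool such as mergekit handles this, usually shard by shard rather than fully in memory:

```python
# Rough sketch only: initializing Mixtral-style MoE experts from dense
# Mistral-architecture fine-tunes. Repo IDs and the two-expert layout are
# assumptions for illustration; the router (gate) weights stay randomly
# initialized here.
import torch
from transformers import AutoModelForCausalLM, MixtralConfig, MixtralForCausalLM

donor_ids = [
    "maywell/PiVoT-10.7B-Mistral-v0.2-RP",  # assumed repo ID
    "some-org/another-10.7B-rp-finetune",   # hypothetical placeholder
]
donors = [AutoModelForCausalLM.from_pretrained(d, torch_dtype=torch.bfloat16)
          for d in donor_ids]
base = donors[0].config

moe_cfg = MixtralConfig(
    vocab_size=base.vocab_size,
    hidden_size=base.hidden_size,
    intermediate_size=base.intermediate_size,
    num_hidden_layers=base.num_hidden_layers,
    num_attention_heads=base.num_attention_heads,
    num_key_value_heads=base.num_key_value_heads,
    num_local_experts=len(donors),  # one expert slot per donor model
    num_experts_per_tok=2,
)
moe = MixtralForCausalLM(moe_cfg).to(torch.bfloat16)

with torch.no_grad():
    # Embeddings, attention, norms and lm_head come from the first donor;
    # its dense-MLP keys don't match Mixtral's and are simply skipped.
    moe.load_state_dict(donors[0].state_dict(), strict=False)
    # Each donor's dense MLP fills one expert slot in every layer
    # (Mistral gate/up/down_proj correspond to Mixtral w1/w3/w2).
    for layer_idx, moe_layer in enumerate(moe.model.layers):
        for expert_idx, donor in enumerate(donors):
            mlp = donor.model.layers[layer_idx].mlp
            expert = moe_layer.block_sparse_moe.experts[expert_idx]
            expert.w1.weight.copy_(mlp.gate_proj.weight)
            expert.w3.weight.copy_(mlp.up_proj.weight)
            expert.w2.weight.copy_(mlp.down_proj.weight)

moe.save_pretrained("pivot-moe-sketch")
```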