Add to Hugging Face Open Leaderboard 2
I am sure that if you added this model to the leaderboard, it would receive more publicity, as there are actually not many models below 7B active parameters.
See https://huggingface.co./spaces/open-llm-leaderboard/open_llm_leaderboard
Anyone is welcome to add it; it seems like the LB does not support filtering by active params, though.
No, it does not support it yet.
I know of the GGUF naming convention, which follows Mixtral's naming scheme, but I honestly also like what Qwen1.5-MoE-A2.7B did with the "A" prefix, which makes it very clear how many parameters are actually activated. If one searches for "MoE", most MoE models show up, and users can then remember their scores and compare them with other models. That is a little messy and only kinda works; ideally, developers of language models should follow a common standard. It is fine if the metadata is written into the model files, the accompanying configuration files, or the README instead of the model name, as long as a search feature can access that data.
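To illustrate the metadata idea, here is a minimal sketch. The field names `active_params_b` and `total_params_b` are hypothetical (no such keys exist in Hugging Face config files today); the point is only that a leaderboard or search feature could filter and label models from structured metadata instead of parsing model names:

```python
# Hypothetical MoE metadata, as it might appear in a config file or README
# front matter. These field names are illustrative, not real HF config keys.
def moe_label(meta: dict) -> str:
    """Build a Qwen-style "A<active>-<total>" label from metadata."""
    active = meta["active_params_b"]
    total = meta["total_params_b"]
    return f"A{active:g}B-{total:g}B"

def filter_by_active_params(models: list[dict], max_active_b: float) -> list[dict]:
    """Filter models by active parameter count, e.g. for a leaderboard UI."""
    return [m for m in models if m["active_params_b"] <= max_active_b]

models = [
    {"name": "OLMoE-1B-7B", "active_params_b": 1, "total_params_b": 7},
    {"name": "Mixtral-8x7B", "active_params_b": 13, "total_params_b": 47},
]
print(moe_label(models[0]))                          # A1B-7B
print([m["name"] for m in filter_by_active_params(models, 7)])  # ['OLMoE-1B-7B']
```

With metadata like this, the ambiguity discussed below (what the "1B" in a name refers to) would not matter for tooling.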
It should also be clear for this model, i.e. 1B activated, 7B in total. The Qwen naming scheme does not make the total parameter count as clear, which I think is inconvenient.
The current name could be interpreted to mean that 1B total is activated, but that it is made up of multiple smaller sub-1B experts.
e.g. 2 × 0.5B = 1B parameters activated, but 14 × 0.5B = 7B parameters in total.
Obviously it is clear from your model card that this is not the case, but when looking at the name alone, it is not clear.
Edit: But we are digressing. Sorry :D
Adding to https://huggingface.co./spaces/open-llm-leaderboard/open_llm_leaderboard is blocked by: "Model "allenai/OLMoE-1B-7B-0924-Instruct" needs to be launched with trust_remote_code=True. For safety reason, we do not allow these models to be automatically submitted to the leaderboard."
Maybe the open-llm-leaderboard is running on an older version of the transformers library.
The model has been added to the transformers library here: https://github.com/huggingface/transformers/pull/32406
Yes. The leaderboard currently runs transformers==4.44.2 (see its Requirements.txt), but this model would require https://github.com/huggingface/transformers/tree/v4.45.0
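The version mismatch can be sketched as a simple gate. This assumes, per the thread, that OLMoE support ships natively in transformers v4.45.0 while the leaderboard pins 4.44.2; the helper below only compares version strings and is not part of the leaderboard's actual codebase:

```python
# Assumed minimum version bundling the OLMoE model classes (per the linked
# v4.45.0 branch); below this, loading needs trust_remote_code=True.
OLMOE_MIN_TRANSFORMERS = (4, 45, 0)

def parse_version(v: str) -> tuple:
    """Parse "4.44.2" -> (4, 44, 2); ignores pre-release suffixes for simplicity."""
    return tuple(int(p) for p in v.split(".")[:3])

def can_load_without_remote_code(installed: str) -> bool:
    """True if the installed transformers already bundles the OLMoE classes."""
    return parse_version(installed) >= OLMOE_MIN_TRANSFORMERS

# The leaderboard's pinned version is too old, so it would fall back to
# trust_remote_code=True, which it refuses for safety reasons:
print(can_load_without_remote_code("4.44.2"))  # False
print(can_load_without_remote_code("4.45.0"))  # True
```

In practice, a robust check would use `packaging.version.Version` rather than a hand-rolled tuple comparison, but the conclusion is the same: the submission stays blocked until the leaderboard bumps its pinned transformers version.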