[FLAG] DopeorNope/COKAL-v1-70B

#566
by distantquant - opened

This model has a suspiciously high ARC score that makes no sense compared to its other benchmark results.

Open LLM Leaderboard org

Hi @distantquant ,
Thanks for your interest!
@DopeorNope , do you know if there is anything in your training data that could possibly have been contaminated on ARC?

It appears to be based on the removed tigerbot model, as they share the same vocab-extended LLaMA2 tokenizer as well as the unusually high ARC score.

BTW: please also look into my previous spam reports, as I have not received any feedback from HF staff: https://huggingface.co./datasets/shapekapseln/ShapeKapselnErfahrungen/discussions/1#657fe2a2f5eacd4bda325db3 https://huggingface.co./spaces/rolexszz/1/discussions/1#657f7e74d70b7308f3a6a525

That tigerbot model was contaminated, so it is safe to assume this one is as well.

@clefourrier

Hello!

First of all, I might not remember everything since this is an older model of mine, but I'll do my best to provide accurate and sincere answers where I can.

Firstly, I have never used any contaminated datasets. Doing so would be pointless: I know that such data doesn't make a model genuinely better, and it would have no lasting value.

At that time, I probably used the Platypus dataset.

I can't verify the base model since no training log is available, but the commenter above has already answered that point.

Anyway, I hadn't been paying much attention to this lower-ranked model, but I'm really grateful for the mention.
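
For what it's worth, here is a purely illustrative spot-check for this kind of overlap: it searches a fine-tuning dataset for ARC-Challenge test questions appearing verbatim. The dataset names come from this thread and the whole flow is an assumption, not the leaderboard's actual contamination test.

```python
# Naive contamination spot-check: look for ARC-Challenge test questions that
# appear verbatim inside a fine-tuning dataset. Illustrative sketch only.
from datasets import load_dataset

arc = load_dataset("allenai/ai2_arc", "ARC-Challenge", split="test")
# Assumption: the Platypus data referred to above is garage-bAInd/Open-Platypus.
train = load_dataset("garage-bAInd/Open-Platypus", split="train")

# Skip very short questions so trivial phrases don't count as hits.
arc_questions = [q.strip().lower() for q in arc["question"] if len(q.strip()) > 30]

hits = 0
for row in train:
    # Concatenate all fields so the check doesn't depend on a particular schema.
    text = " ".join(str(v) for v in row.values()).lower()
    if any(q in text for q in arc_questions):
        hits += 1

print(f"Training rows containing an ARC-Challenge test question verbatim: {hits}")
```

A verbatim substring match like this only catches the crudest form of leakage; paraphrased or reformatted questions would need fuzzier matching (n-gram overlap, embeddings) to detect.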

=========================================================================================================

@distantquant @JosephusCheung Thank you for your dedication to open source!

====================================================================================================

@clefourrier

Btw, I have a question about a model called 'DopeorNope/SOLARC-MOE-10.7Bx6'. I fine-tuned and developed this model using the MoE (Mixture of Experts) approach, yet it's currently categorized under 'merge/moerge'.

I've mentioned this before: it was initially categorized as MoE, but that has since changed. Can you correct this?

Open LLM Leaderboard org

Hi!
Given your answer, and the fact that the model was based on a dataset which appears to have been contaminated at the time, I'll flag it so that other users avoid building on top of it.

@JosephusCheung I informed the relevant team internally, but please avoid using other discussions for this.
@DopeorNope Regarding your other model, it's tagged as a moerge since it is an MoE that includes merged models.

clefourrier changed discussion status to closed

@clefourrier then I'll add the MoE tag to it.

Even though I didn't use the merged models as-is and fine-tuned the result, would it still be classified as a moerge? According to your explanation at the time, it was supposed to be classified as both MoE and merge, but it's only classified as a moerge. Is it possible to classify it as MoE as well?

Open LLM Leaderboard org

Hi @DopeorNope ,
Your model is already tagged both as an MoE and as a moerge, since any MoE that contains merges is both. You can check this by looking for your model with no filter applied (all boxes on the bottom left unchecked), then hiding MoEs by checking the corresponding box (your model will disappear), and doing the same for moerges.
If you want to change its model type (top-left checkboxes) to 'base merges and moerges', please open a PR on the request file.

Here is the explanation I gave on Twitter:

- Merging several different base models: it's a merge!
- Using models as experts in MoE: it's a MoErge! 
- Fine-tuning a merge or MoErge: it's a fine-tune, but it should have merge in its metadata (as some users avoid merges altogether)
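
As a rough sketch (not the leaderboard's own code, and whether these exact tag strings are the ones it keys on is an assumption), a model card's metadata tags can be inspected and updated with huggingface_hub:

```python
# Sketch: inspect and extend the "tags" field of a model card's YAML metadata.
# The repo id is the model discussed in this thread; the tag names are assumptions.
from huggingface_hub import ModelCard

repo_id = "DopeorNope/SOLARC-MOE-10.7Bx6"

card = ModelCard.load(repo_id)          # downloads and parses the repo's README.md
tags = list(card.data.tags or [])
print("current tags:", tags)

for tag in ("moe", "merge"):            # add the tags if they are missing
    if tag not in tags:
        tags.append(tag)
card.data.tags = tags

# Pushing requires write access to the repo; uncomment to update the card.
# card.push_to_hub(repo_id)
```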

@clefourrier

(screenshot attached)

Thank you for your kind response!

However, there are some issues!

Firstly, I can see my model on the leaderboard under the merge and moerge tags, but currently it is not visible under the MoE tag.

Can this be reflected?

Thank you always for answering so many questions.

Open LLM Leaderboard org
edited Feb 1

Hi! This is expected: since your model is both a moerge and a MoE, hiding models which contain a merge/moerge will also hide your model. You need to show both merge/moerge models and MoE models for your model to appear.
