Models not evaluated
So I have some models which weren't evaluated. They aren't listed anywhere: not in finished, pending, or evaluated. I generated these models the same way I did others before, so the problem shouldn't be with them. The thing is, the leaderboard wasn't running shortly before and after I submitted, so I guess the window in between, which I used for the submit, was too short? I can't resubmit because I get the error that it was already submitted; that's the only evidence I have that my model was in fact submitted before, even though it doesn't appear in any of the lists. I could give the model name if that's of any use. It's not just one but multiple models which were submitted at nearly the same time.
@Yuma42 That's interesting. I'm not staff, but can you provide links to the models?
Edit: I noticed some of your models were mergers. I unselected "contains a merge" and they showed up. The FAQ on the leaderboard explains why it's done this way.
All my models are merges, and the ones which show up are my older ones. To quote myself: "I generated these models the same way I did others before".
One example of a model that doesn't appear is "Yuma42/KangalKhan-Alpha-Emerald-7B". All my models from that day have "Alpha" in their name, and none of them show up.
@Yuma42 It looks like they failed. According to the FAQ on the leaderboard, you need to include the links to the evaluation files, so here are the 4 Alphas.
https://huggingface.co./datasets/open-llm-leaderboard/requests/tree/main/Yuma42
Thanks, now at least I know about that.
Hi to you both!
Thanks @Phil337 for providing the links to the request files, that's a time saver!
@Yuma42 , your models failed with the following error at load time:
ValueError: Trying to set a tensor of shape torch.Size([32000, 4096]) in "weight" (which has shape torch.Size([32002, 4096])), this look incorrect.
I suspect you have a mismatch between your expected and actual vocabulary size.
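For anyone hitting the same error: a minimal sketch of the failure mode, assuming the usual cause (a checkpoint saved after adding special tokens, so its embedding has 32002 rows, while the config still declares a vocabulary of 32000). The names and sizes below are illustrative, not the leaderboard's actual loading code; with `transformers`, the standard fix is to call `model.resize_token_embeddings(len(tokenizer))` before saving.

```python
import torch
import torch.nn as nn

HIDDEN = 8            # tiny stand-in for the real hidden size of 4096
CONFIG_VOCAB = 32000  # vocab size declared in config.json
SAVED_VOCAB = 32002   # vocab size actually present in the saved weights

model_embed = nn.Embedding(CONFIG_VOCAB, HIDDEN)
saved_weight = torch.randn(SAVED_VOCAB, HIDDEN)

def try_load(embed: nn.Embedding, weight: torch.Tensor):
    """Attempt a strict state-dict load; return the error message on failure."""
    try:
        embed.load_state_dict({"weight": weight})
        return None
    except RuntimeError as e:
        return str(e)

# Same class of failure as the leaderboard error: the checkpoint tensor
# has more rows than the embedding the config describes.
error = try_load(model_embed, saved_weight)
print(error)

# Mimic resize_token_embeddings: grow the embedding to match the
# checkpoint, after which the load succeeds.
resized_embed = nn.Embedding(SAVED_VOCAB, HIDDEN)
assert try_load(resized_embed, saved_weight) is None
```

Equivalently, you can catch this before submitting by comparing `len(tokenizer)` against `model.get_input_embeddings().weight.shape[0]` and resizing if they differ.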
I'm going to close this issue, but feel free to reopen once you've fixed your models, and I'll change the request files so you can resubmit.