Uploaded model
- Developed by: 1024m
- License: apache-2.0
- Finetuned from model: unsloth/phi-4
This phi-4 model was trained 2x faster with Unsloth and Hugging Face's TRL library.
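A minimal inference sketch, assuming the adapter is a standard PEFT LoRA on top of unsloth/phi-4 (both repo ids are taken from this card); the prompt and generation settings are only illustrative placeholders.

```python
# Load the base model and attach the LoRA adapter with transformers + peft.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "unsloth/phi-4"                 # base model named in this card
adapter_id = "1024m/PHI-4-Hindi-LoRA"     # this adapter repo

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(model, adapter_id)  # apply the LoRA weights

# Example prompt (placeholder, not from the card)
prompt = "Translate to Hindi: How are you?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```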
Model tree for 1024m/PHI-4-Hindi-LoRA
Dataset used to train 1024m/PHI-4-Hindi-LoRA
Evaluation results
| Benchmark | Metric | Score |
|---|---|---|
| MMLU Pro (5-shot) | accuracy | 52.39 |
| GPQA (0-shot) | normalized accuracy | 39.77 |
| MuSR (0-shot) | normalized accuracy | 49.07 |
| Big Bench Hard (3-shot) | normalized accuracy | 66.97 |
| MATH Hard (4-shot) | exact-match accuracy | 23.11 |

All scores are on the respective test sets, as reported on the Open LLM Leaderboard.