omost-phi-3-mini-128k-8bits is Omost's phi-3-mini model with a 128k context length, quantized to 8-bit precision.
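
Below is a minimal loading sketch using the transformers library, not an official usage example. The repo id, the bfloat16 dtype, the bitsandbytes/accelerate dependencies, and the prompt are assumptions based on the model name and the tensor-type metadata listed further down in this card.

```python
# Hedged sketch: repo id, dtype, and dependencies are assumptions, not
# confirmed by an official usage guide for this checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "lllyasviel/omost-phi-3-mini-128k-8bits"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # non-quantized tensors appear stored in BF16/F32
    device_map="auto",           # requires `accelerate`; 8-bit weights may need `bitsandbytes`
)

prompt = "generate an image of a cat sitting on a windowsill"  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```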

Model details:
- Format: Safetensors
- Model size: 3.82B params
- Tensor types: F32, BF16, I8
Inference Providers: this model is not currently available via any of the supported Inference Providers, and it cannot be deployed to the HF Inference API because the model authors have explicitly turned it off.