BTLM-3B GGML
Bittensor Language Model (BTLM-3B-8k-base) is a 3 billion parameter language model with an 8k context length trained on 627B tokens of SlimPajama.
This is only a conversion of the model into GGML format; the architecture is not actually implemented in ggml yet, so the converted model cannot be used for inference. Stay tuned!
Ref: https://huggingface.co./cerebras/btlm-3b-8k-base
GGML issue: https://github.com/ggerganov/ggml/issues/427
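
For readers curious what "conversion into GGML format" involves, below is a minimal, illustrative sketch that loosely follows the pattern of the example conversion scripts in the ggml repository: dump hyperparameters, the vocabulary, and fp16 tensors into a single binary file. The output filename, the GPT-2-style config attribute names (n_embd, n_head, n_layer, n_positions), and the exact field layout are assumptions for illustration; the actual BTLM support tracked in ggml issue #427 may use a different layout.

```python
# Illustrative HF -> GGML weight dump (sketch, not the official converter).
import struct
import numpy as np
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "cerebras/btlm-3b-8k-base"
OUT_PATH = "btlm-3b-8k-base-ggml-f16.bin"  # assumed output name

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)
config = model.config

with open(OUT_PATH, "wb") as fout:
    # Magic number identifying a ggml file ("ggml" in hex).
    fout.write(struct.pack("i", 0x67676D6C))

    # Hyperparameters; BTLM exposes a GPT-2-style config, so these
    # attribute names are assumptions.
    for key in ("vocab_size", "n_positions", "n_embd", "n_head", "n_layer"):
        fout.write(struct.pack("i", getattr(config, key)))
    fout.write(struct.pack("i", 1))  # 1 = tensors stored as float16

    # Vocabulary: token count, then (length, utf-8 bytes) per token.
    fout.write(struct.pack("i", tokenizer.vocab_size))
    for token_id in range(tokenizer.vocab_size):
        text = tokenizer.convert_ids_to_tokens(token_id).encode("utf-8")
        fout.write(struct.pack("i", len(text)))
        fout.write(text)

    # Tensors: header (n_dims, name length, dtype), dims, name, raw data.
    for name, tensor in model.state_dict().items():
        data = tensor.float().cpu().numpy().astype(np.float16)
        dims = data.shape[::-1]  # ggml stores dimensions in reverse order
        encoded_name = name.encode("utf-8")
        fout.write(struct.pack("iii", len(dims), len(encoded_name), 1))
        for dim in dims:
            fout.write(struct.pack("i", dim))
        fout.write(encoded_name)
        data.tofile(fout)

print(f"wrote {OUT_PATH}")
```

Even with a file produced this way, ggml still needs graph-construction code for BTLM's architecture before anything can be evaluated, which is what the linked issue tracks.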