# mamba2-370m-hf

This repository contains the weights of the original mamba2-370m model, converted to a Hugging Face Transformers compatible format. It is not affiliated with either the original authors or Hugging Face.

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("AntonV/mamba2-370m-hf")
model = AutoModelForCausalLM.from_pretrained("AntonV/mamba2-370m-hf")

# Tokenize the prompt and generate up to 10 new tokens.
input_ids = tokenizer("Hey how are you doing?", return_tensors="pt")["input_ids"]
out = model.generate(input_ids, max_new_tokens=10)
print(tokenizer.batch_decode(out))
```
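
For GPU inference, here is a minimal sketch of loading the model in half precision (the checkpoint is stored in FP16, so this avoids upcasting to FP32); it assumes `torch` is installed and that `accelerate` is available for `device_map="auto"`:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("AntonV/mamba2-370m-hf")
# Load in half precision; device_map="auto" places the model on a GPU if one is available.
model = AutoModelForCausalLM.from_pretrained(
    "AntonV/mamba2-370m-hf",
    torch_dtype=torch.float16,
    device_map="auto",
)

# Move inputs to the same device as the model before generating.
input_ids = tokenizer("Hey how are you doing?", return_tensors="pt")["input_ids"].to(model.device)
out = model.generate(input_ids, max_new_tokens=10)
print(tokenizer.batch_decode(out, skip_special_tokens=True))
```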

## Citation

BibTeX:

```bibtex
@inproceedings{mamba2,
  title={Transformers are {SSM}s: Generalized Models and Efficient Algorithms Through Structured State Space Duality},
  author={Dao, Tri and Gu, Albert},
  booktitle={International Conference on Machine Learning (ICML)},
  year={2024}
}
```