How to Use

Load the model for inference:

```python
import torch
from transformers import AutoModel

# trust_remote_code is required because the model class is defined
# in the repository itself, not in the transformers library.
model = AutoModel.from_pretrained("genbio-ai/dummy-ckpt-hf", trust_remote_code=True)

# Collate raw sequences into a model-ready batch, then run a forward pass.
collated_batch = model.genbio_model.collate({"sequences": ["ACGT", "AGCT"]})
logits = model(collated_batch)
print(logits)
print(torch.argmax(logits, dim=-1))  # per-position predicted token IDs
```
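The logits returned by the forward pass are raw scores; a common next step is to convert them to per-position probabilities with a softmax before taking the argmax. A minimal sketch, using a random tensor in place of the real model output (the `(batch, seq_len, vocab)` shape here is an assumption for illustration, not the checkpoint's actual output shape):

```python
import torch

# Stand-in for the model's logits: batch of 2 sequences, 4 positions, 5-token vocab.
logits = torch.randn(2, 4, 5)

# Softmax over the vocabulary dimension gives per-position probabilities.
probs = torch.softmax(logits, dim=-1)

# Argmax over the same dimension gives the predicted token ID per position.
pred_ids = torch.argmax(logits, dim=-1)

print(probs.shape)     # probabilities keep the full (batch, seq_len, vocab) shape
print(pred_ids.shape)  # predictions drop the vocab dimension
```

Because softmax is monotonic, the argmax of the probabilities equals the argmax of the raw logits; the softmax is only needed when you want calibrated scores rather than hard predictions.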
Model size: 4.55M params · Tensor type: F32 (Safetensors)