After installing FlashAttention 2 (available on PyPI as `flash-attn`), the model can be loaded as follows:

```python
from scooby.modeling import Scooby

flooby_neurips = Scooby.from_pretrained(
    'johahi/neurips-scooby-flash',
    cell_emb_dim=14,
    embedding_dim=1920,
    n_tracks=3,
    return_center_bins_only=True,
    disable_cache=False,
    use_transform_borzoi_emb=False,
)
```
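
FlashAttention 2 kernels run on CUDA GPUs, so after loading you will typically move the model to a GPU and switch it to inference mode. The snippet below is a minimal sketch that assumes `Scooby` behaves like a standard `torch.nn.Module`; it uses only generic PyTorch calls and is not part of the scooby API itself.

```python
import torch

# Minimal post-loading sketch (assumes Scooby is a standard torch.nn.Module).
# FlashAttention 2 kernels require a CUDA-capable GPU.
flooby_neurips = flooby_neurips.to("cuda").eval()

# Optional sanity check: report the number of parameters in the checkpoint.
n_params = sum(p.numel() for p in flooby_neurips.parameters())
print(f"Loaded {n_params / 1e6:.1f}M parameters")
```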

Because `use_transform_borzoi_emb` is disabled, this model performs slightly worse than neurips-scooby (0.52 vs. 0.54 across cell types).
