---
tags:
  - DNA
license: mit
---

# MiniDNA model

This is a distilled version of DNABERT, obtained with the MiniLM distillation technique. It has a BERT architecture with 6 layers and 768 hidden units, and was pre-trained on 6-mer DNA sequences. For more details on the pre-training scheme and methods, please check the original thesis report [link to be added].
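As an illustration of the input format, the sketch below splits a raw DNA sequence into overlapping 6-mers, following the DNABERT convention of space-separated k-mers. The helper function and example sequence are hypothetical, not taken from the original code.

```python
def seq_to_kmers(seq: str, k: int = 6) -> str:
    """Split a DNA sequence into overlapping, space-separated k-mers,
    the input format used by DNABERT-style models."""
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

# Hypothetical example sequence (illustration only)
print(seq_to_kmers("ATGCGTACGT"))
# -> ATGCGT TGCGTA GCGTAC CGTACG GTACGT
```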

## How to Use

The model can be fine-tuned on downstream genomic tasks, e.g. promoter identification.

```python
import torch
from transformers import BertForSequenceClassification

# Load the distilled DNABERT model with a sequence-classification head
model = BertForSequenceClassification.from_pretrained('Peltarion/dnabert-minilm')
```
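For completeness, here is a minimal fine-tuning sketch for a binary promoter/non-promoter setup. It assumes the repository also ships a DNABERT-style 6-mer tokenizer and uses a made-up toy batch; it is not the exact training recipe from the thesis (see the GitHub repository below for that).

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Assumption: a 6-mer tokenizer is available under the same repo id
tokenizer = BertTokenizer.from_pretrained('Peltarion/dnabert-minilm')
model = BertForSequenceClassification.from_pretrained(
    'Peltarion/dnabert-minilm', num_labels=2)  # binary: promoter vs. not

# Toy batch: sequences already split into space-separated 6-mers
texts = ["ATGCGT TGCGTA GCGTAC", "CCCTAA CCTAAC CTAACC"]
labels = torch.tensor([1, 0])

inputs = tokenizer(texts, return_tensors='pt', padding=True)
outputs = model(**inputs, labels=labels)

# One gradient step on the toy batch
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs.loss.backward()
optimizer.step()
```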

More details on how to fine-tune the model, the datasets, and additional source code are available at github.com/joanaapa/Distillation-DNABERT-Promoter.