---
tags:
  - DNA
license: mit
---

# DistilDNA model

This is a distilled version of DNABERT, obtained with the DistilBERT distillation technique. It has a BERT architecture with 6 layers and 768 hidden units, and was pre-trained on 6-mer DNA sequences. For more details on the pre-training scheme and methods, please see the original thesis report.
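To illustrate the 6-mer input format, DNABERT-style models take a DNA sequence split into overlapping 6-mers (stride 1), joined by spaces. A minimal sketch (the helper name `seq_to_kmers` is ours, not part of the model):

```python
def seq_to_kmers(seq: str, k: int = 6) -> str:
    """Split a DNA sequence into overlapping k-mers (stride 1),
    the input format expected by DNABERT-style models."""
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

print(seq_to_kmers("ATGCGTACGT"))
# -> "ATGCGT TGCGTA GCGTAC CGTACG GTACGT"
```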

## How to Use

The model can be fine-tuned on downstream genomic tasks, e.g. promoter identification.

```python
from transformers import DistilBertForSequenceClassification

# Load the distilled DNABERT checkpoint
model = DistilBertForSequenceClassification.from_pretrained('Peltarion/dnabert-distilbert')
```
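As a starting point for fine-tuning, here is a minimal sketch of a single training step on toy promoter data. It assumes a 6-mer tokenizer can be loaded from the same repository (if not, use the tokenizer from the original DNABERT release); the sequences and labels are made-up placeholders:

```python
import torch
from transformers import AutoTokenizer, DistilBertForSequenceClassification

# Assumption: a 6-mer vocabulary tokenizer is published alongside the model.
tokenizer = AutoTokenizer.from_pretrained('Peltarion/dnabert-distilbert')
model = DistilBertForSequenceClassification.from_pretrained(
    'Peltarion/dnabert-distilbert', num_labels=2  # promoter vs. non-promoter
)

# Toy example: two sequences in 6-mer format with binary labels.
texts = [
    "ATGCGT TGCGTA GCGTAC CGTACG GTACGT",
    "TTTTTA TTTTAA TTTAAA TTAAAA TAAAAA",
]
labels = torch.tensor([1, 0])

inputs = tokenizer(texts, padding=True, return_tensors='pt')
outputs = model(**inputs, labels=labels)

# One gradient step; wrap in a proper training loop for real data.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs.loss.backward()
optimizer.step()
```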

More details on how to fine-tune the model, the datasets, and additional source code are available at github.com/joanaapa/Distillation-DNABERT-Promoter.