RaphaelMourad committed
Commit: d4d5e8c
1 Parent(s): 4803bee
Update README.md
README.md CHANGED
@@ -11,7 +11,7 @@ tags:
 
 The Mistral-Codon-v1-16M Large Language Model (LLM) is a pretrained generative DNA sequence model with 16M parameters.
 It is derived from the Mixtral-8x7B-v0.1 model, which was simplified for DNA: the number of layers and the hidden size were reduced.
-The model was pretrained using 24M coding DNA sequences from many different species (vertebrates, plants, bacteria, viruses, ...).
+The model was pretrained using 24M coding DNA sequences (300bp) from many different species (vertebrates, plants, bacteria, viruses, ...).
 
 ## Model Architecture
 
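The README excerpt above describes the pretrained model itself; the sketch below shows one plausible way to load it and extract sequence embeddings with the Hugging Face transformers API. The repository id `RaphaelMourad/Mistral-Codon-v1-16M`, the example DNA string, and the mean-pooling step are illustrative assumptions, not part of this commit.

```python
# Minimal sketch, assuming the model is published on the Hub under the
# author's namespace; adjust the repo id if the actual path differs.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "RaphaelMourad/Mistral-Codon-v1-16M"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Example coding DNA sequence (codons, as in the ~300bp pretraining data);
# the sequence here is arbitrary and only for illustration.
dna = "ATGGTGCTGAGCCCTGCCGACAAGACC"
inputs = tokenizer(dna, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# Mean-pool the last hidden layer to obtain a fixed-size sequence embedding.
embedding = outputs.hidden_states[-1].mean(dim=1)
print(embedding.shape)
```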