shivendrra committed
Commit: 01ea1ce
Parent: e9e5531

Update README.md

Files changed (1):
  README.md +3 -3
README.md CHANGED
@@ -21,16 +21,16 @@ It also has one more BERT based model that has 47million parameters, also capabl
  ### Model Description

  - **Developed by:** [Shivendra Singh]()
- - **License:** [MIT]
+ - **License:** MIT

  ### Model Sources

  - **Repository:** [github/enigma-1.5b](https://github.com/shivendrra/enigma-1.5b)
- - **Papers**: [# Understanding the Natural Language of DNA using Encoder-Decoder Foundation Models with Byte-level Precision](https://arxiv.org/html/2311.02333v2#bib.bib35
+ - **Papers:** [Understanding the Natural Language of DNA using Encoder-Decoder Foundation Models with Byte-level Precision](https://arxiv.org/html/2311.02333v2#bib.bib35)

  ## Uses

- Can be used to generate new sequences of DNA on a given input of tokens. Or can be used for further research. Anyway, it's very basic in nature.
+ It can be used to generate new DNA sequences from a given input of tokens, or for further research; it is quite basic in nature. I'll add more functionality later, including DNA classification and masked-token generation, and maybe even a Mixture-of-Experts (MoE) technique.
  ### Direct Use

  Load the model and use it to generate new sequences; `max_length=512` for the 2.5b model and `256` for the enbert-47m model.
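
For reference, a minimal sketch of the "Direct Use" flow described above: load a checkpoint, encode a seed DNA string, and generate a continuation. The module and class names (`enigma`, `EnigmaTokenizer`, `EnigmaModel`, `generate`) are hypothetical placeholders, not the actual API of the enigma-1.5b repo; only the `max_length` values come from the README.

```python
import torch

# Hypothetical imports; replace with the actual classes from the enigma-1.5b repo.
from enigma import EnigmaTokenizer, EnigmaModel

device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = EnigmaTokenizer()                        # assumed: maps A/T/G/C (and specials) to token ids
model = EnigmaModel.from_pretrained("enigma-2.5b")   # assumed: loads the pretrained weights
model.to(device).eval()

prompt = "ATGCGTACGTTAGC"                            # seed DNA sequence
input_ids = torch.tensor([tokenizer.encode(prompt)], device=device)

with torch.no_grad():
    # max_length=512 for the 2.5b model, 256 for enbert-47m (per the README)
    output_ids = model.generate(input_ids, max_length=512)

print(tokenizer.decode(output_ids[0].tolist()))      # generated DNA sequence
```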