---
datasets:
  - nampdn-ai/tiny-textbooks
  - nampdn-ai/tiny-lessons
language:
  - en
---

Built with Axolotl

# Omega 2.6B

This model is derived from Phi 1.3B using layer stacking to double the number of hidden layers. The stacked model was then trained for one epoch on the tiny-textbooks and tiny-lessons datasets.
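The card does not specify the exact stacking scheme, so the sketch below illustrates the general idea under an assumption: each decoder layer is deep-copied and the copies are arranged either interleaved (`0,0,1,1,…`) or concatenated (`0,…,n-1,0,…,n-1`), doubling the depth. The `stack_layers` helper and both mode names are hypothetical, not part of this model's actual recipe.

```python
import copy

def stack_layers(layers, mode="interleave"):
    """Double model depth by duplicating every decoder layer.

    layers: a list (or nn.ModuleList converted to a list) of layer modules.
    mode:   "interleave" -> [L0, L0, L1, L1, ...]
            "concat"     -> [L0, ..., Ln-1, L0, ..., Ln-1]
    Returns a new list twice as long; copies are independent so later
    fine-tuning can diverge the duplicated weights.
    """
    if mode == "interleave":
        return [copy.deepcopy(layer) for layer in layers for _ in (0, 1)]
    elif mode == "concat":
        return [copy.deepcopy(layer) for layer in list(layers) + list(layers)]
    raise ValueError(f"unknown mode: {mode}")

# Toy demonstration with layer "labels" standing in for real modules.
# With a real checkpoint one would stack model.model.layers (a hypothetical
# attribute path; check the architecture's actual module layout) and then
# update the config's num_hidden_layers before continued pretraining.
original = ["L0", "L1", "L2"]
print(stack_layers(original))                    # interleaved duplication
print(stack_layers(original, mode="concat"))     # back-to-back duplication
```

After stacking, the duplicated layers start as exact copies, which is why a further epoch of training (here on tiny-textbooks and tiny-lessons) is needed for the new depth to become useful.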