---
license: mit
---
This is a float32, 110M-parameter Llama 2-architecture model trained on the TinyStories dataset.
The weights were converted from
[karpathy/tinyllamas](https://huggingface.co./karpathy/tinyllamas).
See the [llama2.c](https://github.com/karpathy/llama2.c) project for more details.