---
inference: false
---

<br>
<br>
# LWM-Text-Chat-512K Model Card

## Model details

**Model type:**
LWM-Text-Chat-512K is an open-source model trained from LLaMA-2 on a filtered subset of the Books3 dataset. It is an auto-regressive language model based on the transformer architecture.
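
A minimal usage sketch is shown below. It assumes the weights are available in a Hugging Face-compatible LLaMA checkpoint under the repo id `LargeWorldModel/LWM-Text-Chat-512K`; both the repo id and the checkpoint format are assumptions, not confirmed by this card.

```python
# Minimal usage sketch (not an official example). Assumes a Hugging
# Face-compatible LLaMA checkpoint under the repo id below; adjust the
# repo id and device settings to your setup.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LargeWorldModel/LWM-Text-Chat-512K"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Summarize the plot of a long novel in three sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```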

**Model date:**
LWM-Text-Chat-512K was trained in December 2023.

**Paper or resources for more information:**
https://largeworldmodel.github.io/

## License
Llama 2 is licensed under the LLAMA 2 Community License,
Copyright (c) Meta Platforms, Inc. All Rights Reserved.

**Where to send questions or comments about the model:**
https://github.com/LargeWorldModel/lwm/issues

## Training dataset
- A 3,500-document subset of Books3, with 500K to 1M tokens per document (see the filtering sketch below)
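
The filtering step is not described in detail in this card; the sketch below only illustrates the idea of keeping documents whose LLaMA-2 token count falls in the 500K to 1M range. The tokenizer id and the in-memory `documents` list are placeholders, not the authors' actual preprocessing pipeline.

```python
# Illustrative length filter (assumption: documents are selected purely by
# LLaMA-2 token count; the real pipeline is not described in this card).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")  # assumed tokenizer

MIN_TOKENS = 500_000
MAX_TOKENS = 1_000_000

def in_target_range(text: str) -> bool:
    """True if the document is between 500K and 1M LLaMA-2 tokens long."""
    n_tokens = len(tokenizer(text, add_special_tokens=False)["input_ids"])
    return MIN_TOKENS <= n_tokens <= MAX_TOKENS

documents = ["placeholder Books3 document text"]  # placeholder corpus
filtered = [doc for doc in documents if in_target_range(doc)]
```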