---
library_name: transformers
license: apache-2.0
datasets:
- Manual-Dataset-Creation-Project/Malum-230
- llm-jp/oasst2-33k-ja
language:
- ja
base_model:
- Qwen/Qwen2.5-7B
inference: false
---
|
|
|
# Matsu-7B
|
|
|
## Description
|
Matsu-7B is an instruction-tuned model built on Qwen2.5-7B and fine-tuned with the oasst2 and Malum-230 datasets.
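
Below is a minimal usage sketch with the transformers library. The repository id `Manual-Dataset-Creation-Project/Matsu-7B` and the use of the standard Qwen2.5 chat template are assumptions for illustration, not details confirmed by this card.

```python
# Minimal usage sketch.
# Assumptions: the repository id below, and that the tokenizer ships
# a Qwen2.5-style chat template usable via apply_chat_template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Manual-Dataset-Creation-Project/Matsu-7B"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "日本の四季について簡単に説明してください。"}
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```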
|
|
|
## Series
|
| Variant | Link |
| --- | --- |
| Malum-230 | [Manual-Dataset-Creation-Project/Malum-230](https://huggingface.co./datasets/Manual-Dataset-Creation-Project/Malum-230) |
| Take-7B | [Manual-Dataset-Creation-Project/Take-7B](https://huggingface.co./Manual-Dataset-Creation-Project/Take-7B) |
|
|
|
## Contributors

- [Sudy](https://huggingface.co./sudy-super)
- [Holy Fox (ほーりーふぉっくす)](https://huggingface.co./Holy-fox)
|
|
|
## Acknowledgments
|
We would like to express our gratitude to [VOLTMIND](https://voltmind.jp/) for providing the computational resources used to train this model. |