---
library_name: transformers
tags: []
---

# Model Card for LOLA

**LOLA**: Large and Open-Source Multilingual Language Model

## Model Description

This is a fine-tuned version of [dice-research/lola_v1](https://huggingface.co./dice-research/lola_v1), trained for 2 epochs on the [multilingual Alpaca](https://arxiv.org/abs/2309.08958) dataset. The training data is available at https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main/training-data. The following languages are covered:
Bulgarian (bg), Czech (cs), English (en), German (de), Spanish (es), Finnish (fi), French (fr), Portuguese (pt), Russian (ru), and Chinese (zh).
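A minimal usage sketch with the `transformers` library is below. The Alpaca-style prompt template, the use of `trust_remote_code=True` (the base model ships a custom architecture), and the placeholder repository id are assumptions not stated in this card; substitute this model's actual repository id before running.

```python
def build_prompt(instruction: str) -> str:
    """Format an instruction with the standard Alpaca template
    (assumed here, since the card does not state the exact template)."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n### Response:\n"
    )

def generate(instruction: str,
             model_id: str = "dice-research/lola_v1",  # placeholder: use this fine-tune's repo id
             max_new_tokens: int = 128) -> str:
    # Lazy import: loading the model downloads several GB of weights,
    # so nothing heavy happens until this function is called.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # The base model uses custom modeling code, hence trust_remote_code=True;
    # review the remote code before enabling this in production.
    model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

    inputs = tokenizer(build_prompt(instruction), return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Because the model was instruction-tuned, wrapping the query in the same prompt template used during fine-tuning generally gives noticeably better responses than passing raw text.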