---
language: 
- multilingual
- en
- fr
- da
- ja
- vi

datasets: wikipedia

license: apache-2.0
---

# bert-base-en-fr-da-ja-vi-cased

We are sharing smaller versions of [bert-base-multilingual-cased](https://huggingface.co./bert-base-multilingual-cased) that handle a custom number of languages.

Unlike [distilbert-base-multilingual-cased](https://huggingface.co./distilbert-base-multilingual-cased), our versions produce exactly the same representations as the original model, which preserves the original accuracy.


For more information, please refer to our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).

## How to use

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-fr-da-ja-vi-cased")
model = AutoModel.from_pretrained("Geotrend/bert-base-en-fr-da-ja-vi-cased")

```
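Once loaded, the model behaves like any other `bert-base` checkpoint: it returns one contextual vector per token. As an illustration (not part of the original card), the sketch below mean-pools the last hidden state into fixed-size sentence embeddings; `mean_pool` is a hypothetical helper written for this example, not part of the `transformers` API.

```python
import torch


def mean_pool(last_hidden_state, attention_mask):
    """Average token embeddings, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).float()
    return (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)


if __name__ == "__main__":
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-fr-da-ja-vi-cased")
    model = AutoModel.from_pretrained("Geotrend/bert-base-en-fr-da-ja-vi-cased")

    # Batch of sentences in two of the supported languages.
    inputs = tokenizer(
        ["Hello world.", "Bonjour le monde."],
        padding=True,
        return_tensors="pt",
    )
    with torch.no_grad():
        outputs = model(**inputs)

    embeddings = mean_pool(outputs.last_hidden_state, inputs["attention_mask"])
    print(embeddings.shape)  # torch.Size([2, 768]) for a bert-base model
```

Because the representations match the original multilingual model exactly, embeddings computed this way are interchangeable with those from `bert-base-multilingual-cased` for the covered languages.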

To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).

### How to cite

```bibtex
@inproceedings{smallermbert,
  title={Load What You Need: Smaller Versions of Multilingual BERT},
  author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
  booktitle={SustaiNLP / EMNLP},
  year={2020}
}
```

## Contact

Please contact [email protected] for any questions, feedback or requests.