German BERT large

Released in October 2020, this is a German BERT large language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train the model and show that it outperforms its predecessors.

Overview

Paper: "German's Next Language Model" (https://arxiv.org/abs/2010.10906)
Architecture: BERT large
Language: German
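
A minimal usage sketch with the Hugging Face transformers library, shown here for masked-token prediction (the example sentence is illustrative; the snippet assumes transformers and PyTorch are installed):

```python
from transformers import pipeline

# Load gbert-large as a fill-mask pipeline; the model was trained with a
# masked-language-modeling objective, so [MASK] prediction works out of
# the box.
fill_mask = pipeline("fill-mask", model="deepset/gbert-large")

# Print the top predictions for the masked token with their scores.
for pred in fill_mask("Die Hauptstadt von Deutschland ist [MASK]."):
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")
```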

Performance

GermEval18 Coarse: 80.08
GermEval18 Fine:   52.48
GermEval14:        88.16
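
These scores come from fine-tuning on the respective tasks. As a rough sketch of how such a setup starts (not the exact configuration from our paper), the snippet below attaches a freshly initialized classification head for a GermEval18-style coarse task; the label names and example sentence are placeholders:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder label set for a binary, GermEval18-coarse-style task.
labels = ["OTHER", "OFFENSE"]

tokenizer = AutoTokenizer.from_pretrained("deepset/gbert-large")
model = AutoModelForSequenceClassification.from_pretrained(
    "deepset/gbert-large", num_labels=len(labels)
)

# Tokenize a sample sentence and run a forward pass. The classification
# head is randomly initialized, so without fine-tuning the predicted
# label is arbitrary.
batch = tokenizer(
    ["Das ist ein ganz normaler Satz."],
    padding=True,
    truncation=True,
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**batch).logits
print(labels[logits.argmax(dim=-1).item()])
```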

See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator

Authors

Branden Chan: [email protected]
Stefan Schweter: [email protected]
Timo Möller: [email protected]

About us

deepset is the company behind the production-ready open-source AI framework Haystack.

Get in touch and join the Haystack community

For more info on Haystack, visit our GitHub repo and Documentation.

We also have a Discord community open to everyone!

Twitter | LinkedIn | Discord | GitHub Discussions | Website | YouTube

By the way: we're hiring!
