Merge branch 'main' of https://huggingface.co./UWB-AIR/Czert-B-base-cased-long-zero-shot into main
README.md
---
tags:
- cs
- fill-mask
---
# CZERT
This repository contains the trained Czert-B-base-cased-long-zero-shot model for the paper [Czert – Czech BERT-like Model for Language Representation](https://arxiv.org/abs/2103.13031).
For more information, see the paper.

This is the long version of Czert-B-base-cased, created without any fine-tuning on long documents. The positional embeddings were created by simply repeating the positional embeddings of the original Czert-B model. For tokenization, please use BertTokenizer; the model cannot be used with AutoTokenizer.
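A minimal loading sketch in Python (the model class below is an assumption, loaded through the generic AutoModel entry point; the card only states that tokenization must use BertTokenizer rather than AutoTokenizer):

```python
from transformers import AutoModel, BertTokenizer

repo = "UWB-AIR/Czert-B-base-cased-long-zero-shot"

# The card requires BertTokenizer explicitly; AutoTokenizer does not work for this model.
tokenizer = BertTokenizer.from_pretrained(repo)

# Assumption: the weights can be loaded via the generic AutoModel class.
model = AutoModel.from_pretrained(repo)

# Encode an example Czech sentence and obtain contextual embeddings.
inputs = tokenizer("Tento repozitář obsahuje model Czert-B.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```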

## Available Models