---
language:
- "be"
tags:
- "belarusian"
- "pos"
- "dependency-parsing"
base_model: KoichiYasuoka/ltgbert-base-belarusian-upos
datasets:
- "universal_dependencies"
license: "apache-2.0"
pipeline_tag: "token-classification"
widget:
- text: "Я вучуся ў школе"
---

# ltgbert-base-belarusian-ud-goeswith

## Model Description

This is an LTG-BERT model pretrained for POS-tagging and dependency-parsing (using the `goeswith` relation to join subwords), derived from [ltgbert-base-belarusian-upos](https://huggingface.co./KoichiYasuoka/ltgbert-base-belarusian-upos) and fine-tuned on [UD_Belarusian-HSE](https://github.com/UniversalDependencies/Belarusian-HSE).

## How to Use

```py
from transformers import pipeline

nlp = pipeline("universal-dependencies", "KoichiYasuoka/ltgbert-base-belarusian-ud-goeswith", trust_remote_code=True, aggregation_strategy="simple")
print(nlp("Я вучуся ў школе"))  # parse an example Belarusian sentence
```