Xunzi-Qwen1.5-7B-upos

Model Description

This is a Qwen1.5 model for POS-tagging of Classical Chinese, derived from Xunzi-Qwen1.5-7B (which was pre-trained on Classical Chinese texts). Every word is tagged with its UPOS (Universal Part-Of-Speech) tag and FEATS (universal features).
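
To see the concrete tag set, the label inventory stored in the model config can be printed with a small sketch like the one below; the exact labels (typically UPOS tags, optionally combined with FEATS and B-/I- prefixes used for aggregation) depend on the checkpoint.

```py
from transformers import AutoConfig

# Print every label the token-classification head can emit
cfg = AutoConfig.from_pretrained("KoichiYasuoka/Xunzi-Qwen1.5-7B-upos")
print(sorted(cfg.id2label.values()))
```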

How to Use

```py
from transformers import pipeline

# "upos" is a custom token-classification pipeline shipped with the model,
# hence trust_remote_code=True
nlp = pipeline("upos", "KoichiYasuoka/Xunzi-Qwen1.5-7B-upos",
               trust_remote_code=True, aggregation_strategy="simple")
print(nlp("不入虎穴不得虎子"))
```
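
If you would rather not rely on the remote pipeline code, the sketch below does roughly the same tagging with the plain token-classification API. It assumes the checkpoint loads through `AutoModelForTokenClassification` (either your `transformers` release already includes `Qwen2ForTokenClassification`, or the repository's remote code supplies it via `trust_remote_code=True`), and it prints one raw label per token rather than the aggregated word spans that `aggregation_strategy="simple"` produces.

```py
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

mdl_id = "KoichiYasuoka/Xunzi-Qwen1.5-7B-upos"
tkz = AutoTokenizer.from_pretrained(mdl_id)
mdl = AutoModelForTokenClassification.from_pretrained(mdl_id, trust_remote_code=True)

text = "不入虎穴不得虎子"
enc = tkz(text, return_tensors="pt", return_offsets_mapping=True)
offsets = enc.pop("offset_mapping")[0].tolist()  # keep offsets out of the model inputs

with torch.no_grad():
    logits = mdl(**enc).logits  # shape: (1, sequence_length, num_labels)

# Highest-scoring label for each token, paired with the text span it covers
for (start, end), label_id in zip(offsets, logits.argmax(dim=-1)[0].tolist()):
    print(text[start:end], mdl.config.id2label[label_id])
```

The custom pipeline above additionally merges subword pieces into word spans and strips any B-/I- prefixes, which is usually the more convenient output.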

Reference

Koichi Yasuoka: "POS-Tagging by Sequence Labeling with GPT-Type Models" (in Japanese), 東洋学へのコンピュータ利用, 38th Seminar (July 26, 2024), pp.3-10.
