Description
This is the best-performing "mBERT-qa-en, skd, mAP@k" model from the paper Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation.
See the official GitHub repository for the code that implements the methods described in the paper.
More info coming soon!
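
In the meantime, here is a minimal usage sketch with the Hugging Face transformers library. The repository ID below is a placeholder (replace it with this model's actual Hub ID), and the question/context pair is purely illustrative.

```python
# Minimal extractive-QA sketch using Hugging Face transformers.
# NOTE: "mbert-qa-en-skd" is a placeholder; replace it with the actual
# repository ID of this model on the Hugging Face Hub.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "mbert-qa-en-skd"  # placeholder repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "Where is the Eiffel Tower located?"
context = "The Eiffel Tower is a wrought-iron lattice tower in Paris, France."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Pick the most likely start/end token positions for the answer span.
start = torch.argmax(outputs.start_logits)
end = torch.argmax(outputs.end_logits) + 1
answer = tokenizer.decode(inputs["input_ids"][0][start:end])
print(answer)  # expected: a span such as "Paris, France"
```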
How to Cite
To cite our work, use the following BibTeX entry:
@misc{carrino2023promoting,
      title={Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation},
      author={Casimiro Pio Carrino and Carlos Escolano and José A. R. Fonollosa},
      year={2023},
      eprint={2309.17134},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}