---
license: apache-2.0
---
# Model Description
This is a retriever model based on ColBERT v2 with the [bert-base-uncased](https://huggingface.co./bert-base-uncased) language model.<br>
This model was trained on the OpenNQ (open-domain Natural Questions) data.<br>
The architecture of the model and the hyperparameters are described in the paper "Relevance-guided Supervision for OpenQA with ColBERT".
## Intended uses & limitations
This model uses the bert-base-uncased LM. Biases associated with the pre-trained language model we used may be present in this ColBERT v2 model.
## Usage
This model can be used with [PrimeQA](https://github.com/primeqa/primeqa)’s [ColBERT](https://github.com/primeqa/primeqa/blob/main/primeqa/ir/README.md) engine.
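As a rough sketch (not taken from the model card), the snippet below shows how a ColBERT v2 checkpoint like this one could be used for indexing and passage search with the upstream [ColBERT](https://github.com/stanford-futuredata/ColBERT) library, which PrimeQA's ColBERT engine builds on. The checkpoint directory, collection file, experiment name, and query string are placeholders; adapt them to your setup.

```python
# Minimal indexing + retrieval sketch using the upstream ColBERT v2 library.
# Paths and names below (checkpoint dir, collection.tsv, experiment name) are
# placeholders, not values from this model card.
from colbert import Indexer, Searcher
from colbert.infra import Run, RunConfig, ColBERTConfig

if __name__ == "__main__":
    # Build a compressed index over a TSV collection of passages (id \t text).
    with Run().context(RunConfig(nranks=1, experiment="opennq")):
        config = ColBERTConfig(nbits=2, root="experiments")
        indexer = Indexer(checkpoint="path/to/this/colbertv2/checkpoint", config=config)
        indexer.index(name="opennq.nbits=2", collection="path/to/collection.tsv")

    # Search the index with a natural-language question.
    with Run().context(RunConfig(nranks=1, experiment="opennq")):
        searcher = Searcher(index="opennq.nbits=2")
        pids, ranks, scores = searcher.search("who wrote the declaration of independence?", k=10)
        for pid, rank, score in zip(pids, ranks, scores):
            print(rank, pid, score)
```

For end-to-end OpenQA pipelines (retrieval plus reading), see the PrimeQA ColBERT documentation linked above, which wraps the same indexing and search steps behind its IR engine.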
## BibTeX entry and citation info
```bibtex
@article{Khattab2021RelevanceguidedSF,
  title   = {Relevance-guided Supervision for OpenQA with ColBERT},
  author  = {Omar Khattab and Christopher Potts and Matei A. Zaharia},
  journal = {Transactions of the Association for Computational Linguistics},
  year    = {2021}
}
```
```bibtex
@inproceedings{Lee2019LatentRF,
  title     = {Latent Retrieval for Weakly Supervised Open Domain Question Answering},
  author    = {Kenton Lee and Ming-Wei Chang and Kristina Toutanova},
  booktitle = {Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL)},
  year      = {2019}
}
```