Adapter roberta-large-qnli_houlsby for roberta-large

An adapter for the QNLI task (with a classification head), trained using the run_glue.py script with an extension that retains the best checkpoint out of 15 training epochs.

This adapter was created for use with the Adapters library.

Usage

First, install adapters:

pip install -U adapters

Now, the adapter can be loaded and activated like this:

from adapters import AutoAdapterModel

# Load the base model and attach the adapter (together with its classification head)
model = AutoAdapterModel.from_pretrained("roberta-large")
adapter_name = model.load_adapter("AdapterHub/roberta-large-qnli_houlsby")

# Activate the adapter so it is used in every forward pass
model.set_active_adapters(adapter_name)
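
Once the adapter is active, its classification head can be used for QNLI-style inference (does the sentence answer the question?). The following is a minimal sketch continuing from the snippet above; the label order is assumed from the GLUE QNLI label list and should be verified against the loaded head's configuration:

import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-large")

# QNLI pairs a question with a context sentence
question = "What is the capital of France?"
sentence = "Paris is the capital and largest city of France."
inputs = tokenizer(question, sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # "model" from the snippet above

# Assumed GLUE QNLI label order: 0 = entailment, 1 = not_entailment
prediction = logits.argmax(dim=-1).item()
print("entailment" if prediction == 0 else "not_entailment")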

Architecture & Training

  • Adapter architecture: houlsby (see the configuration sketch below)
  • Prediction head: classification
  • Dataset: QNLI

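For reference, an adapter with this architecture and head can be set up for training with the Adapters library roughly as follows. This is a sketch, not the exact script used to train this adapter: DoubleSeqBnConfig is the library's name for the Houlsby-style configuration, and the adapter/head name "qnli" is illustrative.

from adapters import AutoAdapterModel, DoubleSeqBnConfig

model = AutoAdapterModel.from_pretrained("roberta-large")

# Houlsby-style adapters insert bottleneck modules after both the attention
# and feed-forward sub-layers of every transformer layer
model.add_adapter("qnli", config=DoubleSeqBnConfig())

# QNLI is a binary sentence-pair task: entailment vs. not_entailment
model.add_classification_head("qnli", num_labels=2)

# Freeze the base model and train only the adapter and head weights
model.train_adapter("qnli")
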
Author Information

Citation

@article{pfeiffer2020AdapterHub,
    title={AdapterHub: A Framework for Adapting Transformers},
    author={Jonas Pfeiffer and
            Andreas R\"uckl\'{e} and
            Clifton Poth and
            Aishwarya Kamath and
            Ivan Vuli\'{c} and
            Sebastian Ruder and
            Kyunghyun Cho and
            Iryna Gurevych},
    journal={ArXiv},
    year={2020}
}

This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/roberta-large-qnli_houlsby.yaml.
