---
language:
- en
tags:
- sentence-transformers
- cross-encoder
- text-classification
- generated_from_trainer
- dataset_size:78704
- loss:ListNetLoss
base_model: microsoft/MiniLM-L12-H384-uncased
datasets:
- microsoft/ms_marco
pipeline_tag: text-classification
library_name: sentence-transformers
metrics:
- map
- mrr@10
- ndcg@10
co2_eq_emissions:
emissions: 201.83156300124415
energy_consumed: 0.519244982020273
source: codecarbon
training_type: fine-tuning
on_cloud: false
cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
ram_total_size: 31.777088165283203
hours_used: 1.659
hardware_used: 1 x NVIDIA GeForce RTX 3090
model-index:
- name: CrossEncoder based on microsoft/MiniLM-L12-H384-uncased
results: []
---
# CrossEncoder based on microsoft/MiniLM-L12-H384-uncased
This is a [Cross Encoder](https://www.sbert.net/docs/cross_encoder/usage/usage.html) model finetuned from [microsoft/MiniLM-L12-H384-uncased](https://huggingface.co./microsoft/MiniLM-L12-H384-uncased) on the [ms_marco](https://huggingface.co./datasets/microsoft/ms_marco) dataset using the [sentence-transformers](https://www.SBERT.net) library. It computes scores for pairs of texts, which can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Cross Encoder
- **Base model:** [microsoft/MiniLM-L12-H384-uncased](https://huggingface.co./microsoft/MiniLM-L12-H384-uncased)
- **Maximum Sequence Length:** 512 tokens
- **Number of Output Labels:** 1 label
- **Training Dataset:**
- [ms_marco](https://huggingface.co./datasets/microsoft/ms_marco)
- **Language:** en
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Documentation:** [Cross Encoder Documentation](https://www.sbert.net/docs/cross_encoder/usage/usage.html)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Cross Encoders on Hugging Face](https://huggingface.co./models?library=sentence-transformers&other=cross-encoder)
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import CrossEncoder
# Download from the 🤗 Hub
model = CrossEncoder("tomaarsen/reranker-msmarco-v1.1-MiniLM-L12-H384-uncased-listnet-sigmoid-scale-10")
# Get scores for pairs of texts
pairs = [
    ['How many calories in an egg', 'There are on average between 55 and 80 calories in an egg depending on its size.'],
    ['How many calories in an egg', 'Egg whites are very low in calories, have no fat, no cholesterol, and are loaded with protein.'],
    ['How many calories in an egg', 'Most of the calories in an egg come from the yellow yolk in the center.'],
]
scores = model.predict(pairs)
print(scores.shape)
# (3,)
# Or rank different texts based on similarity to a single text
ranks = model.rank(
    'How many calories in an egg',
    [
        'There are on average between 55 and 80 calories in an egg depending on its size.',
        'Egg whites are very low in calories, have no fat, no cholesterol, and are loaded with protein.',
        'Most of the calories in an egg come from the yellow yolk in the center.',
    ],
)
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
```
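The entries returned by `rank` are sorted by descending score, so mapping each `corpus_id` back to the candidate list yields the reranked passages. A minimal sketch, using placeholder scores rather than real model outputs (the structure of `ranks` mirrors the example above):

```python
# Reorder candidate passages using the ranks returned by model.rank().
# The scores below are illustrative placeholders, not real model outputs.
passages = [
    "There are on average between 55 and 80 calories in an egg depending on its size.",
    "Egg whites are very low in calories, have no fat, no cholesterol, and are loaded with protein.",
    "Most of the calories in an egg come from the yellow yolk in the center.",
]
ranks = [
    {"corpus_id": 0, "score": 0.91},
    {"corpus_id": 2, "score": 0.47},
    {"corpus_id": 1, "score": 0.12},
]
# Entries are already sorted by descending score, so indexing back into
# the passage list gives the reranked order.
reranked = [passages[entry["corpus_id"]] for entry in ranks]
```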
## Evaluation
### Metrics
#### Cross Encoder Reranking
* Datasets: `NanoMSMARCO`, `NanoNFCorpus` and `NanoNQ`
* Evaluated with [CrossEncoderRerankingEvaluator](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderRerankingEvaluator)
| Metric | NanoMSMARCO | NanoNFCorpus | NanoNQ |
|:------------|:---------------------|:---------------------|:---------------------|
| map | 0.5122 (+0.0226) | 0.3306 (+0.0696) | 0.5716 (+0.1520) |
| mrr@10 | 0.5044 (+0.0269) | 0.5401 (+0.0403) | 0.5754 (+0.1487) |
| **ndcg@10** | **0.5840 (+0.0435)** | **0.3676 (+0.0425)** | **0.6431 (+0.1425)** |
#### Cross Encoder Nano BEIR
* Dataset: `NanoBEIR_R100_mean`
* Evaluated with [CrossEncoderNanoBEIREvaluator](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderNanoBEIREvaluator)
| Metric | Value |
|:------------|:---------------------|
| map | 0.4715 (+0.0814) |
| mrr@10 | 0.5400 (+0.0720) |
| **ndcg@10** | **0.5316 (+0.0762)** |
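The ndcg@10 figures above follow the standard definition of normalized discounted cumulative gain: the gain of each document is discounted by the log of its rank, then normalized by the score of the ideal ordering. A minimal pure-Python sketch of that definition (not the evaluator's implementation):

```python
import math

def dcg_at_k(relevances, k=10):
    # Discounted cumulative gain: each relevance is discounted by
    # log2(rank + 1), with ranks starting at 1.
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k=10):
    # Normalize by the DCG of the ideal (descending-relevance) ordering.
    ideal = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0

# A ranking that places the only relevant document first is perfect:
print(ndcg_at_k([1, 0, 0]))  # 1.0
```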
## Training Details
### Training Dataset
#### ms_marco
* Dataset: [ms_marco](https://huggingface.co./datasets/microsoft/ms_marco) at [a47ee7a](https://huggingface.co./datasets/microsoft/ms_marco/tree/a47ee7aae8d7d466ba15f9f0bfac3b3681087b3a)
* Size: 78,704 training samples
* Columns: `query`, `docs`, and `labels`
* Approximate statistics based on the first 1000 samples:
  | | query | docs | labels |
  |:--------|:-------|:-----|:-------|
  | type | string | list | list |
* Samples:
  | query | docs | labels |
  |:------|:-----|:-------|
  | average temperature in may for denver colorado | ["In most years, Denver averages a daily maximum temperature for May that's between 67 and 74 degrees Fahrenheit (19 to 23 degrees Celsius). The minimum temperature usually falls between 42 and 46 °F (5 to 8 °C). The days at Denver warm quickly during May.", 'The highest average temperature in Denver is July at 74 degrees. The coldest average temperature in Denver is December at 28.5 degrees. The most monthly precipitation in Denver occurs in August with 2.7 inches. The Denver weather information is based on the average of the previous 3-7 years of data.', "Climate for Denver, Colorado. Denver's coldest month is January when the average temperature overnight is 15.2°F. In July, the warmest month, the average day time temperature rises to 88.0°F.", "Average Temperatures for Denver. Denver's coldest month is January when the average temperature overnight is 15.2°F. In July, the warmest month, the average day time temperature rises to 88.0°F.", 'Location. This report describes the typical... | [1, 0, 0, 0, 0, ...] |
  | what is brain surgery | ['The term “brain surgery” refers to various medical procedures that involve repairing structural problems with the brain. There are numerous types of brain surgery. The type used is based on the area of the brain and condition being treated. Advances in medical technology let surgeons operate on portions of the brain without a single incision near the head. Brain surgery is a critical and complicated process. The type of brain surgery done depends highly on the condition being treated. For example, a brain aneurysm is typically repaired using an endoscope, but if it has ruptured, a craniotomy may be used.', 'Brain surgery is an operation to treat problems in the brain and surrounding structures. Before surgery, the hair on part of the scalp is shaved and the area is cleaned. The doctor makes a surgical cut through the scalp. The location of this cut depends on where the problem in the brain is located. The surgeon creates a hole in the skull and removes a bone flap.', 'Brain Surgery –... | [1, 0, 0, 0, 0, ...] |
  | whos the girl in terminator genisys | ['Over the weekend, Terminator Genisys grossed $28.7 million to take the third spot at the box office, behind Jurassic World and Inside Out. FYI: Emilia is wearing Dior. 10+ pictures inside of Emilia Clarke and Arnold Schwarzenegger hitting the Terminator Genisys premiere in Japan…. Emilia Clarke is red hot while attending the premiere of her new film Terminator Genisys held at the Roppongi Hills Arena on Monday (July 6) in Tokyo, Japan.', "Jai Courtney, who plays Sarah's protector Kyle Reese (and eventual father to Jason Clarke 's John Connor), revealed that this role was the first time a character he played has fallen in love on screen. I had never fallen in love on screen before.", 'When John Connor (Jason Clarke), leader of the human resistance, sends Sgt. Kyle Reese (Jai Courtney) back to 1984 to protect Sarah Connor (Emilia Clarke) and safeguard the future, an unexpected turn of events creates a fractured timeline.', "On the run from the Terminator, Reese and Sarah share a night ... | [1, 0, 0, 0, 0, ...] |
* Loss: [ListNetLoss](https://sbert.net/docs/package_reference/cross_encoder/losses.html#listnetloss) with these parameters:
```json
{
"pad_value": -1,
"activation_fct": "torch.nn.modules.activation.Sigmoid"
}
```
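For intuition: ListNet converts the model's (sigmoid-activated) scores and the relevance labels into two "top-one" probability distributions via softmax, then minimizes the cross-entropy between them. A pure-Python sketch of that idea, not the library's implementation (padding/masking via `pad_value` is omitted here):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def listnet_loss(logits, labels):
    # Apply the configured activation (Sigmoid) to the raw scores,
    # then form top-one distributions for predictions and labels.
    pred = softmax([sigmoid(z) for z in logits])
    true = softmax([float(y) for y in labels])
    # Cross-entropy between the label and prediction distributions.
    return -sum(t * math.log(p) for t, p in zip(true, pred))

# Scoring the relevant document highest yields a lower loss:
good = listnet_loss([5.0, -5.0, -5.0], [1, 0, 0])
bad = listnet_loss([-5.0, 5.0, -5.0], [1, 0, 0])
```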
### Evaluation Dataset
#### ms_marco
* Dataset: [ms_marco](https://huggingface.co./datasets/microsoft/ms_marco) at [a47ee7a](https://huggingface.co./datasets/microsoft/ms_marco/tree/a47ee7aae8d7d466ba15f9f0bfac3b3681087b3a)
* Size: 1,000 evaluation samples
* Columns: `query`, `docs`, and `labels`
* Approximate statistics based on the first 1000 samples:
  | | query | docs | labels |
  |:--------|:-------|:-----|:-------|
  | type | string | list | list |
* Samples:
  | query | docs | labels |
  |:------|:-----|:-------|
  | lpn salary richmond va | ['$52,000. Average LPN salaries for job postings in Richmond, VA are 1% higher than average LPN salaries for job postings nationwide.', 'A Licensed Practical Nurse (LPN) in Richmond, Virginia earns an average wage of $18.47 per hour. For the first five to ten years in this position, pay increases somewhat, but any additional experience does not have a big effect on pay. $27,369 - $48,339. (Median).', 'Virginia has a growing number of opportunities in the nursing field. Within the state, LPNs make up 25 % of nurses in the state. The Virginia LPN comfort score is 54. This takes into account the average LPN salary, average state salary and cost of living.', 'LPN Salaries and Career Outlook in Richmond. Many LPN graduates choose to work as licensed practical nurses after graduation. If you choose to follow that path and remain in Richmond, your job prospects are good. In 2010, of the 20,060 licensed practical nurses in Virginia, 370 were working in the greater Richmond area.', 'This chart ... | [1, 0, 0, 0, 0, ...] |
  | what is neutrogena | ["Neutrogena is an American brand of skin care, hair care and cosmetics, that is headquartered in Los Angeles, California. According to product advertising at their website, Neutrogena products are distributed in more than 70 countries. Neutrogena was founded in 1930 by Emanuel Stolaroff, and was originally a cosmetics company named Natone. In 1994 Johnson & Johnson acquired Neutrogena for $924 million, at a price of $35.25 per share. Johnson & Johnson's international network helped Neutrogena boost its sales and enter newer markets including India, South Africa, and China. Priced at a premium, Neutrogena products are distributed in over 70 countries.", 'Neutrogena also has retinol products for treating acne that have one thing going for them that most brands do not—they are in the kind of package that keeps the retinol cream fresh and active. Any kind of vitamin you dip out of jar will go bad almost as soon as you open the container due to oxidation.ost of the products Neutrogena make... | [1, 0, 0, 0, 0, ...] |
  | why is lincoln a great leader | ['His commitment to the rights of individuals was a cornerstone of his leadership style (Phillips, 1992). There have been many great leaders throughout the history of this great nation, but Abraham Lincoln is consistently mentioned as one of our greatest leaders. Although Lincoln possessed many characteristics of a great leader, probably his greatest leadership trait was his ability to communicate. Though Lincoln only had one year of formal education, he was able to master language and use his words to influence the people as a great public speaker, debater and as a humorist. Another part of Lincoln’s skills as a great communicator, was that he had a great capacity for learning to listen to different points of view. While president, he created a work environment where his cabinet members were able to disagree with his decisions without the threat of retaliation for doing so.', 'Expressed in his own words, here is Lincoln’s most luminous leadership insight by far: In order to win a man ... | [1, 0, 0, 0, 0, ...] |
* Loss: [ListNetLoss](https://sbert.net/docs/package_reference/cross_encoder/losses.html#listnetloss) with these parameters:
```json
{
"pad_value": -1,
"activation_fct": "torch.nn.modules.activation.Sigmoid"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 6
- `per_device_eval_batch_size`: 16
- `learning_rate`: 2e-05
- `warmup_ratio`: 0.1
- `seed`: 12
- `bf16`: True
- `load_best_model_at_end`: True
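The non-default values above correspond roughly to the following training-arguments sketch, assuming the sentence-transformers v4+ CrossEncoder training API; the output path is hypothetical and dataset/trainer wiring is omitted:

```python
from sentence_transformers.cross_encoder import CrossEncoderTrainingArguments

# Illustrative sketch of the non-default hyperparameters listed above.
args = CrossEncoderTrainingArguments(
    output_dir="models/reranker-minilm-listnet",  # hypothetical path
    eval_strategy="steps",
    per_device_train_batch_size=6,
    per_device_eval_batch_size=16,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    seed=12,
    bf16=True,
    load_best_model_at_end=True,
)
```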
#### All Hyperparameters