ptrdvn committed (verified)

Commit d2818d3 · Parent(s): bdbafee

Update README.md

Files changed (1): README.md (+20 -0)
 
# How to use

The model was trained to expect an input such as:

```
<<<Query>>>
{your_query_here}

<<<Context>>>
{your_context_here}
```

It then outputs a string containing a single number from 1 to 7.

To obtain a continuous score that can be used for reranking query-context pairs (i.e. a method with few ties), we calculate the expectation value over these scores.
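As a minimal sketch of that calculation (not the repository's own script), assuming we can read the model's log-probabilities for the label tokens "1" through "7", the expectation value can be computed like this:

```python
import math

def expected_score(label_logprobs):
    """Collapse log-probabilities for the labels "1".."7" into a
    continuous score by taking the expectation value.

    `label_logprobs` maps each label token to its (unnormalised)
    log-probability; tokens outside "1".."7" are ignored.
    """
    labels = {str(k) for k in range(1, 8)}
    raw = {t: lp for t, lp in label_logprobs.items() if t in labels}
    # Softmax-renormalise over the retained labels.
    m = max(raw.values())
    weights = {t: math.exp(lp - m) for t, lp in raw.items()}
    total = sum(weights.values())
    return sum(int(t) * w / total for t, w in weights.items())

# Two candidates whose most likely label is the same ("6") but whose
# expectations differ -- the continuous score breaks the tie.
a = expected_score({"5": -1.0, "6": 0.0, "7": -2.0})
b = expected_score({"5": -2.0, "6": 0.0, "7": -1.0})
```

Here an argmax over the labels would score both candidates identically, while the expectation orders them.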
We include scripts to do this in both vLLM and LMDeploy:

#### vLLM

Install [vLLM](https://github.com/vllm-project/vllm/) using `pip install vllm`.
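The scripts shipped with the model are the reference; the following is only a hedged sketch of how the vLLM route could look. The model id and the query/context strings are placeholders, and the logprob plumbing assumes vLLM's offline `LLM.generate` API, which returns per-token dictionaries of `Logprob` objects (with `.decoded_token` and `.logprob`):

```python
import math

def expectation(label_logprobs):
    """Softmax-renormalised expectation over the labels "1".."7"."""
    labels = {str(k) for k in range(1, 8)}
    raw = {t: lp for t, lp in label_logprobs.items() if t in labels}
    m = max(raw.values())
    w = {t: math.exp(lp - m) for t, lp in raw.items()}
    total = sum(w.values())
    return sum(int(t) * v / total for t, v in w.items())

if __name__ == "__main__":
    from vllm import LLM, SamplingParams  # pip install vllm

    # Placeholder model id -- substitute the actual checkpoint name.
    llm = LLM(model="your-org/your-reranker")
    params = SamplingParams(temperature=0.0, max_tokens=1, logprobs=10)

    # The prompt template the model was trained on.
    prompt = (
        "<<<Query>>>\nWhat is the capital of France?\n\n"
        "<<<Context>>>\nParis is the capital and largest city of France."
    )
    out = llm.generate([prompt], params)[0]

    # Logprobs for the first (only) generated token: token id -> Logprob.
    first = out.outputs[0].logprobs[0]
    scores = {lp.decoded_token.strip(): lp.logprob for lp in first.values()}
    print(expectation(scores))
```

Requesting the top-10 logprobs for a single generated token is usually enough to cover all seven label tokens.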
 
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64b63f8ad57e02621dc93c8b/P-XCA3TGHqDSX8k6c4hCE.png)

As we can see, this reranker attains higher IR evaluation metrics than the two baselines we include at all positions apart from @1.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64b63f8ad57e02621dc93c8b/puhhWseBOcIyOEdW4L-B0.png)

We also show that our model is, on average, faster than the BGE reranker v2.

# License

We share this model under an Apache 2.0 license.