Commit 9416c5a (verified) · 1 parent: 755187b
Committed by tomaarsen (HF staff)

Add new SentenceTransformer model
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
README.md ADDED
@@ -0,0 +1,819 @@
1
+ ---
2
+ language:
3
+ - en
4
+ tags:
5
+ - sentence-transformers
6
+ - sentence-similarity
7
+ - feature-extraction
8
+ - generated_from_trainer
9
+ - dataset_size:100231
10
+ - loss:DebiasedMultipleNegativesRankingLoss
11
+ base_model: answerdotai/ModernBERT-base
12
+ widget:
13
+ - source_sentence: who led the army that defeated the aztecs
14
+ sentences:
15
+ - Spanish conquest of the Aztec Empire The Spanish conquest of the Aztec Empire,
16
+ or the Spanish-Aztec War (1519-21)[3] was one of the most significant and complex
17
+ events in world history. There are multiple sixteenth-century narratives of the
18
+ events by Spanish conquerors, their indigenous allies, and the defeated Aztecs.
19
+ It was not solely a contest between a small contingent of Spaniards defeating
20
+ the Aztec Empire, but rather the creation of a coalition of Spanish invaders with
21
+ tributaries to the Aztecs, and most especially the Aztecs' indigenous enemies
22
+ and rivals. They combined forces to defeat the Mexica of Tenochtitlan over a two-year
23
+ period. For the Spanish, the expedition to Mexico was part of a project of Spanish
24
+ colonization of the New World after twenty-five years of permanent Spanish settlement
25
+ and further exploration in the Caribbean. The Spanish made landfall in Mexico
26
+ in 1517. A Spanish settler in Cuba, Hernán Cortés, led an expedition (entrada)
27
+ to Mexico, landing in February 1519, following an earlier expedition led by Juan
28
+ de Grijalva to Yucatán in 1517. Two years later Cortés and his retinue set sail,
29
+ thus beginning the expedition of exploration and conquest.[4] The Spanish campaign
30
+ against the Aztec Empire had its final victory on August 13, 1521, when a coalition
31
+ army of Spanish forces and native Tlaxcalan warriors led by Cortés and Xicotencatl
32
+ the Younger captured the emperor Cuauhtemoc and Tenochtitlan, the capital of the
33
+ Aztec Empire. The fall of Tenochtitlan marks the beginning of Spanish rule in
34
+ central Mexico, and they established their capital of Mexico City on the ruins
35
+ of Tenochtitlan.
36
+ - The Girl with All the Gifts Justineau awakens in the Rosalind Franklin. Melanie
37
+ leads her to a group of intelligent hungries, to whom Justineau, wearing an environmental
38
+ protection suit, starts teaching the alphabet.
39
+ - 'Wendy Makkena In 1992 she had a supporting role in the movie Sister Act as the
40
+ shy but talented singing nun Sister Mary Robert, a role she reprised in Sister
41
+ Act 2: Back in the Habit the following year. She appeared in various other television
42
+ roles until 1997, when she starred in Air Bud, followed by the independent film
43
+ Finding North. She continued appearing on television shows such as The Job, Oliver
44
+ Beene, and Listen Up![citation needed]'
45
+ - source_sentence: who went to the most nba finals in a row
46
+ sentences:
47
+ - List of NBA franchise post-season streaks The San Antonio Spurs hold the longest
48
+ active consecutive playoff appearances with 21 appearances, starting in the 1998
49
+ NBA Playoffs (also the longest active playoff streak in any major North American
50
+ sports league as of 2017). The Spurs have won five NBA championships during the
51
+ streak. The Philadelphia 76ers (formerly known as Syracuse Nationals) hold the
52
+ all-time record for consecutive playoff appearances with 22 straight appearances
53
+ between 1950 and 1971. The 76ers won two NBA championships during their streak.
54
+ The Boston Celtics hold the longest consecutive NBA Finals appearance streak with
55
+ ten appearances between 1957 and 1966. During the streak, the Celtics won eight
56
+ consecutive NBA championships—also an NBA record.
57
+ - Dear Dumb Diary Dear Dumb Diary is a series of children's novels by Jim Benton.
58
+ Each book is written in the first person view of a middle school girl named Jamie
59
+ Kelly. The series is published by Scholastic in English and Random House in Korean.
60
+ Film rights to the series have been optioned by the Gotham Group.[2]
61
+ - Voting rights in the United States Eligibility to vote in the United States is
62
+ established both through the federal constitution and by state law. Several constitutional
63
+ amendments (the 15th, 19th, and 26th specifically) require that voting rights
64
+ cannot be abridged on account of race, color, previous condition of servitude,
65
+ sex, or age for those above 18; the constitution as originally written did not
66
+ establish any such rights during 1787–1870. In the absence of a specific federal
67
+ law or constitutional provision, each state is given considerable discretion to
68
+ establish qualifications for suffrage and candidacy within its own respective
69
+ jurisdiction; in addition, states and lower level jurisdictions establish election
70
+ systems, such as at-large or single member district elections for county councils
71
+ or school boards.
72
+ - source_sentence: who did the vocals on mcdonald's jingle i'm loving it
73
+ sentences:
74
+ - I'm Lovin' It (song) "I'm Lovin' It" is a song recorded by American singer-songwriter
75
+ Justin Timberlake. It was written by Pusha T and produced by The Neptunes.
76
+ - Vallabhbhai Patel As the first Home Minister and Deputy Prime Minister of India,
77
+ Patel organised relief efforts for refugees fleeing from Punjab and Delhi and
78
+ worked to restore peace across the nation. He led the task of forging a united
79
+ India, successfully integrating into the newly independent nation those British
80
+ colonial provinces that had been "allocated" to India. Besides those provinces
81
+ that had been under direct British rule, approximately 565 self-governing princely
82
+ states had been released from British suzerainty by the Indian Independence Act
83
+ of 1947. Employing frank diplomacy with the expressed option to deploy military
84
+ force, Patel persuaded almost every princely state to accede to India. His commitment
85
+ to national integration in the newly independent country was total and uncompromising,
86
+ earning him the sobriquet "Iron Man of India".[3] He is also affectionately remembered
87
+ as the "Patron saint of India's civil servants" for having established the modern
88
+ all-India services system. He is also called the Unifier of India.[4]
89
+ - National debt of the United States As of July 31, 2018, debt held by the public
90
+ was $15.6 trillion and intragovernmental holdings were $5.7 trillion, for a total
91
+ or "National Debt" of $21.3 trillion.[5] Debt held by the public was approximately
92
+ 77% of GDP in 2017, ranked 43rd highest out of 207 countries.[6] The Congressional
93
+ Budget Office forecast in April 2018 that the ratio will rise to nearly 100% by
94
+ 2028, perhaps higher if current policies are extended beyond their scheduled expiration
95
+ date.[7] As of December 2017, $6.3 trillion or approximately 45% of the debt held
96
+ by the public was owned by foreign investors, the largest being China (about $1.18
97
+ trillion) then Japan (about $1.06 trillion).[8]
98
+ - source_sentence: who is the actress of harley quinn in suicide squad
99
+ sentences:
100
+ - Tariffs in United States history Tariffs were the main source of revenue for the
101
+ federal government from 1789 to 1914. During this period, there was vigorous debate
102
+ between the various political parties over the setting of tariff rates. In general
103
+ Democrats favored a tariff that would pay the cost of government, but no higher.
104
+ Whigs and Republicans favored higher tariffs to protect and encourage American
105
+ industry and industrial workers. Since the early 20th century, however, U.S. tariffs
106
+ have been very low and have been much less a matter of partisan debate.
107
+ - The Rolling Stones The Rolling Stones are an English rock band formed in London,
108
+ England in 1962. The first stable line-up consisted of Brian Jones (guitar, harmonica),
109
+ Mick Jagger (lead vocals), Keith Richards (guitar, backing vocals), Bill Wyman
110
+ (bass), Charlie Watts (drums), and Ian Stewart (piano). Stewart was removed from
111
+ the official line-up in 1963 but continued as a touring member until his death
112
+ in 1985. Jones left the band less than a month prior to his death in 1969, having
113
+ already been replaced by Mick Taylor, who remained until 1974. After Taylor left
114
+ the band, Ronnie Wood took his place in 1975 and has been on guitar in tandem
115
+ with Richards ever since. Following Wyman's departure in 1993, Darryl Jones joined
116
+ as their touring bassist. Touring keyboardists for the band have been Nicky Hopkins
117
+ (1967–1982), Ian McLagan (1978–1981), Billy Preston (through the mid-1970s) and
118
+ Chuck Leavell (1982–present). The band was first led by Brian Jones, but after
119
+ developing into the band's songwriters, Jagger and Richards assumed leadership
120
+ while Jones dealt with legal and personal troubles.
121
+ - Margot Robbie After moving to the United States, Robbie starred in the short-lived
122
+ ABC drama series Pan Am (2011–2012). In 2013, she made her big screen debut in
123
+ Richard Curtis's romantic comedy-drama film About Time and co-starred in Martin
124
+ Scorsese's biographical black comedy The Wolf of Wall Street. In 2015, Robbie
125
+ co-starred in the romantic comedy-drama film Focus, appeared in the romantic World
126
+ War II drama film Suite Française and starred in the science fiction film Z for
127
+ Zachariah. That same year, she played herself in The Big Short. In 2016, she portrayed
128
+ Jane Porter in the action-adventure film The Legend of Tarzan and Harley Quinn
129
+ in the superhero film Suicide Squad. She appeared on Time magazine's "The Most
130
+ Influential People of 2017" list.[4]
131
+ - source_sentence: what is meaning of am and pm in time
132
+ sentences:
133
+ - America's Got Talent America's Got Talent (often abbreviated as AGT) is a televised
134
+ American talent show competition, broadcast on the NBC television network. It
135
+ is part of the global Got Talent franchise created by Simon Cowell, and is produced
136
+ by Fremantle North America and SYCOtv, with distribution done by Fremantle. Since
137
+ its premiere in June 2006, each season is run during the network's summer schedule,
138
+ with the show having featured various hosts - it is currently hosted by Tyra Banks,
139
+ since 2017.[2] It is the first global edition of the franchise, after plans for
140
+ a British edition in 2005 were suspended, following a dispute between Paul O'Grady,
141
+ the planned host, and the British broadcaster ITV; production of this edition
142
+ later resumed in 2007.[3]
143
+ - Times Square Times Square is a major commercial intersection, tourist destination,
144
+ entertainment center and neighborhood in the Midtown Manhattan section of New
145
+ York City at the junction of Broadway and Seventh Avenue. It stretches from West
146
+ 42nd to West 47th Streets.[1] Brightly adorned with billboards and advertisements,
147
+ Times Square is sometimes referred to as "The Crossroads of the World",[2] "The
148
+ Center of the Universe",[3] "the heart of The Great White Way",[4][5][6] and the
149
+ "heart of the world".[7] One of the world's busiest pedestrian areas,[8] it is
150
+ also the hub of the Broadway Theater District[9] and a major center of the world's
151
+ entertainment industry.[10] Times Square is one of the world's most visited tourist
152
+ attractions, drawing an estimated 50 million visitors annually.[11] Approximately
153
+ 330,000 people pass through Times Square daily,[12] many of them tourists,[13]
154
+ while over 460,000 pedestrians walk through Times Square on its busiest days.[7]
155
+ - '12-hour clock The 12-hour clock is a time convention in which the 24 hours of
156
+ the day are divided into two periods:[1] a.m. (from the Latin, ante meridiem,
157
+ meaning before midday) and p.m. (post meridiem, meaning past midday).[2] Each
158
+ period consists of 12 hours numbered: 12 (acting as zero),[3] 1, 2, 3, 4, 5, 6,
159
+ 7, 8, 9, 10, and 11. The 24 hour/day cycle starts at 12 midnight (often indicated
160
+ as 12 a.m.), runs through 12 noon (often indicated as 12 p.m.), and continues
161
+ to the midnight at the end of the day. The 12-hour clock was developed over time
162
+ from the mid-second millennium BC to the 16th century AD.'
163
+ datasets:
164
+ - sentence-transformers/natural-questions
165
+ pipeline_tag: sentence-similarity
166
+ library_name: sentence-transformers
167
+ metrics:
168
+ - cosine_accuracy@1
169
+ - cosine_accuracy@3
170
+ - cosine_accuracy@5
171
+ - cosine_accuracy@10
172
+ - cosine_precision@1
173
+ - cosine_precision@3
174
+ - cosine_precision@5
175
+ - cosine_precision@10
176
+ - cosine_recall@1
177
+ - cosine_recall@3
178
+ - cosine_recall@5
179
+ - cosine_recall@10
180
+ - cosine_ndcg@10
181
+ - cosine_mrr@10
182
+ - cosine_map@100
183
+ model-index:
184
+ - name: SentenceTransformer based on answerdotai/ModernBERT-base
185
+ results:
186
+ - task:
187
+ type: information-retrieval
188
+ name: Information Retrieval
189
+ dataset:
190
+ name: NanoMSMARCO
191
+ type: NanoMSMARCO
192
+ metrics:
193
+ - type: cosine_accuracy@1
194
+ value: 0.14
195
+ name: Cosine Accuracy@1
196
+ - type: cosine_accuracy@3
197
+ value: 0.24
198
+ name: Cosine Accuracy@3
199
+ - type: cosine_accuracy@5
200
+ value: 0.3
201
+ name: Cosine Accuracy@5
202
+ - type: cosine_accuracy@10
203
+ value: 0.4
204
+ name: Cosine Accuracy@10
205
+ - type: cosine_precision@1
206
+ value: 0.14
207
+ name: Cosine Precision@1
208
+ - type: cosine_precision@3
209
+ value: 0.07999999999999999
210
+ name: Cosine Precision@3
211
+ - type: cosine_precision@5
212
+ value: 0.06
213
+ name: Cosine Precision@5
214
+ - type: cosine_precision@10
215
+ value: 0.04000000000000001
216
+ name: Cosine Precision@10
217
+ - type: cosine_recall@1
218
+ value: 0.14
219
+ name: Cosine Recall@1
220
+ - type: cosine_recall@3
221
+ value: 0.24
222
+ name: Cosine Recall@3
223
+ - type: cosine_recall@5
224
+ value: 0.3
225
+ name: Cosine Recall@5
226
+ - type: cosine_recall@10
227
+ value: 0.4
228
+ name: Cosine Recall@10
229
+ - type: cosine_ndcg@10
230
+ value: 0.25076046577886124
231
+ name: Cosine Ndcg@10
232
+ - type: cosine_mrr@10
233
+ value: 0.20557936507936506
234
+ name: Cosine Mrr@10
235
+ - type: cosine_map@100
236
+ value: 0.21939187046366332
237
+ name: Cosine Map@100
238
+ - task:
239
+ type: information-retrieval
240
+ name: Information Retrieval
241
+ dataset:
242
+ name: NanoHotpotQA
243
+ type: NanoHotpotQA
244
+ metrics:
245
+ - type: cosine_accuracy@1
246
+ value: 0.14
247
+ name: Cosine Accuracy@1
248
+ - type: cosine_accuracy@3
249
+ value: 0.28
250
+ name: Cosine Accuracy@3
251
+ - type: cosine_accuracy@5
252
+ value: 0.3
253
+ name: Cosine Accuracy@5
254
+ - type: cosine_accuracy@10
255
+ value: 0.36
256
+ name: Cosine Accuracy@10
257
+ - type: cosine_precision@1
258
+ value: 0.14
259
+ name: Cosine Precision@1
260
+ - type: cosine_precision@3
261
+ value: 0.09333333333333332
262
+ name: Cosine Precision@3
263
+ - type: cosine_precision@5
264
+ value: 0.064
265
+ name: Cosine Precision@5
266
+ - type: cosine_precision@10
267
+ value: 0.038
268
+ name: Cosine Precision@10
269
+ - type: cosine_recall@1
270
+ value: 0.07
271
+ name: Cosine Recall@1
272
+ - type: cosine_recall@3
273
+ value: 0.14
274
+ name: Cosine Recall@3
275
+ - type: cosine_recall@5
276
+ value: 0.16
277
+ name: Cosine Recall@5
278
+ - type: cosine_recall@10
279
+ value: 0.19
280
+ name: Cosine Recall@10
281
+ - type: cosine_ndcg@10
282
+ value: 0.15720914647954295
283
+ name: Cosine Ndcg@10
284
+ - type: cosine_mrr@10
285
+ value: 0.2121904761904762
286
+ name: Cosine Mrr@10
287
+ - type: cosine_map@100
288
+ value: 0.12322210117624575
289
+ name: Cosine Map@100
290
+ - task:
291
+ type: nano-beir
292
+ name: Nano BEIR
293
+ dataset:
294
+ name: NanoBEIR mean
295
+ type: NanoBEIR_mean
296
+ metrics:
297
+ - type: cosine_accuracy@1
298
+ value: 0.14
299
+ name: Cosine Accuracy@1
300
+ - type: cosine_accuracy@3
301
+ value: 0.26
302
+ name: Cosine Accuracy@3
303
+ - type: cosine_accuracy@5
304
+ value: 0.3
305
+ name: Cosine Accuracy@5
306
+ - type: cosine_accuracy@10
307
+ value: 0.38
308
+ name: Cosine Accuracy@10
309
+ - type: cosine_precision@1
310
+ value: 0.14
311
+ name: Cosine Precision@1
312
+ - type: cosine_precision@3
313
+ value: 0.08666666666666666
314
+ name: Cosine Precision@3
315
+ - type: cosine_precision@5
316
+ value: 0.062
317
+ name: Cosine Precision@5
318
+ - type: cosine_precision@10
319
+ value: 0.03900000000000001
320
+ name: Cosine Precision@10
321
+ - type: cosine_recall@1
322
+ value: 0.10500000000000001
323
+ name: Cosine Recall@1
324
+ - type: cosine_recall@3
325
+ value: 0.19
326
+ name: Cosine Recall@3
327
+ - type: cosine_recall@5
328
+ value: 0.22999999999999998
329
+ name: Cosine Recall@5
330
+ - type: cosine_recall@10
331
+ value: 0.29500000000000004
332
+ name: Cosine Recall@10
333
+ - type: cosine_ndcg@10
334
+ value: 0.2039848061292021
335
+ name: Cosine Ndcg@10
336
+ - type: cosine_mrr@10
337
+ value: 0.20888492063492065
338
+ name: Cosine Mrr@10
339
+ - type: cosine_map@100
340
+ value: 0.17130698581995454
341
+ name: Cosine Map@100
342
+ ---

# SentenceTransformer based on answerdotai/ModernBERT-base

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) on the [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) <!-- at revision 6e461621ae9e2dffc138de99490e9baee354deb5 -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions)
- **Language:** en
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: ModernBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("tomaarsen/ModernBERT-base-nq-debiased-mnrl")
# Run inference
sentences = [
    'what is meaning of am and pm in time',
    '12-hour clock The 12-hour clock is a time convention in which the 24 hours of the day are divided into two periods:[1] a.m. (from the Latin, ante meridiem, meaning before midday) and p.m. (post meridiem, meaning past midday).[2] Each period consists of 12 hours numbered: 12 (acting as zero),[3] 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, and 11. The 24 hour/day cycle starts at 12 midnight (often indicated as 12 a.m.), runs through 12 noon (often indicated as 12 p.m.), and continues to the midnight at the end of the day. The 12-hour clock was developed over time from the mid-second millennium BC to the 16th century AD.',
    "America's Got Talent America's Got Talent (often abbreviated as AGT) is a televised American talent show competition, broadcast on the NBC television network. It is part of the global Got Talent franchise created by Simon Cowell, and is produced by Fremantle North America and SYCOtv, with distribution done by Fremantle. Since its premiere in June 2006, each season is run during the network's summer schedule, with the show having featured various hosts - it is currently hosted by Tyra Banks, since 2017.[2] It is the first global edition of the franchise, after plans for a British edition in 2005 were suspended, following a dispute between Paul O'Grady, the planned host, and the British broadcaster ITV; production of this edition later resumed in 2007.[3]",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
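
Beyond pairwise similarity, the same embeddings can drive semantic search. Below is a minimal sketch of ranking a small corpus against a query with this model; the corpus, query, and printing format are illustrative placeholders rather than part of the committed card.

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("tomaarsen/ModernBERT-base-nq-debiased-mnrl")

# Illustrative corpus and query (placeholders, not taken from the training data)
corpus = [
    "The Boston Celtics hold the longest consecutive NBA Finals appearance streak, with ten appearances between 1957 and 1966.",
    "The 12-hour clock divides the day into a.m. (ante meridiem) and p.m. (post meridiem).",
    "Margot Robbie portrayed Harley Quinn in the superhero film Suicide Squad.",
]
query = "who played harley quinn in suicide squad"

# Encode the query and corpus, then rank corpus entries by cosine similarity
query_embedding = model.encode([query])
corpus_embeddings = model.encode(corpus)
scores = model.similarity(query_embedding, corpus_embeddings)[0]  # shape: (len(corpus),)

# Print the corpus entries from most to least similar
for idx in scores.argsort(descending=True):
    idx = int(idx)
    print(f"{scores[idx].item():.4f}  {corpus[idx]}")
```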

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `NanoMSMARCO` and `NanoHotpotQA`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | NanoMSMARCO | NanoHotpotQA |
|:--------------------|:------------|:-------------|
| cosine_accuracy@1   | 0.14        | 0.14         |
| cosine_accuracy@3   | 0.24        | 0.28         |
| cosine_accuracy@5   | 0.3         | 0.3          |
| cosine_accuracy@10  | 0.4         | 0.36         |
| cosine_precision@1  | 0.14        | 0.14         |
| cosine_precision@3  | 0.08        | 0.0933       |
| cosine_precision@5  | 0.06        | 0.064        |
| cosine_precision@10 | 0.04        | 0.038        |
| cosine_recall@1     | 0.14        | 0.07         |
| cosine_recall@3     | 0.24        | 0.14         |
| cosine_recall@5     | 0.3         | 0.16         |
| cosine_recall@10    | 0.4         | 0.19         |
| **cosine_ndcg@10**  | **0.2508**  | **0.1572**   |
| cosine_mrr@10       | 0.2056      | 0.2122       |
| cosine_map@100      | 0.2194      | 0.1232       |

#### Nano BEIR

* Dataset: `NanoBEIR_mean`
* Evaluated with [<code>NanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.NanoBEIREvaluator)

| Metric              | Value     |
|:--------------------|:----------|
| cosine_accuracy@1   | 0.14      |
| cosine_accuracy@3   | 0.26      |
| cosine_accuracy@5   | 0.3       |
| cosine_accuracy@10  | 0.38      |
| cosine_precision@1  | 0.14      |
| cosine_precision@3  | 0.0867    |
| cosine_precision@5  | 0.062     |
| cosine_precision@10 | 0.039     |
| cosine_recall@1     | 0.105     |
| cosine_recall@3     | 0.19      |
| cosine_recall@5     | 0.23      |
| cosine_recall@10    | 0.295     |
| **cosine_ndcg@10**  | **0.204** |
| cosine_mrr@10       | 0.2089    |
| cosine_map@100      | 0.1713    |
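
For reference, the sketch below shows how numbers like these could be reproduced with the NanoBEIR evaluator linked above. The `dataset_names` values and the result key are assumptions based on the evaluator's documented usage and the training-log column names, not a script taken from this commit.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

model = SentenceTransformer("tomaarsen/ModernBERT-base-nq-debiased-mnrl")

# Evaluate on the two NanoBEIR subsets reported in the tables above
evaluator = NanoBEIREvaluator(dataset_names=["msmarco", "hotpotqa"])
results = evaluator(model)

# Cosine NDCG@10 averaged over the selected subsets (key name assumed)
print(results["NanoBEIR_mean_cosine_ndcg@10"])
```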

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### natural-questions

* Dataset: [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 100,231 training samples
* Columns: <code>query</code> and <code>answer</code>
* Approximate statistics based on the first 1000 samples:
  |         | query                                                                              | answer                                                                               |
  |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                               |
  | details | <ul><li>min: 10 tokens</li><li>mean: 12.46 tokens</li><li>max: 25 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 139.02 tokens</li><li>max: 537 tokens</li></ul> |
* Samples:
  | query | answer |
  |:------|:-------|
  | <code>who is required to report according to the hmda</code> | <code>Home Mortgage Disclosure Act US financial institutions must report HMDA data to their regulator if they meet certain criteria, such as having assets above a specific threshold. The criteria is different for depository and non-depository institutions and are available on the FFIEC website.[4] In 2012, there were 7,400 institutions that reported a total of 18.7 million HMDA records.[5]</code> |
  | <code>what is the definition of endoplasmic reticulum in biology</code> | <code>Endoplasmic reticulum The endoplasmic reticulum (ER) is a type of organelle in eukaryotic cells that forms an interconnected network of flattened, membrane-enclosed sacs or tube-like structures known as cisternae. The membranes of the ER are continuous with the outer nuclear membrane. The endoplasmic reticulum occurs in most types of eukaryotic cells, but is absent from red blood cells and spermatozoa. There are two types of endoplasmic reticulum: rough and smooth. The outer (cytosolic) face of the rough endoplasmic reticulum is studded with ribosomes that are the sites of protein synthesis. The rough endoplasmic reticulum is especially prominent in cells such as hepatocytes. The smooth endoplasmic reticulum lacks ribosomes and functions in lipid manufacture and metabolism, the production of steroid hormones, and detoxification.[1] The smooth ER is especially abundant in mammalian liver and gonad cells. The lacy membranes of the endoplasmic reticulum were first seen in 1945 using elect...</code> |
  | <code>what does the ski mean in polish names</code> | <code>Polish name Since the High Middle Ages, Polish-sounding surnames ending with the masculine -ski suffix, including -cki and -dzki, and the corresponding feminine suffix -ska/-cka/-dzka were associated with the nobility (Polish szlachta), which alone, in the early years, had such suffix distinctions.[1] They are widely popular today.</code> |
* Loss: [<code>DebiasedMultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#debiasedmultiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 1.0,
      "similarity_fct": "cos_sim"
  }
  ```

### Evaluation Dataset

#### natural-questions

* Dataset: [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 100,231 evaluation samples
* Columns: <code>query</code> and <code>answer</code>
* Approximate statistics based on the first 1000 samples:
  |         | query                                                                              | answer                                                                              |
  |:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                              |
  | details | <ul><li>min: 10 tokens</li><li>mean: 12.46 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 138.0 tokens</li><li>max: 649 tokens</li></ul> |
* Samples:
  | query | answer |
  |:------|:-------|
  | <code>difference between russian blue and british blue cat</code> | <code>Russian Blue The coat is known as a "double coat", with the undercoat being soft, downy and equal in length to the guard hairs, which are an even blue with silver tips. However, the tail may have a few very dull, almost unnoticeable stripes. The coat is described as thick, plush and soft to the touch. The feeling is softer than the softest silk. The silver tips give the coat a shimmering appearance. Its eyes are almost always a dark and vivid green. Any white patches of fur or yellow eyes in adulthood are seen as flaws in show cats.[3] Russian Blues should not be confused with British Blues (which are not a distinct breed, but rather a British Shorthair with a blue coat as the British Shorthair breed itself comes in a wide variety of colors and patterns), nor the Chartreux or Korat which are two other naturally occurring breeds of blue cats, although they have similar traits.</code> |
  | <code>who played the little girl on mrs doubtfire</code> | <code>Mara Wilson Mara Elizabeth Wilson[2] (born July 24, 1987) is an American writer and former child actress. She is known for playing Natalie Hillard in Mrs. Doubtfire (1993), Susan Walker in Miracle on 34th Street (1994), Matilda Wormwood in Matilda (1996) and Lily Stone in Thomas and the Magic Railroad (2000). Since retiring from film acting, Wilson has focused on writing.</code> |
  | <code>what year did the movie the sound of music come out</code> | <code>The Sound of Music (film) The film was released on March 2, 1965 in the United States, initially as a limited roadshow theatrical release. Although critical response to the film was widely mixed, the film was a major commercial success, becoming the number one box office movie after four weeks, and the highest-grossing film of 1965. By November 1966, The Sound of Music had become the highest-grossing film of all-time—surpassing Gone with the Wind—and held that distinction for five years. The film was just as popular throughout the world, breaking previous box-office records in twenty-nine countries. Following an initial theatrical release that lasted four and a half years, and two successful re-releases, the film sold 283 million admissions worldwide and earned a total worldwide gross of $286,000,000.</code> |
* Loss: [<code>DebiasedMultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#debiasedmultiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 1.0,
      "similarity_fct": "cos_sim"
  }
  ```
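
To make the loss configuration above concrete, here is a rough sketch of how the data and loss might be set up. The import path for `DebiasedMultipleNegativesRankingLoss` is assumed from the documentation link above, and the evaluation split size is a placeholder; the actual training script is not part of this commit.

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import DebiasedMultipleNegativesRankingLoss  # import path assumed from the docs link above
from sentence_transformers.util import cos_sim

# Base model and training data as referenced in this section
model = SentenceTransformer("answerdotai/ModernBERT-base")
dataset = load_dataset("sentence-transformers/natural-questions", split="train")
dataset = dataset.train_test_split(test_size=1_000, seed=12)  # placeholder evaluation split size
train_dataset, eval_dataset = dataset["train"], dataset["test"]

# Loss with the parameters listed above: scale=1.0 and cosine similarity
loss = DebiasedMultipleNegativesRankingLoss(model, scale=1.0, similarity_fct=cos_sim)
```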

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `learning_rate`: 8e-05
- `num_train_epochs`: 1
- `warmup_ratio`: 0.05
- `seed`: 12
- `bf16`: True
- `batch_sampler`: no_duplicates

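Continuing the sketch above, the non-default hyperparameters map onto the standard Sentence Transformers v3 training API roughly as follows; the output directory is a placeholder and the exact arguments used for this run are not recorded in the commit.

```python
from sentence_transformers import SentenceTransformerTrainer, SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="ModernBERT-base-nq-debiased-mnrl",  # placeholder output directory
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=8e-5,
    num_train_epochs=1,
    warmup_ratio=0.05,
    seed=12,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,                  # from the previous sketch
    args=args,
    train_dataset=train_dataset,  # from the previous sketch
    eval_dataset=eval_dataset,    # from the previous sketch
    loss=loss,                    # from the previous sketch
)
trainer.train()
```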
#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 8e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 12
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
682
+ ### Training Logs
683
+ | Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoHotpotQA_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
684
+ |:------:|:----:|:-------------:|:---------------:|:--------------------------:|:---------------------------:|:----------------------------:|
685
+ | 0 | 0 | - | - | 0.0785 | 0.1489 | 0.1137 |
686
+ | 0.0129 | 10 | 4.8033 | - | - | - | - |
687
+ | 0.0258 | 20 | 4.5295 | - | - | - | - |
688
+ | 0.0387 | 30 | 4.2124 | - | - | - | - |
689
+ | 0.0515 | 40 | 4.0863 | - | - | - | - |
690
+ | 0.0644 | 50 | 4.0048 | 3.9563 | 0.1444 | 0.1660 | 0.1552 |
691
+ | 0.0773 | 60 | 3.9686 | - | - | - | - |
692
+ | 0.0902 | 70 | 3.9192 | - | - | - | - |
693
+ | 0.1031 | 80 | 3.9276 | - | - | - | - |
694
+ | 0.1160 | 90 | 3.9104 | - | - | - | - |
695
+ | 0.1289 | 100 | 3.8971 | 3.8877 | 0.2041 | 0.1293 | 0.1667 |
696
+ | 0.1418 | 110 | 3.8987 | - | - | - | - |
697
+ | 0.1546 | 120 | 3.8861 | - | - | - | - |
698
+ | 0.1675 | 130 | 3.8987 | - | - | - | - |
699
+ | 0.1804 | 140 | 3.8811 | - | - | - | - |
700
+ | 0.1933 | 150 | 3.8697 | 3.8478 | 0.1918 | 0.1084 | 0.1501 |
701
+ | 0.2062 | 160 | 3.8621 | - | - | - | - |
702
+ | 0.2191 | 170 | 3.8628 | - | - | - | - |
703
+ | 0.2320 | 180 | 3.8733 | - | - | - | - |
704
+ | 0.2448 | 190 | 3.8551 | - | - | - | - |
705
+ | 0.2577 | 200 | 3.862 | 3.8324 | 0.1940 | 0.0977 | 0.1458 |
706
+ | 0.2706 | 210 | 3.8545 | - | - | - | - |
707
+ | 0.2835 | 220 | 3.8495 | - | - | - | - |
708
+ | 0.2964 | 230 | 3.8459 | - | - | - | - |
709
+ | 0.3093 | 240 | 3.8438 | - | - | - | - |
710
+ | 0.3222 | 250 | 3.8425 | 3.8238 | 0.1933 | 0.1498 | 0.1716 |
711
+ | 0.3351 | 260 | 3.843 | - | - | - | - |
712
+ | 0.3479 | 270 | 3.8486 | - | - | - | - |
713
+ | 0.3608 | 280 | 3.8409 | - | - | - | - |
714
+ | 0.3737 | 290 | 3.8345 | - | - | - | - |
715
+ | 0.3866 | 300 | 3.8446 | 3.8154 | 0.1937 | 0.1532 | 0.1735 |
716
+ | 0.3995 | 310 | 3.8281 | - | - | - | - |
717
+ | 0.4124 | 320 | 3.8316 | - | - | - | - |
718
+ | 0.4253 | 330 | 3.8325 | - | - | - | - |
719
+ | 0.4381 | 340 | 3.8298 | - | - | - | - |
720
+ | 0.4510 | 350 | 3.8379 | 3.8104 | 0.1690 | 0.1559 | 0.1624 |
721
+ | 0.4639 | 360 | 3.821 | - | - | - | - |
722
+ | 0.4768 | 370 | 3.8297 | - | - | - | - |
723
+ | 0.4897 | 380 | 3.8206 | - | - | - | - |
724
+ | 0.5026 | 390 | 3.8222 | - | - | - | - |
725
+ | 0.5155 | 400 | 3.8243 | 3.8031 | 0.2141 | 0.1544 | 0.1843 |
726
+ | 0.5284 | 410 | 3.8328 | - | - | - | - |
727
+ | 0.5412 | 420 | 3.8211 | - | - | - | - |
728
+ | 0.5541 | 430 | 3.82 | - | - | - | - |
729
+ | 0.5670 | 440 | 3.8167 | - | - | - | - |
730
+ | 0.5799 | 450 | 3.8062 | 3.7988 | 0.2281 | 0.1392 | 0.1837 |
731
+ | 0.5928 | 460 | 3.8166 | - | - | - | - |
732
+ | 0.6057 | 470 | 3.8164 | - | - | - | - |
733
+ | 0.6186 | 480 | 3.8207 | - | - | - | - |
734
+ | 0.6314 | 490 | 3.815 | - | - | - | - |
735
+ | 0.6443 | 500 | 3.813 | 3.7943 | 0.2381 | 0.1260 | 0.1821 |
736
+ | 0.6572 | 510 | 3.8144 | - | - | - | - |
737
+ | 0.6701 | 520 | 3.8172 | - | - | - | - |
738
+ | 0.6830 | 530 | 3.8175 | - | - | - | - |
739
+ | 0.6959 | 540 | 3.8126 | - | - | - | - |
740
+ | 0.7088 | 550 | 3.8077 | 3.7913 | 0.2501 | 0.1395 | 0.1948 |
741
+ | 0.7216 | 560 | 3.8022 | - | - | - | - |
742
+ | 0.7345 | 570 | 3.8131 | - | - | - | - |
743
+ | 0.7474 | 580 | 3.8067 | - | - | - | - |
744
+ | 0.7603 | 590 | 3.8175 | - | - | - | - |
745
+ | 0.7732 | 600 | 3.8084 | 3.7870 | 0.2751 | 0.1480 | 0.2116 |
746
+ | 0.7861 | 610 | 3.8029 | - | - | - | - |
747
+ | 0.7990 | 620 | 3.8125 | - | - | - | - |
748
+ | 0.8119 | 630 | 3.817 | - | - | - | - |
749
+ | 0.8247 | 640 | 3.8038 | - | - | - | - |
750
+ | 0.8376 | 650 | 3.8054 | 3.7877 | 0.2274 | 0.1449 | 0.1861 |
751
+ | 0.8505 | 660 | 3.8041 | - | - | - | - |
752
+ | 0.8634 | 670 | 3.8012 | - | - | - | - |
753
+ | 0.8763 | 680 | 3.8117 | - | - | - | - |
754
+ | 0.8892 | 690 | 3.8098 | - | - | - | - |
755
+ | 0.9021 | 700 | 3.8008 | 3.7848 | 0.2466 | 0.1551 | 0.2008 |
756
+ | 0.9149 | 710 | 3.8038 | - | - | - | - |
757
+ | 0.9278 | 720 | 3.7949 | - | - | - | - |
758
+ | 0.9407 | 730 | 3.8044 | - | - | - | - |
759
+ | 0.9536 | 740 | 3.7982 | - | - | - | - |
760
+ | 0.9665 | 750 | 3.804 | 3.7832 | 0.2585 | 0.1587 | 0.2086 |
761
+ | 0.9794 | 760 | 3.8038 | - | - | - | - |
762
+ | 0.9923 | 770 | 3.8046 | - | - | - | - |
763
+ | 1.0 | 776 | - | - | 0.2508 | 0.1572 | 0.2040 |
764
+
765
+
### Framework Versions
- Python: 3.11.10
- Sentence Transformers: 3.4.0.dev0
- Transformers: 4.48.0.dev0
- PyTorch: 2.6.0.dev20241112+cu121
- Accelerate: 1.2.0
- Datasets: 3.2.0
- Tokenizers: 0.21.0

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### DebiasedMultipleNegativesRankingLoss
```bibtex
@inproceedings{chuang2020debiased,
    title={Debiased Contrastive Learning},
    author={Ching-Yao Chuang and Joshua Robinson and Lin Yen-Chen and Antonio Torralba and Stefanie Jegelka},
    booktitle={Advances in Neural Information Processing Systems},
    year={2020},
    url={https://arxiv.org/pdf/2007.00224}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
config.json ADDED
@@ -0,0 +1,46 @@
{
  "_name_or_path": "answerdotai/ModernBERT-base",
  "architectures": [
    "ModernBertModel"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "bos_token_id": 50281,
  "classifier_activation": "gelu",
  "classifier_bias": false,
  "classifier_dropout": 0.0,
  "classifier_pooling": "mean",
  "cls_token_id": 50281,
  "decoder_bias": true,
  "deterministic_flash_attn": false,
  "embedding_dropout": 0.0,
  "eos_token_id": 50282,
  "global_attn_every_n_layers": 3,
  "global_rope_theta": 160000.0,
  "gradient_checkpointing": false,
  "hidden_activation": "gelu",
  "hidden_size": 768,
  "initializer_cutoff_factor": 2.0,
  "initializer_range": 0.02,
  "intermediate_size": 1152,
  "layer_norm_eps": 1e-05,
  "local_attention": 128,
  "local_rope_theta": 10000.0,
  "max_position_embeddings": 8192,
  "mlp_bias": false,
  "mlp_dropout": 0.0,
  "model_type": "modernbert",
  "norm_bias": false,
  "norm_eps": 1e-05,
  "num_attention_heads": 12,
  "num_hidden_layers": 22,
  "pad_token_id": 50283,
  "position_embedding_type": "absolute",
  "reference_compile": true,
  "sep_token_id": 50282,
  "sparse_pred_ignore_index": -100,
  "sparse_prediction": false,
  "torch_dtype": "float32",
  "transformers_version": "4.48.0.dev0",
  "vocab_size": 50368
}
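
Because the backbone is a plain ModernBERT encoder and the pooling configuration (1_Pooling/config.json) selects mean pooling, the repository can also be used directly through 🤗 Transformers. The model card's "Direct Usage (Transformers)" section is left empty, so the following is only an editor's sketch of the usual mean-pooling recipe, not the author's reference code.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("tomaarsen/ModernBERT-base-nq-debiased-mnrl")
model = AutoModel.from_pretrained("tomaarsen/ModernBERT-base-nq-debiased-mnrl")

sentences = ["what is meaning of am and pm in time"]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    token_embeddings = model(**inputs).last_hidden_state  # (batch, seq_len, 768)

# Mean pooling over non-padding tokens, mirroring pooling_mode_mean_tokens=true
mask = inputs["attention_mask"].unsqueeze(-1).float()
sentence_embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
print(sentence_embeddings.shape)  # torch.Size([1, 768])
```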
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
{
  "__version__": {
    "sentence_transformers": "3.4.0.dev0",
    "transformers": "4.48.0.dev0",
    "pytorch": "2.6.0.dev20241112+cu121"
  },
  "prompts": {},
  "default_prompt_name": null,
  "similarity_fn_name": "cosine"
}
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:9b79f66daa2ff58390d78fc0b6a8603804d39ffae94faa713bd876ec3b7db59f
size 596070136
modules.json ADDED
@@ -0,0 +1,14 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  }
]
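
modules.json declares the two-module pipeline (a Transformer followed by a Pooling module) that `SentenceTransformer` rebuilds on load. As an illustration, an equivalent pipeline could be assembled by hand along these lines, using the values from this commit's configs (including the sentence_bert_config.json below); this is a sketch, not code from the repository.

```python
from sentence_transformers import SentenceTransformer, models

# Module 0: the ModernBERT backbone, with the 8192-token limit from sentence_bert_config.json
transformer = models.Transformer("answerdotai/ModernBERT-base", max_seq_length=8192)

# Module 1: mean pooling over token embeddings, as in 1_Pooling/config.json
pooling = models.Pooling(
    word_embedding_dimension=transformer.get_word_embedding_dimension(),
    pooling_mode="mean",
)

model = SentenceTransformer(modules=[transformer, pooling])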
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 8192,
  "do_lower_case": false
}
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
{
  "cls_token": {
    "content": "[CLS]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "mask_token": {
    "content": "[MASK]",
    "lstrip": true,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "[PAD]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "sep_token": {
    "content": "[SEP]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "[UNK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,945 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "|||IP_ADDRESS|||",
5
+ "lstrip": false,
6
+ "normalized": true,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": false
10
+ },
11
+ "1": {
12
+ "content": "<|padding|>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "50254": {
20
+ "content": " ",
21
+ "lstrip": false,
22
+ "normalized": true,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": false
26
+ },
27
+ "50255": {
28
+ "content": " ",
29
+ "lstrip": false,
30
+ "normalized": true,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": false
34
+ },
35
+ "50256": {
36
+ "content": " ",
37
+ "lstrip": false,
38
+ "normalized": true,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": false
42
+ },
43
+ "50257": {
44
+ "content": " ",
45
+ "lstrip": false,
46
+ "normalized": true,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": false
50
+ },
51
+ "50258": {
52
+ "content": " ",
53
+ "lstrip": false,
54
+ "normalized": true,
55
+ "rstrip": false,
56
+ "single_word": false,
57
+ "special": false
58
+ },
59
+ "50259": {
60
+ "content": " ",
61
+ "lstrip": false,
62
+ "normalized": true,
63
+ "rstrip": false,
64
+ "single_word": false,
65
+ "special": false
66
+ },
67
+ "50260": {
68
+ "content": " ",
69
+ "lstrip": false,
70
+ "normalized": true,
71
+ "rstrip": false,
72
+ "single_word": false,
73
+ "special": false
74
+ },
75
+ "50261": {
76
+ "content": " ",
77
+ "lstrip": false,
78
+ "normalized": true,
79
+ "rstrip": false,
80
+ "single_word": false,
81
+ "special": false
82
+ },
83
+ "50262": {
84
+ "content": " ",
85
+ "lstrip": false,
86
+ "normalized": true,
87
+ "rstrip": false,
88
+ "single_word": false,
89
+ "special": false
90
+ },
91
+ "50263": {
92
+ "content": " ",
93
+ "lstrip": false,
94
+ "normalized": true,
95
+ "rstrip": false,
96
+ "single_word": false,
97
+ "special": false
98
+ },
99
+ "50264": {
100
+ "content": " ",
101
+ "lstrip": false,
102
+ "normalized": true,
103
+ "rstrip": false,
104
+ "single_word": false,
105
+ "special": false
106
+ },
107
+ "50265": {
108
+ "content": " ",
109
+ "lstrip": false,
110
+ "normalized": true,
111
+ "rstrip": false,
112
+ "single_word": false,
113
+ "special": false
114
+ },
115
+ "50266": {
116
+ "content": " ",
117
+ "lstrip": false,
118
+ "normalized": true,
119
+ "rstrip": false,
120
+ "single_word": false,
121
+ "special": false
122
+ },
123
+ "50267": {
124
+ "content": " ",
125
+ "lstrip": false,
126
+ "normalized": true,
127
+ "rstrip": false,
128
+ "single_word": false,
129
+ "special": false
130
+ },
131
+ "50268": {
132
+ "content": " ",
133
+ "lstrip": false,
134
+ "normalized": true,
135
+ "rstrip": false,
136
+ "single_word": false,
137
+ "special": false
138
+ },
139
+ "50269": {
140
+ "content": " ",
141
+ "lstrip": false,
142
+ "normalized": true,
143
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": false
146
+ },
147
+ "50270": {
148
+ "content": " ",
149
+ "lstrip": false,
150
+ "normalized": true,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": false
154
+ },
155
+ "50271": {
156
+ "content": " ",
157
+ "lstrip": false,
158
+ "normalized": true,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": false
162
+ },
163
+ "50272": {
164
+ "content": " ",
165
+ "lstrip": false,
166
+ "normalized": true,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": false
170
+ },
171
+ "50273": {
172
+ "content": " ",
173
+ "lstrip": false,
174
+ "normalized": true,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": false
178
+ },
179
+ "50274": {
180
+ "content": " ",
181
+ "lstrip": false,
182
+ "normalized": true,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": false
186
+ },
187
+ "50275": {
188
+ "content": " ",
189
+ "lstrip": false,
190
+ "normalized": true,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": false
194
+ },
195
+ "50276": {
196
+ "content": " ",
197
+ "lstrip": false,
198
+ "normalized": true,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": false
202
+ },
203
+ "50277": {
204
+ "content": "|||EMAIL_ADDRESS|||",
205
+ "lstrip": false,
206
+ "normalized": true,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": false
210
+ },
211
+ "50278": {
212
+ "content": "|||PHONE_NUMBER|||",
213
+ "lstrip": false,
214
+ "normalized": true,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": false
218
+ },
219
+ "50279": {
220
+ "content": "<|endoftext|>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "50280": {
228
+ "content": "[UNK]",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "50281": {
236
+ "content": "[CLS]",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "50282": {
244
+ "content": "[SEP]",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "50283": {
252
+ "content": "[PAD]",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "50284": {
260
+ "content": "[MASK]",
261
+ "lstrip": true,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "50285": {
268
+ "content": "[unused0]",
269
+ "lstrip": false,
270
+ "normalized": true,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": false
274
+ },
275
+ "50286": {
276
+ "content": "[unused1]",
277
+ "lstrip": false,
278
+ "normalized": true,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": false
282
+ },
283
+ "50287": {
284
+ "content": "[unused2]",
285
+ "lstrip": false,
286
+ "normalized": true,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": false
290
+ },
291
+ "50288": {
292
+ "content": "[unused3]",
293
+ "lstrip": false,
294
+ "normalized": true,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": false
298
+ },
299
+ "50289": {
300
+ "content": "[unused4]",
301
+ "lstrip": false,
302
+ "normalized": true,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": false
306
+ },
307
+ "50290": {
308
+ "content": "[unused5]",
309
+ "lstrip": false,
310
+ "normalized": true,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": false
314
+ },
315
+ "50291": {
316
+ "content": "[unused6]",
317
+ "lstrip": false,
318
+ "normalized": true,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": false
322
+ },
323
+ "50292": {
324
+ "content": "[unused7]",
325
+ "lstrip": false,
326
+ "normalized": true,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": false
330
+ },
331
+ "50293": {
332
+ "content": "[unused8]",
333
+ "lstrip": false,
334
+ "normalized": true,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": false
338
+ },
339
+ "50294": {
340
+ "content": "[unused9]",
341
+ "lstrip": false,
342
+ "normalized": true,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": false
346
+ },
347
+ "50295": {
348
+ "content": "[unused10]",
349
+ "lstrip": false,
350
+ "normalized": true,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": false
354
+ },
355
+ "50296": {
356
+ "content": "[unused11]",
357
+ "lstrip": false,
358
+ "normalized": true,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": false
362
+ },
363
+ "50297": {
364
+ "content": "[unused12]",
365
+ "lstrip": false,
366
+ "normalized": true,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": false
370
+ },
371
+ "50298": {
372
+ "content": "[unused13]",
373
+ "lstrip": false,
374
+ "normalized": true,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": false
378
+ },
379
+ "50299": {
380
+ "content": "[unused14]",
381
+ "lstrip": false,
382
+ "normalized": true,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": false
386
+ },
387
+ "50300": {
388
+ "content": "[unused15]",
389
+ "lstrip": false,
390
+ "normalized": true,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": false
394
+ },
395
+ "50301": {
396
+ "content": "[unused16]",
397
+ "lstrip": false,
398
+ "normalized": true,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": false
402
+ },
403
+ "50302": {
404
+ "content": "[unused17]",
405
+ "lstrip": false,
406
+ "normalized": true,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": false
410
+ },
411
+ "50303": {
412
+ "content": "[unused18]",
413
+ "lstrip": false,
414
+ "normalized": true,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": false
418
+ },
419
+ "50304": {
420
+ "content": "[unused19]",
421
+ "lstrip": false,
422
+ "normalized": true,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": false
426
+ },
427
+ "50305": {
428
+ "content": "[unused20]",
429
+ "lstrip": false,
430
+ "normalized": true,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": false
434
+ },
435
+ "50306": {
436
+ "content": "[unused21]",
437
+ "lstrip": false,
438
+ "normalized": true,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": false
442
+ },
443
+ "50307": {
444
+ "content": "[unused22]",
445
+ "lstrip": false,
446
+ "normalized": true,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": false
450
+ },
451
+ "50308": {
452
+ "content": "[unused23]",
453
+ "lstrip": false,
454
+ "normalized": true,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": false
458
+ },
459
+ "50309": {
460
+ "content": "[unused24]",
461
+ "lstrip": false,
462
+ "normalized": true,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": false
466
+ },
467
+ "50310": {
468
+ "content": "[unused25]",
469
+ "lstrip": false,
470
+ "normalized": true,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": false
474
+ },
475
+ "50311": {
476
+ "content": "[unused26]",
477
+ "lstrip": false,
478
+ "normalized": true,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": false
482
+ },
483
+ "50312": {
484
+ "content": "[unused27]",
485
+ "lstrip": false,
486
+ "normalized": true,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": false
490
+ },
491
+ "50313": {
492
+ "content": "[unused28]",
493
+ "lstrip": false,
494
+ "normalized": true,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": false
498
+ },
499
+ "50314": {
500
+ "content": "[unused29]",
501
+ "lstrip": false,
502
+ "normalized": true,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": false
506
+ },
507
+ "50315": {
508
+ "content": "[unused30]",
509
+ "lstrip": false,
510
+ "normalized": true,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": false
514
+ },
515
+ "50316": {
516
+ "content": "[unused31]",
517
+ "lstrip": false,
518
+ "normalized": true,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": false
522
+ },
523
+ "50317": {
524
+ "content": "[unused32]",
525
+ "lstrip": false,
526
+ "normalized": true,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": false
530
+ },
531
+ "50318": {
532
+ "content": "[unused33]",
533
+ "lstrip": false,
534
+ "normalized": true,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": false
538
+ },
539
+ "50319": {
540
+ "content": "[unused34]",
541
+ "lstrip": false,
542
+ "normalized": true,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": false
546
+ },
547
+ "50320": {
548
+ "content": "[unused35]",
549
+ "lstrip": false,
550
+ "normalized": true,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": false
554
+ },
555
+ "50321": {
556
+ "content": "[unused36]",
557
+ "lstrip": false,
558
+ "normalized": true,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": false
562
+ },
563
+ "50322": {
564
+ "content": "[unused37]",
565
+ "lstrip": false,
566
+ "normalized": true,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": false
570
+ },
571
+ "50323": {
572
+ "content": "[unused38]",
573
+ "lstrip": false,
574
+ "normalized": true,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": false
578
+ },
579
+ "50324": {
580
+ "content": "[unused39]",
581
+ "lstrip": false,
582
+ "normalized": true,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": false
586
+ },
587
+ "50325": {
588
+ "content": "[unused40]",
589
+ "lstrip": false,
590
+ "normalized": true,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": false
594
+ },
595
+ "50326": {
596
+ "content": "[unused41]",
597
+ "lstrip": false,
598
+ "normalized": true,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": false
602
+ },
603
+ "50327": {
604
+ "content": "[unused42]",
605
+ "lstrip": false,
606
+ "normalized": true,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": false
610
+ },
611
+ "50328": {
612
+ "content": "[unused43]",
613
+ "lstrip": false,
614
+ "normalized": true,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": false
618
+ },
619
+ "50329": {
620
+ "content": "[unused44]",
621
+ "lstrip": false,
622
+ "normalized": true,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": false
626
+ },
627
+ "50330": {
628
+ "content": "[unused45]",
629
+ "lstrip": false,
630
+ "normalized": true,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": false
634
+ },
635
+ "50331": {
636
+ "content": "[unused46]",
637
+ "lstrip": false,
638
+ "normalized": true,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": false
642
+ },
643
+ "50332": {
644
+ "content": "[unused47]",
645
+ "lstrip": false,
646
+ "normalized": true,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": false
650
+ },
651
+ "50333": {
652
+ "content": "[unused48]",
653
+ "lstrip": false,
654
+ "normalized": true,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": false
658
+ },
659
+ "50334": {
660
+ "content": "[unused49]",
661
+ "lstrip": false,
662
+ "normalized": true,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": false
666
+ },
667
+ "50335": {
668
+ "content": "[unused50]",
669
+ "lstrip": false,
670
+ "normalized": true,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": false
674
+ },
675
+ "50336": {
676
+ "content": "[unused51]",
677
+ "lstrip": false,
678
+ "normalized": true,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": false
682
+ },
683
+ "50337": {
684
+ "content": "[unused52]",
685
+ "lstrip": false,
686
+ "normalized": true,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": false
690
+ },
691
+ "50338": {
692
+ "content": "[unused53]",
693
+ "lstrip": false,
694
+ "normalized": true,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": false
698
+ },
699
+ "50339": {
700
+ "content": "[unused54]",
701
+ "lstrip": false,
702
+ "normalized": true,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": false
706
+ },
707
+ "50340": {
708
+ "content": "[unused55]",
709
+ "lstrip": false,
710
+ "normalized": true,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": false
714
+ },
715
+ "50341": {
716
+ "content": "[unused56]",
717
+ "lstrip": false,
718
+ "normalized": true,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": false
722
+ },
723
+ "50342": {
724
+ "content": "[unused57]",
725
+ "lstrip": false,
726
+ "normalized": true,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": false
730
+ },
731
+ "50343": {
732
+ "content": "[unused58]",
733
+ "lstrip": false,
734
+ "normalized": true,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": false
738
+ },
739
+ "50344": {
740
+ "content": "[unused59]",
741
+ "lstrip": false,
742
+ "normalized": true,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": false
746
+ },
747
+ "50345": {
748
+ "content": "[unused60]",
749
+ "lstrip": false,
750
+ "normalized": true,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": false
754
+ },
755
+ "50346": {
756
+ "content": "[unused61]",
757
+ "lstrip": false,
758
+ "normalized": true,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": false
762
+ },
763
+ "50347": {
764
+ "content": "[unused62]",
765
+ "lstrip": false,
766
+ "normalized": true,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": false
770
+ },
771
+ "50348": {
772
+ "content": "[unused63]",
773
+ "lstrip": false,
774
+ "normalized": true,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": false
778
+ },
779
+ "50349": {
780
+ "content": "[unused64]",
781
+ "lstrip": false,
782
+ "normalized": true,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": false
786
+ },
787
+ "50350": {
788
+ "content": "[unused65]",
789
+ "lstrip": false,
790
+ "normalized": true,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": false
794
+ },
795
+ "50351": {
796
+ "content": "[unused66]",
797
+ "lstrip": false,
798
+ "normalized": true,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": false
802
+ },
803
+ "50352": {
804
+ "content": "[unused67]",
805
+ "lstrip": false,
806
+ "normalized": true,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": false
810
+ },
811
+ "50353": {
812
+ "content": "[unused68]",
813
+ "lstrip": false,
814
+ "normalized": true,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": false
818
+ },
819
+ "50354": {
820
+ "content": "[unused69]",
821
+ "lstrip": false,
822
+ "normalized": true,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": false
826
+ },
827
+ "50355": {
828
+ "content": "[unused70]",
829
+ "lstrip": false,
830
+ "normalized": true,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": false
834
+ },
835
+ "50356": {
836
+ "content": "[unused71]",
837
+ "lstrip": false,
838
+ "normalized": true,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": false
842
+ },
843
+ "50357": {
844
+ "content": "[unused72]",
845
+ "lstrip": false,
846
+ "normalized": true,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": false
850
+ },
851
+ "50358": {
852
+ "content": "[unused73]",
853
+ "lstrip": false,
854
+ "normalized": true,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": false
858
+ },
859
+ "50359": {
860
+ "content": "[unused74]",
861
+ "lstrip": false,
862
+ "normalized": true,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": false
866
+ },
867
+ "50360": {
868
+ "content": "[unused75]",
869
+ "lstrip": false,
870
+ "normalized": true,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": false
874
+ },
875
+ "50361": {
876
+ "content": "[unused76]",
877
+ "lstrip": false,
878
+ "normalized": true,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": false
882
+ },
883
+ "50362": {
884
+ "content": "[unused77]",
885
+ "lstrip": false,
886
+ "normalized": true,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": false
890
+ },
891
+ "50363": {
892
+ "content": "[unused78]",
893
+ "lstrip": false,
894
+ "normalized": true,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": false
898
+ },
899
+ "50364": {
900
+ "content": "[unused79]",
901
+ "lstrip": false,
902
+ "normalized": true,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": false
906
+ },
907
+ "50365": {
908
+ "content": "[unused80]",
909
+ "lstrip": false,
910
+ "normalized": true,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": false
914
+ },
915
+ "50366": {
916
+ "content": "[unused81]",
917
+ "lstrip": false,
918
+ "normalized": true,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": false
922
+ },
923
+ "50367": {
924
+ "content": "[unused82]",
925
+ "lstrip": false,
926
+ "normalized": true,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": false
930
+ }
931
+ },
932
+ "clean_up_tokenization_spaces": true,
933
+ "cls_token": "[CLS]",
934
+ "extra_special_tokens": {},
935
+ "mask_token": "[MASK]",
936
+ "model_input_names": [
937
+ "input_ids",
938
+ "attention_mask"
939
+ ],
940
+ "model_max_length": 1000000000000000019884624838656,
941
+ "pad_token": "[PAD]",
942
+ "sep_token": "[SEP]",
943
+ "tokenizer_class": "PreTrainedTokenizerFast",
944
+ "unk_token": "[UNK]"
945
+ }