---
language:
- yue
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:129371
- loss:CachedGISTEmbedLoss
base_model: hon9kon9ize/bert-large-cantonese-sts
widget:
- source_sentence: 'query: is ampulla of vater part of the pancreas'
  sentences:
  - 'document: Ampulla of Vater The ampulla of Vater, also known as the hepatopancreatic ampulla or the hepatopancreatic duct, is formed by the union of the pancreatic duct and the common bile duct. The ampulla is specifically located at the major duodenal papilla.'
  - 'document: 抗凝加化疗;化疗'
  - 'document: Daylight saving time in Australia Daylight saving was first used in Australia during World War I, and was applied in all states. It was used again during the Second World War. A drought in Tasmania in 1967 led to the reintroduction of daylight saving in that state during the summer, and this was repeated every summer since then. In 1971, New South Wales, Victoria,[16] Queensland, South Australia, and the Australian Capital Territory followed Tasmania by observing daylight saving. Western Australia and the Northern Territory did not. Queensland abandoned daylight saving time in 1972.[17]'
- source_sentence: 'query: henry''s law states that the solubility of a gas in a liquid'
  sentences:
  - 'document: Henry''s law In chemistry, Henry''s law is a gas law that states that the amount of dissolved gas is proportional to its partial pressure in the gas phase. The proportionality factor is called the Henry''s law constant. It was formulated by the English chemist William Henry, who studied the topic in the early 19th century. In his publication about the quantity of gases absorbed by water,[1] he described the results of his experiments:'
  - 'document: Saint Stephen''s Day Saint Stephen''s Day, or the Feast of Saint Stephen, is a Christian saint''s day to commemorate Saint Stephen, the first Christian martyr or protomartyr, celebrated on 26 December in the Latin Church and 27 December in Eastern Christianity. The Eastern Orthodox Church adheres to the Julian calendar and mark Saint Stephen''s Day on 27 December according to that calendar, which places it on 9 January of the Gregorian calendar used in secular contexts. In Latin Christian denominations, Saint Stephen''s Day marks the second day of Christmastide.[1][2]'
  - 'document: American Revolutionary War The American Revolutionary War (1775–1783), also known as the American War of Independence,[40] was a global war that began as a conflict between Great Britain and its Thirteen Colonies which declared independence as the United States of America.[N 1]'
- source_sentence: 'query: what is the plot of american horror story hotel'
  sentences:
  - 'document: American Horror Story: Hotel The plot centers around the enigmatic Hotel Cortez in Los Angeles, California, that catches the eye of an intrepid homicide detective (Bentley). The Cortez is host to the strange and bizarre, spearheaded by its owner, The Countess (Gaga), who is a bloodsucking fashionista. The hotel is loosely based on an actual hotel built in 1893 by H. H. Holmes in Chicago, Il. for the 1893 World''s Columbian Exposition. It became known as the ''Murder Castle'' as it was built for Holmes to torture, murder, and dispose of evidence just as is the Cortez. This season features two murderous threats in the form of the Ten Commandments Killer, a serial offender who selects his victims in accordance with biblical teachings, and "the Addiction Demon", who roams the hotel armed with a drill bit dildo.'
  - 'document: Book of Job Rabbinic tradition ascribes the authorship of Job to Moses, but scholars generally agree that it was written between the 7th and 4th centuries BCE, with the 6th century BCE as the most likely period for various reasons.[17] The anonymous author was almost certainly an Israelite, although he has set his story outside Israel, in southern Edom or northern Arabia, and makes allusion to places as far apart as Mesopotamia and Egypt.[18] According to the 6th-century BCE prophet Ezekiel, Job was a man of antiquity renowned for his righteousness,[19] and the book''s author has chosen this legendary hero for his parable.[20]'
  - 'document: Galešnjak Galešnjak (also called Island of Love, Lover''s Island, Otok za zaljubljene) is located in the Pašman channel of the Adriatic, between the islands of Pašman and the town of Turanj on mainland Croatia. It is one of the world''s few naturally occurring heart-shaped objects such as the Heart Reef in the Whitsundays.'
- source_sentence: 'query: what historical event inspired wollstonecraft''s book a vindication of the rights of woman'
  sentences:
  - 'document: 銀河嘅獨特外形自古以嚟就引起人類嘅幻想。例如中國就有牛郎織女嘅故事,相傳身為人類嘅牛郎同身為仙女嘅織女相遇並且墮入愛河,但因為人仙相戀犯天規而俾天界阻止,王母娘娘變條銀河出嚟分隔佢哋,限佢哋淨係喺每年嘅農曆七月初七先可以喺條鵲橋上面相會-呢個傳說就係傳統節日七姐誕嘅起源。'
  - 'document: Rock Star (2001 film) The singing voice for Wahlberg''s character was provided by Steelheart frontman Miljenko Matijevic for the Steel Dragon Songs, the final number was dubbed by Brian Vander Ark. Jeff Scott Soto (of Talisman, Yngwie Malmsteen, Soul SirkUS, and Journey) provided the voice of the singer Wahlberg''s character replaces. Kennedy is the only actor whose actual voice is used.[citation needed]. Ralph Saenz (Steel Panther) also appears briefly, as the singer auditioning ahead of Chris at the studio.'
  - 'document: A Vindication of the Rights of Woman Wollstonecraft was prompted to write the Rights of Woman after reading Charles Maurice de Talleyrand-Périgord''s 1791 report to the French National Assembly, which stated that women should only receive a domestic education; she used her commentary on this specific event to launch a broad attack against sexual double standards and to indict men for encouraging women to indulge in excessive emotion. Wollstonecraft wrote the Rights of Woman hurriedly to respond directly to ongoing events; she intended to write a more thoughtful second volume but died before completing it.'
- source_sentence: 'query: when did england change from fahrenheit to celsius'
  sentences:
  - 'document: Periodic table Importantly, the organization of the periodic table can be utilized to derive relationships between various element properties, but also predicted chemical properties and behaviours of undiscovered or newly synthesized elements. Russian chemist Dmitri Mendeleev was first to publish a recognizable periodic table in 1869, developed mainly to illustrate periodic trends of the then-known elements. He also predicted some properties of unidentified elements that were expected to fill gaps within this table. Most of his forecasts proved to be correct. Mendeleev''s idea has been slowly expanded and refined with the discovery or synthesis of further new elements and by developing new theoretical models to explain chemical behaviour. The modern periodic table now provides a useful framework for analyzing chemical reactions, and continues to be widely adopted in chemistry, nuclear physics and other sciences.'
  - 'document: How to Train Your Dragon (franchise) The How to Train Your Dragon franchise from DreamWorks Animation consists of two feature films How to Train Your Dragon (2010) and How to Train Your Dragon 2 (2014), with a third feature film, How to Train Your Dragon: The Hidden World, set for a 2019 release. The franchise is inspired by the British book series of the same name by Cressida Cowell. The franchise also consists of four short films: Legend of the Boneknapper Dragon (2010), Book of Dragons (2011), Gift of the Night Fury (2011) and Dawn of the Dragon Racers (2014). A television series following the events of the first film, Dragons: Riders of Berk, began airing on Cartoon Network in September 2012. Its second season was renamed Dragons: Defenders of Berk. Set several years later, and as a more immediate prequel to the second film, a new television series, titled Dragons: Race to the Edge, aired on Netflix in June 2015.[1] The second season of the show was added to Netflix in January 2016 and a third season in June 2016. A fourth season aired on Netflix in February 2017, a fifth season in August 2017, and a sixth and final season on February 16, 2018.'
  - 'document: Metrication in the United Kingdom Adopting the metric system was discussed in Parliament as early as 1818 and some industries and even some government agencies had metricated, or were in the process of metricating by the mid 1960s. A formal government policy to support metrication was agreed by 1965. This policy, initiated in response to requests from industry, was to support voluntary metrication, with costs picked up where they fell. In 1969 the government created the Metrication Board as a quango to promote and coordinate metrication. In 1978, after some carpet retailers reverted to pricing by the square yard rather than the square metre, government policy shifted, and they started issuing orders making metrication mandatory in certain sectors. In 1980 government policy shifted again to prefer voluntary metrication, and the Metrication Board was abolished. By the time the Metrication Board was wound up, all the economic sectors that fell within its remit except road signage and parts of the retail trade sector had metricated.'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: BERT large fine-tuned on a mixed Cantonese and English STS dataset
  results:
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoClimateFEVER, type: NanoClimateFEVER}
    metrics:
    - {type: cosine_accuracy@1, value: 0.06, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.2, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.22, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.26, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.06, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.06666666666666667, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.05200000000000001, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.032, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.035, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.105, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.12666666666666665, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.14400000000000002, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.10738523976006756, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.12305555555555553, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.08386746046821102, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoDBPedia, type: NanoDBPedia}
    metrics:
    - {type: cosine_accuracy@1, value: 0.1, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.26, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.44, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.52, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.1, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.12666666666666665, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.15200000000000002, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.154, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.005776685612719247, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.025711996601987995, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.04879480020144454, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.08175565470928514, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.1564753058784049, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.22302380952380954, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.08481993410477483, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoFEVER, type: NanoFEVER}
    metrics:
    - {type: cosine_accuracy@1, value: 0.06, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.1, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.1, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.12, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.06, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.03333333333333333, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.02, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.012000000000000002, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.05, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.09, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.09, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.11, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.07804424038166692, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.07533333333333334, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.07658274436198606, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoFiQA2018, type: NanoFiQA2018}
    metrics:
    - {type: cosine_accuracy@1, value: 0.12, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.22, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.26, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.36, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.12, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.07999999999999999, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.064, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.046000000000000006, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.07085714285714287, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.13621428571428573, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.14993650793650792, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.21193650793650792, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.15989208858068493, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.18794444444444444, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.1278932041519149, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoHotpotQA, type: NanoHotpotQA}
    metrics:
    - {type: cosine_accuracy@1, value: 0.18, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.38, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.4, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.44, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.18, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.13333333333333333, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.084, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.05200000000000001, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.09, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.2, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.21, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.26, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.21524243911000313, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.2793333333333333, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.16949818775802034, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoMSMARCO, type: NanoMSMARCO}
    metrics:
    - {type: cosine_accuracy@1, value: 0.08, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.16, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.2, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.24, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.08, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.05333333333333333, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.04, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.024000000000000004, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.08, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.16, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.2, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.24, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.155021218726892, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.12816666666666665, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.14387227309213746, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoNFCorpus, type: NanoNFCorpus}
    metrics:
    - {type: cosine_accuracy@1, value: 0.1, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.1, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.12, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.18, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.1, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.06, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.05600000000000001, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.042, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.0023944899556066555, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.004511202133435534, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.005335271278326478, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.006887081773042016, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.0513758550014842, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.11271428571428571, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.011178329865269043, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoNQ, type: NanoNQ}
    metrics:
    - {type: cosine_accuracy@1, value: 0.12, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.26, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.38, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.44, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.12, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.08666666666666666, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.08, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.04800000000000001, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.11, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.24, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.37, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.43, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.26691470842049086, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.21954761904761902, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.22127704921258506, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoQuoraRetrieval, type: NanoQuoraRetrieval}
    metrics:
    - {type: cosine_accuracy@1, value: 0.56, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.66, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.68, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.8, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.56, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.25333333333333335, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.16, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.092, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.49, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.6073333333333334, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.634, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.7406666666666666, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.6315714749064664, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.6265555555555555, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.6007758177607536, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoSCIDOCS, type: NanoSCIDOCS}
    metrics:
    - {type: cosine_accuracy@1, value: 0.06, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.12, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.14, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.22, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.06, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.05333333333333333, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.036000000000000004, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.026000000000000002, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.015666666666666666, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.03666666666666667, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.04066666666666666, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.05666666666666667, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.05444580189319236, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.10085714285714287, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.03825732082321992, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoArguAna, type: NanoArguAna}
    metrics:
    - {type: cosine_accuracy@1, value: 0.12, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.34, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.52, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.64, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.12, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.11333333333333333, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.10400000000000001, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.064, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.12, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.34, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.52, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.64, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.36676045848370026, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.2815, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.28967419376346565, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoSciFact, type: NanoSciFact}
    metrics:
    - {type: cosine_accuracy@1, value: 0.18, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.22, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.32, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.36, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.18, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.07999999999999999, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.068, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.04, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.165, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.21, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.3, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.345, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.24854556538285397, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.22416666666666665, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.23077037853195492, name: Cosine Map@100}
  - task: {type: information-retrieval, name: Information Retrieval}
    dataset: {name: NanoTouche2020, type: NanoTouche2020}
    metrics:
    - {type: cosine_accuracy@1, value: 0.3469387755102041, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.7142857142857143, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.7959183673469388, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.9387755102040817, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.3469387755102041, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.32653061224489793, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.30612244897959184, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.2714285714285714, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.01725883684742171, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.06000832753846316, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.10128699807186763, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.17048580946181527, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.29344650277463163, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.5436912860382248, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.18279928418932134, name: Cosine Map@100}
  - task: {type: nano-beir, name: Nano BEIR}
    dataset: {name: NanoBEIR mean, type: NanoBEIR_mean}
    metrics:
    - {type: cosine_accuracy@1, value: 0.1605337519623234, name: Cosine Accuracy@1}
    - {type: cosine_accuracy@3, value: 0.2872527472527473, name: Cosine Accuracy@3}
    - {type: cosine_accuracy@5, value: 0.35199372056514916, name: Cosine Accuracy@5}
    - {type: cosine_accuracy@10, value: 0.42452119309262165, name: Cosine Accuracy@10}
    - {type: cosine_precision@1, value: 0.1605337519623234, name: Cosine Precision@1}
    - {type: cosine_precision@3, value: 0.11281004709576138, name: Cosine Precision@3}
    - {type: cosine_precision@5, value: 0.0940094191522763, name: Cosine Precision@5}
    - {type: cosine_precision@10, value: 0.0694945054945055, name: Cosine Precision@10}
    - {type: cosine_recall@1, value: 0.09630414014919672, name: Cosine Recall@1}
    - {type: cosine_recall@3, value: 0.17041890861447484, name: Cosine Recall@3}
    - {type: cosine_recall@5, value: 0.21512976237088305, name: Cosine Recall@5}
    - {type: cosine_recall@10, value: 0.2644152605549218, name: Cosine Recall@10}
    - {type: cosine_ndcg@10, value: 0.21424006917696453, name: Cosine Ndcg@10}
    - {type: cosine_mrr@10, value: 0.2404530537489721, name: Cosine Mrr@10}
    - {type: cosine_map@100, value: 0.17394355216027804, name: Cosine Map@100}
---

# BERT large fine-tuned on a mixed Cantonese and English STS dataset

This is a [sentence-transformers](https://www.SBERT.net) model fine-tuned from [hon9kon9ize/bert-large-cantonese-sts](https://huggingface.co./hon9kon9ize/bert-large-cantonese-sts). It maps sentences and paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details

### Model Description

- **Model Type:** Sentence Transformer
- **Base model:** [hon9kon9ize/bert-large-cantonese-sts](https://huggingface.co./hon9kon9ize/bert-large-cantonese-sts)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
- **Language:** yue
- **License:** apache-2.0

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co./models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
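For pipelines that cannot depend on sentence-transformers, the architecture above (a `BertModel` followed by mean pooling over token embeddings) can be approximated with plain `transformers`. This is a minimal sketch rather than an official usage path; it assumes the `hon9kon9ize/yue-embed` checkpoint used in the usage example below:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Checkpoint id taken from the usage example below.
tokenizer = AutoTokenizer.from_pretrained("hon9kon9ize/yue-embed")
model = AutoModel.from_pretrained("hon9kon9ize/yue-embed")

sentences = ["query: is ampulla of vater part of the pancreas"]
batch = tokenizer(sentences, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, seq_len, 1024)

# Mean pooling over non-padding tokens, mirroring the Pooling module printed above.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
print(embeddings.shape)  # torch.Size([1, 1024])
```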
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("hon9kon9ize/yue-embed")
# Run inference
sentences = [
    'query: when did england change from fahrenheit to celsius',
    'document: Metrication in the United Kingdom Adopting the metric system was discussed in Parliament as early as 1818 and some industries and even some government agencies had metricated, or were in the process of metricating by the mid 1960s. A formal government policy to support metrication was agreed by 1965. This policy, initiated in response to requests from industry, was to support voluntary metrication, with costs picked up where they fell. In 1969 the government created the Metrication Board as a quango to promote and coordinate metrication. In 1978, after some carpet retailers reverted to pricing by the square yard rather than the square metre, government policy shifted, and they started issuing orders making metrication mandatory in certain sectors. In 1980 government policy shifted again to prefer voluntary metrication, and the Metrication Board was abolished. By the time the Metrication Board was wound up, all the economic sectors that fell within its remit except road signage and parts of the retail trade sector had metricated.',
    "document: Periodic table Importantly, the organization of the periodic table can be utilized to derive relationships between various element properties, but also predicted chemical properties and behaviours of undiscovered or newly synthesized elements. Russian chemist Dmitri Mendeleev was first to publish a recognizable periodic table in 1869, developed mainly to illustrate periodic trends of the then-known elements. He also predicted some properties of unidentified elements that were expected to fill gaps within this table. Most of his forecasts proved to be correct. Mendeleev's idea has been slowly expanded and refined with the discovery or synthesis of further new elements and by developing new theoretical models to explain chemical behaviour. The modern periodic table now provides a useful framework for analyzing chemical reactions, and continues to be widely adopted in chemistry, nuclear physics and other sciences.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
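Because training applied the prompts `query: ` and `document: ` (see the hyperparameters below), the same prefixes matter at inference time. Rather than concatenating them by hand as above, they can be passed through `encode`'s `prompt` argument; a small retrieval-style sketch with illustrative inputs:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("hon9kon9ize/yue-embed")

queries = ["when did england change from fahrenheit to celsius"]
documents = [
    "Metrication in the United Kingdom Adopting the metric system was discussed in Parliament as early as 1818 ...",
    "Periodic table Importantly, the organization of the periodic table can be utilized to derive relationships ...",
]

# `prompt` is prepended to every input before tokenization.
query_embeddings = model.encode(queries, prompt="query: ")
document_embeddings = model.encode(documents, prompt="document: ")

# Rank documents for each query by cosine similarity.
scores = model.similarity(query_embeddings, document_embeddings)
print(scores.argmax(dim=1))  # index of the best-matching document per query
```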
## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `NanoClimateFEVER`, `NanoDBPedia`, `NanoFEVER`, `NanoFiQA2018`, `NanoHotpotQA`, `NanoMSMARCO`, `NanoNFCorpus`, `NanoNQ`, `NanoQuoraRetrieval`, `NanoSCIDOCS`, `NanoArguAna`, `NanoSciFact` and `NanoTouche2020`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | NanoClimateFEVER | NanoDBPedia | NanoFEVER | NanoFiQA2018 | NanoHotpotQA | NanoMSMARCO | NanoNFCorpus | NanoNQ     | NanoQuoraRetrieval | NanoSCIDOCS | NanoArguAna | NanoSciFact | NanoTouche2020 |
|:--------------------|:-----------------|:------------|:----------|:-------------|:-------------|:------------|:-------------|:-----------|:-------------------|:------------|:------------|:------------|:---------------|
| cosine_accuracy@1   | 0.06             | 0.1         | 0.06      | 0.12         | 0.18         | 0.08        | 0.1          | 0.12       | 0.56               | 0.06        | 0.12        | 0.18        | 0.3469         |
| cosine_accuracy@3   | 0.2              | 0.26        | 0.1       | 0.22         | 0.38         | 0.16        | 0.1          | 0.26       | 0.66               | 0.12        | 0.34        | 0.22        | 0.7143         |
| cosine_accuracy@5   | 0.22             | 0.44        | 0.1       | 0.26         | 0.4          | 0.2         | 0.12         | 0.38       | 0.68               | 0.14        | 0.52        | 0.32        | 0.7959         |
| cosine_accuracy@10  | 0.26             | 0.52        | 0.12      | 0.36         | 0.44         | 0.24        | 0.18         | 0.44       | 0.8                | 0.22        | 0.64        | 0.36        | 0.9388         |
| cosine_precision@1  | 0.06             | 0.1         | 0.06      | 0.12         | 0.18         | 0.08        | 0.1          | 0.12       | 0.56               | 0.06        | 0.12        | 0.18        | 0.3469         |
| cosine_precision@3  | 0.0667           | 0.1267      | 0.0333    | 0.08         | 0.1333       | 0.0533      | 0.06         | 0.0867     | 0.2533             | 0.0533      | 0.1133      | 0.08        | 0.3265         |
| cosine_precision@5  | 0.052            | 0.152       | 0.02      | 0.064        | 0.084        | 0.04        | 0.056        | 0.08       | 0.16               | 0.036       | 0.104       | 0.068       | 0.3061         |
| cosine_precision@10 | 0.032            | 0.154       | 0.012     | 0.046        | 0.052        | 0.024       | 0.042        | 0.048      | 0.092              | 0.026       | 0.064       | 0.04        | 0.2714         |
| cosine_recall@1     | 0.035            | 0.0058      | 0.05      | 0.0709       | 0.09         | 0.08        | 0.0024       | 0.11       | 0.49               | 0.0157      | 0.12        | 0.165       | 0.0173         |
| cosine_recall@3     | 0.105            | 0.0257      | 0.09      | 0.1362       | 0.2          | 0.16        | 0.0045       | 0.24       | 0.6073             | 0.0367      | 0.34        | 0.21        | 0.06           |
| cosine_recall@5     | 0.1267           | 0.0488      | 0.09      | 0.1499       | 0.21         | 0.2         | 0.0053       | 0.37       | 0.634              | 0.0407      | 0.52        | 0.3         | 0.1013         |
| cosine_recall@10    | 0.144            | 0.0818      | 0.11      | 0.2119       | 0.26         | 0.24        | 0.0069       | 0.43       | 0.7407             | 0.0567      | 0.64        | 0.345       | 0.1705         |
| **cosine_ndcg@10**  | **0.1074**       | **0.1565**  | **0.078** | **0.1599**   | **0.2152**   | **0.155**   | **0.0514**   | **0.2669** | **0.6316**         | **0.0544**  | **0.3668**  | **0.2485**  | **0.2934**     |
| cosine_mrr@10       | 0.1231           | 0.223       | 0.0753    | 0.1879       | 0.2793       | 0.1282      | 0.1127       | 0.2195     | 0.6266             | 0.1009      | 0.2815      | 0.2242      | 0.5437         |
| cosine_map@100      | 0.0839           | 0.0848      | 0.0766    | 0.1279       | 0.1695       | 0.1439      | 0.0112       | 0.2213     | 0.6008             | 0.0383      | 0.2897      | 0.2308      | 0.1828         |

#### Nano BEIR

* Dataset: `NanoBEIR_mean`
* Evaluated with [NanoBEIREvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.NanoBEIREvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.1605     |
| cosine_accuracy@3   | 0.2873     |
| cosine_accuracy@5   | 0.352      |
| cosine_accuracy@10  | 0.4245     |
| cosine_precision@1  | 0.1605     |
| cosine_precision@3  | 0.1128     |
| cosine_precision@5  | 0.094      |
| cosine_precision@10 | 0.0695     |
| cosine_recall@1     | 0.0963     |
| cosine_recall@3     | 0.1704     |
| cosine_recall@5     | 0.2151     |
| cosine_recall@10    | 0.2644     |
| **cosine_ndcg@10**  | **0.2142** |
| cosine_mrr@10       | 0.2405     |
| cosine_map@100      | 0.1739     |
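The Nano BEIR numbers above come from the evaluator built into sentence-transformers and can be re-run directly. A minimal sketch, assuming a sentence-transformers version that ships `NanoBEIREvaluator` (v3.4 or later):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

model = SentenceTransformer("hon9kon9ize/yue-embed")

# By default the evaluator runs all 13 NanoBEIR datasets reported above
# and aggregates them into the NanoBEIR_mean metrics.
evaluator = NanoBEIREvaluator()
results = evaluator(model)
print(results["NanoBEIR_mean_cosine_ndcg@10"])
```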
## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 129,371 training samples
* Columns: `query` and `answer`
* Approximate statistics based on the first 1000 samples:
  |         | query  | answer |
  |:--------|:-------|:-------|
  | type    | string | string |
  | details |        |        |
* Samples:
  | query | answer |
  |:------|:-------|
  | query: hotel and restaurant employees and bartenders international union | document: Hotel Employees and Restaurant Employees Union The Hotel Employees and Restaurant Employees Union (HERE) was a United States labor union representing workers of the hospitality industry, formed in 1891. In 2004, HERE merged with the Union of Needletrades, Industrial, and Textile Employees (UNITE) to form UNITE HERE. HERE notably organized the staff of Yale University in 1984. Other major employers that contracted with this union included several large casinos (Harrah's, Caesars Palace, and Wynn Resorts); hotels (Hilton, Hyatt and Starwood), and Walt Disney World. HERE was affiliated with the AFL-CIO. |
  | query: 多肢离断伤的并发症是什么? | document: 失血性休克;血循环危象;急性肾功能衰竭 |
  | query: who is the father of kelly taylor's son on 90210 | document: Kelly Taylor (90210) In 2008, Kelly Taylor returned in the spin-off 90210, now working as a guidance counselor at her alma mater West Beverly Hills High School. It was revealed that in the intervening years, she attained a master's degree and had a son named Sammy with Dylan. She and Dylan ended their relationship soon after. It was also revealed that West Beverly principal Harry Wilson was Kelly's neighbor growing up.[39] |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: NewModel
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01}
  ```

### Evaluation Dataset

#### Unnamed Dataset

* Size: 1,000 evaluation samples
* Columns: `query` and `answer`
* Approximate statistics based on the first 1000 samples:
  |         | query  | answer |
  |:--------|:-------|:-------|
  | type    | string | string |
  | details |        |        |
* Samples:
  | query | answer |
  |:------|:-------|
  | query: 微创经皮肾镜手术的推荐药有些什么? | document: 阿司匹林 |
  | query: why are the fires in ca called the thomas fires | document: Thomas Fire On December 4, 2017, the Thomas Fire was reported at 6:26 p.m. PST,[36] to the north of Santa Paula, near Steckel Park and Thomas Aquinas College,[3][24] after which the fire is named.[37] That night, the small brush fire exploded in size and raced through the rugged mountain terrain that lies west of Santa Paula, between Ventura and Ojai.[19][38] Officials blamed strong Santa Ana winds that gusted up to 60 miles per hour (97 km/h) for the sudden expansion.[28][39] Soon after the fire had started, a second blaze was ignited nearly 30 minutes later, about 4 miles (6.4 km) to the north in Upper Ojai at the top of Koenigstein Road.[40] According to eyewitnesses, this second fire was sparked by an explosion in the power line over the area. The second fire was rapidly expanded by the strong Santa Ana winds, and soon merged into the Thomas Fire later that night.[40] |
  | query: which mountain man rediscovered south pass and brought back important information about this trail | document: Jedediah Smith Jedediah Strong Smith (January 6, 1799 – May 27, 1831), was a clerk, frontiersman, hunter, trapper, author, cartographer, and explorer of the Rocky Mountains, the North American West, and the Southwest during the early 19th century. After 75 years of obscurity following his death, Smith was rediscovered as the American whose explorations led to the use of the 20-mile (32 km)-wide South Pass as the dominant point of crossing the Continental Divide for pioneers on the Oregon Trail. |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: NewModel
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.01}
  ```

### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `learning_rate`: 2e-05
- `num_train_epochs`: 2
- `warmup_ratio`: 0.05
- `seed`: 12
- `bf16`: True
- `prompts`: {'query': 'query: ', 'answer': 'document: '}
- `batch_sampler`: no_duplicates

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 2
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 12
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: {'query': 'query: ', 'answer': 'document: '}
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
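For readers who want to approximate this fine-tune, the loss description and hyperparameters above translate into roughly the following setup. This is a sketch under stated assumptions: the guide model is not named on this card (its printed architecture is a 768-dimensional, CLS-pooled, normalized encoder, so a GTE-style checkpoint stands in as a placeholder), the two-row dataset is illustrative, and evaluation/logging wiring is omitted:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import CachedGISTEmbedLoss
from sentence_transformers.training_args import BatchSamplers, SentenceTransformerTrainingArguments

model = SentenceTransformer("hon9kon9ize/bert-large-cantonese-sts")

# PLACEHOLDER: the actual guide model is not stated on this card.
guide = SentenceTransformer("Alibaba-NLP/gte-multilingual-base", trust_remote_code=True)
loss = CachedGISTEmbedLoss(model, guide=guide, temperature=0.01)

# Illustrative (query, answer) pairs; the real run used 129,371 samples.
train_dataset = Dataset.from_dict({
    "query": [
        "is ampulla of vater part of the pancreas",
        "when did england change from fahrenheit to celsius",
    ],
    "answer": [
        "Ampulla of Vater The ampulla of Vater ... major duodenal papilla.",
        "Metrication in the United Kingdom Adopting the metric system ...",
    ],
})

args = SentenceTransformerTrainingArguments(
    output_dir="yue-embed",
    num_train_epochs=2,
    per_device_train_batch_size=128,
    learning_rate=2e-5,
    warmup_ratio=0.05,
    bf16=True,
    seed=12,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
    prompts={"query": "query: ", "answer": "document: "},
)

trainer = SentenceTransformerTrainer(model=model, args=args, train_dataset=train_dataset, loss=loss)
trainer.train()
```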
### Training Logs
<details><summary>Click to expand</summary>

| Epoch  | Step | Training Loss | Validation Loss | NanoClimateFEVER_cosine_ndcg@10 | NanoDBPedia_cosine_ndcg@10 | NanoFEVER_cosine_ndcg@10 | NanoFiQA2018_cosine_ndcg@10 | NanoHotpotQA_cosine_ndcg@10 | NanoMSMARCO_cosine_ndcg@10 | NanoNFCorpus_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoQuoraRetrieval_cosine_ndcg@10 | NanoSCIDOCS_cosine_ndcg@10 | NanoArguAna_cosine_ndcg@10 | NanoSciFact_cosine_ndcg@10 | NanoTouche2020_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
|:------:|:----:|:-------------:|:---------------:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.0010 | 1    | 31.7042       | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0049 | 5    | 32.9433       | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0099 | 10   | 27.0338       | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0148 | 15   | 18.1598       | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0198 | 20   | 12.5771       | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0247 | 25   | 8.6872        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0297 | 30   | 6.0455        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0346 | 35   | 5.1917        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0396 | 40   | 4.8424        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0445 | 45   | 4.4785        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0495 | 50   | 4.1896        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0544 | 55   | 4.2621        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0593 | 60   | 3.8401        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0643 | 65   | 3.9482        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0692 | 70   | 3.7762        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0742 | 75   | 3.4895        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0791 | 80   | 3.5892        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0841 | 85   | 3.5312        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0890 | 90   | 3.3244        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0940 | 95   | 3.4369        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0989 | 100  | 3.1867        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1039 | 105  | 3.1734        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1088 | 110  | 3.2156        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1137 | 115  | 2.8888        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1187 | 120  | 2.8613        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1236 | 125  | 2.8905        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1286 | 130  | 2.5984        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1335 | 135  | 2.6853        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1385 | 140  | 2.7013        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1434 | 145  | 2.5577        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1484 | 150  | 2.6287        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1533 | 155  | 2.6481        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1583 | 160  | 2.7741        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1632 | 165  | 2.5738        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1682 | 170  | 2.5335        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1731 | 175  | 2.531         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1780 | 180  | 2.437         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1830 | 185  | 2.4836        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1879 | 190  | 2.4642        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1929 | 195  | 2.399         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1978 | 200  | 2.3896        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2028 | 205  | 2.3738        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2077 | 210  | 2.5518        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2127 | 215  | 2.4836        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2176 | 220  | 2.2157        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2226 | 225  | 2.2986        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2275 | 230  | 2.4967        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2324 | 235  | 2.121         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2374 | 240  | 2.4301        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2423 | 245  | 2.5054        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2473 | 250  | 2.3213        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2522 | 255  | 2.1182        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2572 | 260  | 2.2966        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2621 | 265  | 2.2662        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2671 | 270  | 2.3188        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2720 | 275  | 2.1836        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2770 | 280  | 2.2206        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2819 | 285  | 2.3144        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2868 | 290  | 2.2496        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2918 | 295  | 1.9909        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2967 | 300  | 2.1294        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3017 | 305  | 2.119         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3066 | 310  | 2.0076        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3116 | 315  | 2.127         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3165 | 320  | 2.1309        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3215 | 325  | 2.0868        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3264 | 330  | 1.9429        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3314 | 335  | 1.9           | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3363 | 340  | 1.82          | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3412 | 345  | 1.9731        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3462 | 350  | 2.0156        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3511 | 355  | 2.0106        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3561 | 360  | 1.9383        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3610 | 365  | 2.0491        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3660 | 370  | 1.8893        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3709 | 375  | 1.958         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3759 | 380  | 1.9821        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3808 | 385  | 2.024         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3858 | 390  | 2.0182        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3907 | 395  | 1.9659        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3956 | 400  | 1.8339        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4006 | 405  | 1.9081        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4055 | 410  | 1.7876        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4105 | 415  | 1.8371        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4154 | 420  | 1.8274        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4204 | 425  | 1.7863        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4253 | 430  | 1.9064        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4303 | 435  | 1.7721        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4352 | 440  | 1.7162        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4402 | 445  | 1.9112        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4451 | 450  | 1.9384        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4500 | 455  | 1.8096        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4550 | 460  | 1.7145        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4599 | 465  | 1.784         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4649 | 470  | 1.9506        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4698 | 475  | 1.7243        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4748 | 480  | 1.8003        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4797 | 485  | 1.7568        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4847 | 490  | 1.5696        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4896 | 495  | 1.8973        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4946 | 500  | 1.6981        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4995 | 505  | 1.7616        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5045 | 510  | 1.6573        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5094 | 515  | 1.8685        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5143 | 520  | 1.8532        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5193 | 525  | 1.7603        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5242 | 530  | 1.7636        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5292 | 535  | 1.4829        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5341 | 540  | 1.6959        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5391 | 545  | 1.6389        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5440 | 550  | 1.6624        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5490 | 555  | 1.8193        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5539 | 560  | 1.7144        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5589 | 565  | 1.4954        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5638 | 570  | 1.6659        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5687 | 575  | 1.669         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5737 | 580  | 1.6931        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5786 | 585  | 1.6894        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5836 | 590  | 1.6437        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5885 | 595  | 1.7259        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5935 | 600  | 1.7937        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5984 | 605  | 1.7279        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6034 | 610  | 1.6769        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6083 | 615  | 1.4731        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6133 | 620  | 1.6466        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6182 | 625  | 1.6954        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6231 | 630  | 1.6224        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6281 | 635  | 1.62          | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6330 | 640  | 1.5795        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6380 | 645  | 1.5245        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6429 | 650  | 1.7629        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6479 | 655  | 1.5767        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6528 | 660  | 1.6749        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6578 | 665  | 1.5602        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6627 | 670  | 1.6768        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6677 | 675  | 1.8311        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6726 | 680  | 1.5973        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6775 | 685  | 1.5066        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6825 | 690  | 1.6036        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6874 | 695  | 1.7857        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6924 | 700  | 1.4387        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6973 | 705  | 1.5886        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7023 | 710  | 1.551         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7072 | 715  | 1.5561        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7122 | 720  | 1.4458        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7171 | 725  | 1.5703        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7221 | 730  | 1.6162        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7270 | 735  | 1.5643        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7319 | 740  | 1.4894        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7369 | 745  | 1.6413        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7418 | 750  | 1.5406        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7468 | 755  | 1.5185        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7517 | 760  | 1.488         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7567 | 765  | 1.5041        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7616 | 770  | 1.4665        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7666 | 775  | 1.5252        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7715 | 780  | 1.4925        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7765 | 785  | 1.3833        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7814 | 790  | 1.3808        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7864 | 795  | 1.5468        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7913 | 800  | 1.5317        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7962 | 805  | 1.5385        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8012 | 810  | 1.4012        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8061 | 815  | 1.5531        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8111 | 820  | 1.6032        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8160 | 825  | 1.4053        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8210 | 830  | 1.5082        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8259 | 835  | 1.5559        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8309 | 840  | 1.4286        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8358 | 845  | 1.4336        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8408 | 850  | 1.3731        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8457 | 855  | 1.5706        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8506 | 860  | 1.4184        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8556 | 865  | 1.4312        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8605 | 870  | 1.4364        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8655 | 875  | 1.5605        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8704 | 880  | 1.4219        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8754 | 885  | 1.4082        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8803 | 890  | 1.3846        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8853 | 895  | 1.4292        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8902 | 900  | 1.4195        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8952 | 905  | 1.5103        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9001 | 910  | 1.5041        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9050 | 915  | 1.427         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9100 | 920  | 1.4385        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9149 | 925  | 1.298         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9199 | 930  | 1.4499        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9248 | 935  | 1.4752        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9298 | 940  | 1.4752        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9347 | 945  | 1.3705        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9397 | 950  | 1.4567        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9446 | 955  | 1.3364        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9496 | 960  | 1.376         | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9545 | 965  | 1.35          | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9594 | 970  | 1.5841        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9644 | 975  | 1.3449        | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |

</details>
| | 0.9693 | 980 | 1.2132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.9743 | 985 | 1.3414 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.9792 | 990 | 1.5148 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.9842 | 995 | 1.3866 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.9891 | 1000 | 1.2051 | 1.3370 | 0.0906 | 0.1578 | 0.0712 | 0.1504 | 0.1887 | 0.1554 | 0.0466 | 0.2528 | 0.6197 | 0.0672 | 0.2857 | 0.2291 | 0.2718 | 0.1990 | | 0.9941 | 1005 | 1.3021 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 0.9990 | 1010 | 1.391 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0040 | 1015 | 1.1452 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0089 | 1020 | 1.3989 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0138 | 1025 | 1.2142 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0188 | 1030 | 1.2472 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0237 | 1035 | 1.3058 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0287 | 1040 | 1.2643 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0336 | 1045 | 1.2581 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0386 | 1050 | 1.2434 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0435 | 1055 | 1.1874 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0485 | 1060 | 1.0421 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0534 | 1065 | 1.3834 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0584 | 1070 | 1.3279 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0633 | 1075 | 1.3779 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0682 | 1080 | 1.3071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0732 | 1085 | 1.1569 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0781 | 1090 | 1.2427 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0831 | 1095 | 1.1607 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0880 | 1100 | 1.2691 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0930 | 1105 | 1.2936 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.0979 | 1110 | 1.2527 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1029 | 1115 | 1.1143 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1078 | 1120 | 1.1508 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1128 | 1125 | 1.1627 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1177 | 1130 | 0.9774 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1227 | 1135 | 1.1827 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1276 | 1140 | 0.9429 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1325 | 1145 | 1.0029 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1375 | 1150 | 1.0764 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1424 | 1155 | 1.0555 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1474 | 1160 | 1.0559 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1523 | 1165 | 1.0081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1573 | 1170 | 1.1928 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1622 | 1175 | 1.0774 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 
1.1672 | 1180 | 0.9185 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1721 | 1185 | 1.0838 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1771 | 1190 | 0.9981 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1820 | 1195 | 1.0395 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1869 | 1200 | 0.9522 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1919 | 1205 | 0.9652 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.1968 | 1210 | 1.0276 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2018 | 1215 | 0.9663 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2067 | 1220 | 1.1356 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2117 | 1225 | 1.159 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2166 | 1230 | 0.8575 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2216 | 1235 | 0.9134 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2265 | 1240 | 1.1889 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2315 | 1245 | 0.935 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2364 | 1250 | 0.975 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2413 | 1255 | 1.073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2463 | 1260 | 1.0709 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2512 | 1265 | 0.9241 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2562 | 1270 | 1.0101 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2611 | 1275 | 1.1451 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2661 | 1280 | 1.0501 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2710 | 1285 | 0.9724 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2760 | 1290 | 0.9222 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2809 | 1295 | 1.086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2859 | 1300 | 0.973 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2908 | 1305 | 0.9287 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.2957 | 1310 | 0.9051 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3007 | 1315 | 0.9531 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3056 | 1320 | 0.9605 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3106 | 1325 | 0.8778 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3155 | 1330 | 0.9399 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3205 | 1335 | 0.9185 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3254 | 1340 | 0.9078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3304 | 1345 | 0.8266 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3353 | 1350 | 0.8186 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3403 | 1355 | 0.9394 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3452 | 1360 | 1.0972 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3501 | 1365 | 0.8895 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3551 | 1370 | 0.8678 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3600 | 1375 | 0.9493 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3650 | 1380 | 0.8449 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | 
- | | 1.3699 | 1385 | 0.917 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3749 | 1390 | 0.8899 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3798 | 1395 | 0.9516 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3848 | 1400 | 0.9538 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3897 | 1405 | 0.9964 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3947 | 1410 | 0.9123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.3996 | 1415 | 0.86 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4045 | 1420 | 0.9382 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4095 | 1425 | 0.764 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4144 | 1430 | 0.9161 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4194 | 1435 | 0.937 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4243 | 1440 | 0.8487 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4293 | 1445 | 0.7928 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4342 | 1450 | 0.8586 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4392 | 1455 | 0.9355 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4441 | 1460 | 0.965 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4491 | 1465 | 0.9019 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4540 | 1470 | 0.8624 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4590 | 1475 | 0.8204 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4639 | 1480 | 1.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4688 | 1485 | 0.9222 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4738 | 1490 | 0.9182 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4787 | 1495 | 0.8247 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4837 | 1500 | 0.7746 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4886 | 1505 | 0.882 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4936 | 1510 | 0.8482 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.4985 | 1515 | 0.9623 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5035 | 1520 | 0.8804 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5084 | 1525 | 0.8874 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5134 | 1530 | 0.9747 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5183 | 1535 | 0.8805 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5232 | 1540 | 0.8776 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5282 | 1545 | 0.7627 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5331 | 1550 | 0.8975 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5381 | 1555 | 0.8213 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5430 | 1560 | 0.9472 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5480 | 1565 | 0.9379 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5529 | 1570 | 0.9312 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5579 | 1575 | 0.7866 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5628 | 1580 | 0.8629 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5678 | 1585 | 0.8156 | - | - | - | - | - | - | - | - | - | - | - | - | - 
| - | - | | 1.5727 | 1590 | 0.8737 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5776 | 1595 | 0.942 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5826 | 1600 | 0.8167 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5875 | 1605 | 0.9468 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5925 | 1610 | 0.9117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.5974 | 1615 | 1.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6024 | 1620 | 0.8357 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6073 | 1625 | 0.8372 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6123 | 1630 | 0.905 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6172 | 1635 | 0.9265 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6222 | 1640 | 0.846 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6271 | 1645 | 0.7729 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6320 | 1650 | 0.7885 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6370 | 1655 | 0.8717 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6419 | 1660 | 0.9845 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6469 | 1665 | 0.8286 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6518 | 1670 | 0.8979 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6568 | 1675 | 0.8502 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6617 | 1680 | 0.9423 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6667 | 1685 | 1.0128 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6716 | 1690 | 0.8535 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6766 | 1695 | 0.737 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6815 | 1700 | 0.9871 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6864 | 1705 | 0.8828 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6914 | 1710 | 0.8178 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.6963 | 1715 | 0.7703 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7013 | 1720 | 0.8739 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7062 | 1725 | 0.8582 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7112 | 1730 | 0.9181 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7161 | 1735 | 0.8801 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7211 | 1740 | 0.8009 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7260 | 1745 | 0.9779 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7310 | 1750 | 0.7777 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7359 | 1755 | 0.7864 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7409 | 1760 | 1.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7458 | 1765 | 0.7776 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7507 | 1770 | 0.8122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7557 | 1775 | 0.8025 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7606 | 1780 | 0.7559 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7656 | 1785 | 0.8819 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7705 | 1790 | 0.8901 | - | - | - | - | - | - | - | - | - | - | - 
| - | - | - | - | | 1.7755 | 1795 | 0.7598 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7804 | 1800 | 0.7542 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7854 | 1805 | 0.8178 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7903 | 1810 | 0.8374 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.7953 | 1815 | 0.8363 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8002 | 1820 | 0.8177 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8051 | 1825 | 0.9488 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8101 | 1830 | 0.9959 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8150 | 1835 | 0.7942 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8200 | 1840 | 0.8747 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8249 | 1845 | 0.9053 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8299 | 1850 | 0.7853 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8348 | 1855 | 0.838 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8398 | 1860 | 0.7732 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8447 | 1865 | 0.8613 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8497 | 1870 | 0.791 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8546 | 1875 | 0.8203 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8595 | 1880 | 0.7558 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8645 | 1885 | 0.9918 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8694 | 1890 | 0.8272 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8744 | 1895 | 0.8552 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8793 | 1900 | 0.8135 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8843 | 1905 | 0.8297 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8892 | 1910 | 0.7844 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8942 | 1915 | 0.8466 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.8991 | 1920 | 0.9099 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9041 | 1925 | 0.8139 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9090 | 1930 | 0.8628 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9139 | 1935 | 0.6778 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9189 | 1940 | 0.8251 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9238 | 1945 | 0.8915 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9288 | 1950 | 0.8136 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9337 | 1955 | 0.8879 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9387 | 1960 | 0.8758 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9436 | 1965 | 0.8153 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9486 | 1970 | 0.7253 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9535 | 1975 | 0.8493 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9585 | 1980 | 1.0186 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9634 | 1985 | 0.8412 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9683 | 1990 | 0.7027 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9733 | 1995 | 0.744 | - | - | - | - | - | - | - | - | 
- | - | - | - | - | - | - | | 1.9782 | 2000 | 0.9555 | 1.1452 | 0.1064 | 0.1577 | 0.0780 | 0.1597 | 0.2144 | 0.1550 | 0.0513 | 0.2643 | 0.6316 | 0.0525 | 0.3670 | 0.2485 | 0.2937 | 0.2139 | | 1.9832 | 2005 | 0.9095 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9881 | 2010 | 0.7378 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9931 | 2015 | 0.8024 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 1.9980 | 2020 | 0.9107 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | | 2.0 | 2022 | - | - | 0.1074 | 0.1565 | 0.0780 | 0.1599 | 0.2152 | 0.1550 | 0.0514 | 0.2669 | 0.6316 | 0.0544 | 0.3668 | 0.2485 | 0.2934 | 0.2142 |
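
The trained checkpoint behind these logs can be loaded like any Sentence Transformers model. Below is a minimal, hedged inference sketch: the repository id is a placeholder (substitute this model's actual Hub id or a local checkpoint path), and the `query:` / `document:` prefixes follow the convention visible in this card's training examples.

```python
# Minimal retrieval sketch (not part of the original card).
from sentence_transformers import SentenceTransformer

# Placeholder id -- replace with the actual model id or a local path.
model = SentenceTransformer("your-username/your-finetuned-model")

# Training pairs use "query: " / "document: " prefixes, so the same
# prefixes are applied at inference time.
queries = ["query: which city is the capital of France"]
documents = [
    "document: Paris is the capital and most populous city of France.",
    "document: The Seine is a river in northern France.",
]

query_embeddings = model.encode(queries)
document_embeddings = model.encode(documents)

# Cosine similarities between each query and each document, shape [1, 2].
scores = model.similarity(query_embeddings, document_embeddings)
print(scores)
```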
### Framework Versions
- Python: 3.11.2
- Sentence Transformers: 3.3.1
- Transformers: 4.47.1
- PyTorch: 2.4.0+cu121
- Accelerate: 1.0.1
- Datasets: 3.1.0
- Tokenizers: 0.21.0

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
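
As a hedged aid to reproducibility, the snippet below compares an installed environment against the versions listed under Framework Versions above; it is illustrative only and not part of the released training code.

```python
# Reproducibility check sketch (assumption: version pinning matters for exact
# reproduction; this script is not from the original card).
from importlib.metadata import PackageNotFoundError, version

expected = {
    "sentence-transformers": "3.3.1",
    "transformers": "4.47.1",
    "torch": "2.4.0+cu121",
    "accelerate": "1.0.1",
    "datasets": "3.1.0",
    "tokenizers": "0.21.0",
}

for package, wanted in expected.items():
    try:
        installed = version(package)
    except PackageNotFoundError:
        installed = "not installed"
    marker = "OK" if installed == wanted else "differs"
    print(f"{package}: expected {wanted}, found {installed} ({marker})")
```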