SentenceTransformer based on seongil-dn/unsupervised_20m_3800
This is a sentence-transformers model finetuned from seongil-dn/unsupervised_20m_3800. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: seongil-dn/unsupervised_20m_3800
- Maximum Sequence Length: 1024 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
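Because the final Normalize() module scales every CLS-pooled embedding to unit L2 norm, cosine similarity between two embeddings reduces to a plain dot product. A minimal NumPy sketch of this property, using random stand-in vectors rather than real model outputs:

```python
import numpy as np

# Stand-ins for two model outputs; real embeddings would come from model.encode(...)
rng = np.random.default_rng(0)
a = rng.standard_normal(1024)
b = rng.standard_normal(1024)

# Mimic the Normalize() module: rescale each vector to unit L2 norm
a /= np.linalg.norm(a)
b /= np.linalg.norm(b)

# For unit vectors, the dot product equals cosine similarity
cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
assert abs(np.dot(a, b) - cos) < 1e-9
```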
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("seongil-dn/bge-m3-420")
# Run inference
sentences = [
'Peter Stuyvesant, born in Holland, became Governor of which American city in 1647?',
'Peter Stuyvesant (cigarette) half of its regular users" and called the packaging changes "the ultimate sick joke from big tobacco". In 2013, it was reported that Imperial Tobacco Australia had sent marketing material to WA tobacco retailers which promotes limited edition packs of "Peter Stuyvesant + Loosie", which came with 26 cigarettes. The material included images of a young woman with pink hair putting on lipstick and men on the streets of New York and also included a calendar and small poster that were clearly intended to glamorise smoking. Anti-smoking campaigner Mike Daube said although the material did not break the law because',
'Peter Stuyvesant (cigarette) can amount to millions of dollars and finally criminal prosecution - if companies wilfully break the laws. However last year, when questioned on why no such action was being pursued against Imperial Tobacco a spokeswoman for Federal Health said: "No instances of non-compliance with the Act have been identified by the Department that warrant the initiation of Court proceedings in the first instance, and without attempting alternative dispute resolution to achieve compliance". Peter Stuyvesant is or was sold in the following countries: Canada, United States, United Kingdom, Luxembourg, Belgium, The Netherlands, Germany, France, Austria, Switzerland, Spain, Italy, Czech Republic, Greece,',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
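The similarity matrix above can be used directly for ranking, e.g. picking which passage best matches the question. A sketch of that step, using small dummy unit vectors in place of the real 1024-dimensional embeddings:

```python
import numpy as np

# Dummy embeddings standing in for model.encode(sentences); row 0 is the query
emb = np.array([
    [0.9, 0.1, 0.1],   # query
    [0.8, 0.2, 0.1],   # passage A (close to the query)
    [0.1, 0.9, 0.2],   # passage B (far from the query)
])
emb /= np.linalg.norm(emb, axis=1, keepdims=True)  # model output is already normalized

similarities = emb @ emb.T                       # cosine similarity matrix, shape (3, 3)
best = int(np.argmax(similarities[0, 1:])) + 1   # highest-scoring passage for the query
print(best)  # → 1 (passage A)
```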
Training Details
Training Dataset
Unnamed Dataset
- Size: 1,138,596 training samples
- Columns: anchor, positive, negative, negative_2, negative_3, negative_4, and negative_5
- Approximate statistics based on the first 1000 samples:

 | anchor | positive | negative | negative_2 | negative_3 | negative_4 | negative_5 |
---|---|---|---|---|---|---|---|
type | string | string | string | string | string | string | string |
min | 9 tokens | 127 tokens | 122 tokens | 122 tokens | 122 tokens | 124 tokens | 121 tokens |
mean | 22.32 tokens | 157.45 tokens | 154.65 tokens | 155.52 tokens | 156.04 tokens | 156.3 tokens | 156.15 tokens |
max | 119 tokens | 420 tokens | 212 tokens | 218 tokens | 284 tokens | 268 tokens | 249 tokens |
- Samples:
- anchor: What African country is projected to pass the United States in population by the year 2055?
- positive: African immigration to the United States officially 40,000 African immigrants, although it has been estimated that the population is actually four times this number when considering undocumented immigrants. The majority of these immigrants were born in Ethiopia, Egypt, Nigeria, and South Africa. African immigrants like many other immigrant groups are likely to establish and find success in small businesses. Many Africans that have seen the social and economic stability that comes from ethnic enclaves such as Chinatowns have recently been establishing ethnic enclaves of their own at much higher rates to reap the benefits of such communities. Such examples include Little Ethiopia in Los Angeles and
- negative: What Will Happen to the Gang Next Year? watching television at the time of the broadcast. This made it the lowest-rated episode in "30 Rock"'s history. and a decrease from the previous episode "The Return of Avery Jessup" (2.92 million) What Will Happen to the Gang Next Year? "What Will Happen to the Gang Next Year?" is the twenty-second and final episode of the sixth season of the American television comedy series "30 Rock", and the 125th overall episode of the series. It was directed by Michael Engler, and written by Matt Hubbard. The episode originally aired on the National Broadcasting Company (NBC) network in the United States
- negative_2: Christianity in the United States Christ is the fifth-largest denomination, the largest Pentecostal church, and the largest traditionally African-American denomination in the nation. Among Eastern Christian denominations, there are several Eastern Orthodox and Oriental Orthodox churches, with just below 1 million adherents in the US, or 0.4% of the total population. Christianity was introduced to the Americas as it was first colonized by Europeans beginning in the 16th and 17th centuries. Going forward from its foundation, the United States has been called a Protestant nation by a variety of sources. Immigration further increased Christian numbers. Today most Christian churches in the United States are either
- negative_3: What Will Happen to the Gang Next Year? What Will Happen to the Gang Next Year? "What Will Happen to the Gang Next Year?" is the twenty-second and final episode of the sixth season of the American television comedy series "30 Rock", and the 125th overall episode of the series. It was directed by Michael Engler, and written by Matt Hubbard. The episode originally aired on the National Broadcasting Company (NBC) network in the United States on May 17, 2012. In the episode, Jack (Alec Baldwin) and Avery (Elizabeth Banks) seek to renew their vows; Criss (James Marsden) sets out to show Liz (Tina Fey) he can pay
- negative_4: History of the Jews in the United States Representatives by Rep. Samuel Dickstein (D; New York). This also failed to pass. During the Holocaust, fewer than 30,000 Jews a year reached the United States, and some were turned away due to immigration policies. The U.S. did not change its immigration policies until 1948. Currently, laws requiring teaching of the Holocaust are on the books in five states. The Holocaust had a profound impact on the community in the United States, especially after 1960, as Jews tried to comprehend what had happened, and especially to commemorate and grapple with it when looking to the future. Abraham Joshua Heschel summarized
- negative_5: Public holidays in the United States will have very few customers that day. The labor force in the United States comprises about 62% (as of 2014) of the general population. In the United States, 97% of the private sector businesses determine what days this sector of the population gets paid time off, according to a study by the Society for Human Resource Management. The following holidays are observed by the majority of US businesses with paid time off: This list of holidays is based off the official list of federal holidays by year from the US Government. The holidays however are at the discretion of employers
- anchor: Which is the largest species of the turtle family?
- positive: Loggerhead sea turtle turtle is debated, but most authors consider it a single polymorphic species. Molecular genetics has confirmed hybridization of the loggerhead sea turtle with the Kemp's ridley sea turtle, hawksbill sea turtle, and green sea turtles. The extent of natural hybridization is not yet determined; however, second-generation hybrids have been reported, suggesting some hybrids are fertile. Although evidence is lacking, modern sea turtles probably descended from a single common ancestor during the Cretaceous period. Like all other sea turtles except the leatherback, loggerheads are members of the ancient family Cheloniidae, and appeared about 40 million years ago. Of the six species
- negative: Convention on the Conservation of Migratory Species of Wild Animals take joint action. At May 2018, there were 126 Parties to the Convention. The CMS Family covers a great diversity of migratory species. The Appendices of CMS include many mammals, including land mammals, marine mammals and bats; birds; fish; reptiles and one insect. Among the instruments, AEWA covers 254 species of birds that are ecologically dependent on wetlands for at least part of their annual cycle. EUROBATS covers 52 species of bat, the Memorandum of Understanding on the Conservation of Migratory Sharks seven species of shark, the IOSEA Marine Turtle MOU six species of marine turtle and the Raptors MoU
- negative_2: Razor-backed musk turtle Razor-backed musk turtle The razor-backed musk turtle ("Sternotherus carinatus") is a species of turtle in the family Kinosternidae. The species is native to the southern United States. There are no subspecies that are recognized as being valid. "S. carinatus" is found in the states of Alabama, Arkansas, Louisiana, Mississippi, Oklahoma, and Texas. The razor-backed musk turtle grows to a straight carapace length of about . It has a brown-colored carapace, with black markings at the edges of each scute. The carapace has a distinct, sharp keel down the center of its length, giving the species its common name. The body
- negative_3: African helmeted turtle African helmeted turtle The African helmeted turtle ("Pelomedusa subrufa"), also known commonly as the marsh terrapin, the crocodile turtle, or in the pet trade as the African side-necked turtle, is a species of omnivorous side-necked terrapin in the family Pelomedusidae. The species naturally occurs in fresh and stagnant water bodies throughout much of Sub-Saharan Africa, and in southern Yemen. The marsh terrapin is typically a rather small turtle, with most individuals being less than in straight carapace length, but one has been recorded with a length of . It has a black or brown carapace. The top of the tail
- negative_4: Box turtle Box turtle Box turtles are North American turtles of the genus Terrapene. Although box turtles are superficially similar to tortoises in terrestrial habits and overall appearance, they are actually members of the American pond turtle family (Emydidae). The twelve taxa which are distinguished in the genus are distributed over four species. They are largely characterized by having a domed shell, which is hinged at the bottom, allowing the animal to close its shell tightly to escape predators. The genus name "Terrapene" was coined by Merrem in 1820 as a genus separate from "Emys" for those species which had a sternum
- negative_5: Vallarta mud turtle Vallarta mud turtle The Vallarta mud turtle ("Kinosternon vogti") is a recently identified species of mud turtle in the family Kinosternidae. While formerly considered conspecific with the Jalisco mud turtle, further studies indicated that it was a separate species. It can be identified by a combination of the number of plastron and carapace scutes, body size, and the distinctive yellow rostral shield in males. It is endemic to Mexican state of Jalisco. It is only known from a few human-created or human-affected habitats (such as small streams and ponds) found around Puerto Vallarta. It is one of only 3 species
- anchor: How many gallons of beer are in an English barrel?
- positive: Low-alcohol beer Prohibition in the United States. Near beer could not legally be labeled as "beer" and was officially classified as a "cereal beverage". The public, however, almost universally called it "near beer". The most popular "near beer" was Bevo, brewed by the Anheuser-Busch company. The Pabst company brewed "Pablo", Miller brewed "Vivo", and Schlitz brewed "Famo". Many local and regional breweries stayed in business by marketing their own near-beers. By 1921 production of near beer had reached over 300 million US gallons (1 billion L) a year (36 L/s). A popular illegal practice was to add alcohol to near beer. The
- negative: Keg terms "half-barrel" and "quarter-barrel" are derived from the U.S. beer barrel, legally defined as being equal to 31 U.S. gallons (this is not the same volume as some other units also known as "barrels"). A 15.5 U.S. gallon keg is also equal to: However, beer kegs can come in many sizes: In European countries the most common keg size is 50 liters. This includes the UK, which uses a non-metric standard keg of 11 imperial gallons, which is coincidentally equal to . The German DIN 6647-1 and DIN 6647-2 have also defined kegs in the sizes of 30 and 20
- negative_2: Beer in Chile craft beers. They are generally low or very low volume producers. In Chile there are more than 150 craft beer producers distributed along the 15 Chilean Regions. The list below includes: Beer in Chile The primary beer brewed and consumed in Chile is pale lager, though the country also has a tradition of brewing corn beer, known as chicha. Chile’s beer history has a strong German influence – some of the bigger beer producers are from the country’s southern lake district, a region populated by a great number of German immigrants during the 19th century. Chile also produces English ale-style
- negative_3: Barrel variation. In modern times, produce barrels for all dry goods, excepting cranberries, contain 7,056 cubic inches, about 115.627 L. Barrel A barrel, cask, or tun is a hollow cylindrical container, traditionally made of wooden staves bound by wooden or metal hoops. Traditionally, the barrel was a standard size of measure referring to a set capacity or weight of a given commodity. For example, in the UK a barrel of beer refers to a quantity of . Wine was shipped in barrels of . Modern wooden barrels for wine-making are either made of French common oak ("Quercus robur") and white oak
- negative_4: The Rare Barrel The Rare Barrel is a brewery and brewpub in Berkeley, California, United States, that exclusively produces sour beers. Founders Jay Goodwin and Alex Wallash met while attending UCSB. They started home-brewing in their apartment and decided that they would one day start a brewery together. Goodwin started working at The Bruery, where he worked his way from a production assistant to brewer, eventually becoming the head of their barrel aging program. The Rare Barrel brewed its first batch of beer in February 2013, and opened its tasting room on December 27, 2013. The Rare Barrel was named
- negative_5: Barrel (unit) A barrel is one of several units of volume applied in various contexts; there are dry barrels, fluid barrels (such as the UK beer barrel and US beer barrel), oil barrels and so on. For historical reasons the volumes of some barrel units are roughly double the volumes of others; volumes in common usage range from about . In many connections the term "drum" is used almost interchangeably with "barrel". Since medieval times the term barrel as a unit of measure has had various meanings throughout Europe, ranging from about 100 litres to 1000 litres. The name was
- Loss: CachedGISTEmbedLoss with these parameters:

{'guide': SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
), 'temperature': 0.01}
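For reference, a loss of this kind is typically instantiated roughly as follows. This is a sketch based on the parameters above, not the original training script; the choice of checkpoint for the guide model is an assumption.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CachedGISTEmbedLoss

# Model being trained and the guide model that filters false negatives
model = SentenceTransformer("seongil-dn/unsupervised_20m_3800")
guide = SentenceTransformer("seongil-dn/unsupervised_20m_3800")  # assumed guide checkpoint

# temperature=0.01 matches the loss parameters listed above
loss = CachedGISTEmbedLoss(model=model, guide=guide, temperature=0.01)
```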
Training Hyperparameters
Non-Default Hyperparameters
- per_device_train_batch_size: 1024
- learning_rate: 3e-05
- weight_decay: 0.01
- warmup_ratio: 0.05
- bf16: True
- batch_sampler: no_duplicates
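These non-default values map onto the Sentence Transformers training arguments roughly as follows (a sketch of the configuration, not the original training script; the output directory name is hypothetical):

```python
from sentence_transformers.training_args import (
    BatchSamplers,
    SentenceTransformerTrainingArguments,
)

args = SentenceTransformerTrainingArguments(
    output_dir="bge-m3-420",           # hypothetical output path
    per_device_train_batch_size=1024,
    learning_rate=3e-5,
    weight_decay=0.01,
    warmup_ratio=0.05,
    bf16=True,
    num_train_epochs=3,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid in-batch duplicates for contrastive loss
)
```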
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: no
- prediction_loss_only: True
- per_device_train_batch_size: 1024
- per_device_eval_batch_size: 8
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 3e-05
- weight_decay: 0.01
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 3
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.05
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: True
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
Training Logs
Epoch | Step | Training Loss |
---|---|---|
0.0036 | 1 | 1.0283 |
0.0072 | 2 | 1.0155 |
0.0108 | 3 | 0.9858 |
0.0144 | 4 | 0.9519 |
0.0181 | 5 | 0.9434 |
0.0217 | 6 | 0.898 |
0.0253 | 7 | 0.8798 |
0.0289 | 8 | 0.7976 |
0.0325 | 9 | 0.7797 |
0.0361 | 10 | 0.7464 |
0.0397 | 11 | 0.743 |
0.0433 | 12 | 0.716 |
0.0469 | 13 | 0.7076 |
0.0505 | 14 | 0.666 |
0.0542 | 15 | 0.631 |
0.0578 | 16 | 0.5905 |
0.0614 | 17 | 0.6537 |
0.0650 | 18 | 0.5755 |
0.0686 | 19 | 0.5422 |
0.0722 | 20 | 0.5393 |
0.0758 | 21 | 0.5741 |
0.0794 | 22 | 0.498 |
0.0830 | 23 | 0.5522 |
0.0866 | 24 | 0.5592 |
0.0903 | 25 | 0.4797 |
0.0939 | 26 | 0.4684 |
0.0975 | 27 | 0.5207 |
0.1011 | 28 | 0.4692 |
0.1047 | 29 | 0.4459 |
0.1083 | 30 | 0.4439 |
0.1119 | 31 | 0.4656 |
0.1155 | 32 | 0.4737 |
0.1191 | 33 | 0.4391 |
0.1227 | 34 | 0.4386 |
0.1264 | 35 | 0.4107 |
0.1300 | 36 | 0.4513 |
0.1336 | 37 | 0.3789 |
0.1372 | 38 | 0.4103 |
0.1408 | 39 | 0.3929 |
0.1444 | 40 | 0.4226 |
0.1480 | 41 | 0.391 |
0.1516 | 42 | 0.3674 |
0.1552 | 43 | 0.3607 |
0.1588 | 44 | 0.3738 |
0.1625 | 45 | 0.3842 |
0.1661 | 46 | 0.3498 |
0.1697 | 47 | 0.3586 |
0.1733 | 48 | 0.3538 |
0.1769 | 49 | 0.3572 |
0.1805 | 50 | 0.3547 |
0.1841 | 51 | 0.3179 |
0.1877 | 52 | 0.3436 |
0.1913 | 53 | 0.3502 |
0.1949 | 54 | 0.3381 |
0.1986 | 55 | 0.3547 |
0.2022 | 56 | 0.3362 |
0.2058 | 57 | 0.3407 |
0.2094 | 58 | 0.31 |
0.2130 | 59 | 0.3039 |
0.2166 | 60 | 0.3362 |
0.2202 | 61 | 0.2948 |
0.2238 | 62 | 0.3429 |
0.2274 | 63 | 0.3096 |
0.2310 | 64 | 0.35 |
0.2347 | 65 | 0.2997 |
0.2383 | 66 | 0.3258 |
0.2419 | 67 | 0.3376 |
0.2455 | 68 | 0.3213 |
0.2491 | 69 | 0.3185 |
0.2527 | 70 | 0.3282 |
0.2563 | 71 | 0.2988 |
0.2599 | 72 | 0.33 |
0.2635 | 73 | 0.3066 |
0.2671 | 74 | 0.3303 |
0.2708 | 75 | 0.3067 |
0.2744 | 76 | 0.2996 |
0.2780 | 77 | 0.3063 |
0.2816 | 78 | 0.3235 |
0.2852 | 79 | 0.2902 |
0.2888 | 80 | 0.302 |
0.2924 | 81 | 0.3223 |
0.2960 | 82 | 0.297 |
0.2996 | 83 | 0.2936 |
0.3032 | 84 | 0.3279 |
0.3069 | 85 | 0.2973 |
0.3105 | 86 | 0.2881 |
0.3141 | 87 | 0.3014 |
0.3177 | 88 | 0.2986 |
0.3213 | 89 | 0.3057 |
0.3249 | 90 | 0.2887 |
0.3285 | 91 | 0.2765 |
0.3321 | 92 | 0.2818 |
0.3357 | 93 | 0.2904 |
0.3394 | 94 | 0.267 |
0.3430 | 95 | 0.2948 |
0.3466 | 96 | 0.2766 |
0.3502 | 97 | 0.2782 |
0.3538 | 98 | 0.3082 |
0.3574 | 99 | 0.2697 |
0.3610 | 100 | 0.3006 |
0.3646 | 101 | 0.2986 |
0.3682 | 102 | 0.2789 |
0.3718 | 103 | 0.2756 |
0.3755 | 104 | 0.2884 |
0.3791 | 105 | 0.273 |
0.3827 | 106 | 0.2687 |
0.3863 | 107 | 0.2808 |
0.3899 | 108 | 0.2763 |
0.3935 | 109 | 0.2738 |
0.3971 | 110 | 0.2642 |
0.4007 | 111 | 0.2612 |
0.4043 | 112 | 0.2859 |
0.4079 | 113 | 0.2558 |
0.4116 | 114 | 0.2565 |
0.4152 | 115 | 0.2747 |
0.4188 | 116 | 0.2684 |
0.4224 | 117 | 0.2643 |
0.4260 | 118 | 0.241 |
0.4296 | 119 | 0.2563 |
0.4332 | 120 | 0.2754 |
0.4368 | 121 | 0.2503 |
0.4404 | 122 | 0.2544 |
0.4440 | 123 | 0.2729 |
0.4477 | 124 | 0.2589 |
0.4513 | 125 | 0.2626 |
0.4549 | 126 | 0.2693 |
0.4585 | 127 | 0.2687 |
0.4621 | 128 | 0.2903 |
0.4657 | 129 | 0.2663 |
0.4693 | 130 | 0.2604 |
0.4729 | 131 | 0.2601 |
0.4765 | 132 | 0.2649 |
0.4801 | 133 | 0.2597 |
0.4838 | 134 | 0.2608 |
0.4874 | 135 | 0.245 |
0.4910 | 136 | 0.2587 |
0.4946 | 137 | 0.2618 |
0.4982 | 138 | 0.2599 |
0.5018 | 139 | 0.265 |
0.5054 | 140 | 0.2427 |
0.5090 | 141 | 0.2448 |
0.5126 | 142 | 0.2608 |
0.5162 | 143 | 0.2188 |
0.5199 | 144 | 0.2471 |
0.5235 | 145 | 0.2604 |
0.5271 | 146 | 0.2571 |
0.5307 | 147 | 0.2684 |
0.5343 | 148 | 0.2319 |
0.5379 | 149 | 0.2572 |
0.5415 | 150 | 0.2243 |
0.5451 | 151 | 0.2562 |
0.5487 | 152 | 0.2457 |
0.5523 | 153 | 0.255 |
0.5560 | 154 | 0.2664 |
0.5596 | 155 | 0.24 |
0.5632 | 156 | 0.2612 |
0.5668 | 157 | 0.243 |
0.5704 | 158 | 0.2345 |
0.5740 | 159 | 0.2359 |
0.5776 | 160 | 0.2384 |
0.5812 | 161 | 0.2541 |
0.5848 | 162 | 0.2496 |
0.5884 | 163 | 0.2429 |
0.5921 | 164 | 0.2411 |
0.5957 | 165 | 0.2261 |
0.5993 | 166 | 0.2164 |
0.6029 | 167 | 0.2251 |
0.6065 | 168 | 0.2417 |
0.6101 | 169 | 0.2494 |
0.6137 | 170 | 0.2359 |
0.6173 | 171 | 0.2489 |
0.6209 | 172 | 0.2261 |
0.6245 | 173 | 0.2367 |
0.6282 | 174 | 0.2355 |
0.6318 | 175 | 0.2423 |
0.6354 | 176 | 0.2454 |
0.6390 | 177 | 0.2438 |
0.6426 | 178 | 0.2415 |
0.6462 | 179 | 0.2237 |
0.6498 | 180 | 0.2419 |
0.6534 | 181 | 0.2373 |
0.6570 | 182 | 0.2659 |
0.6606 | 183 | 0.2201 |
0.6643 | 184 | 0.2342 |
0.6679 | 185 | 0.2149 |
0.6715 | 186 | 0.2241 |
0.6751 | 187 | 0.2443 |
0.6787 | 188 | 0.2489 |
0.6823 | 189 | 0.2354 |
0.6859 | 190 | 0.2483 |
0.6895 | 191 | 0.2193 |
0.6931 | 192 | 0.229 |
0.6968 | 193 | 0.2335 |
0.7004 | 194 | 0.2484 |
0.7040 | 195 | 0.2317 |
0.7076 | 196 | 0.2203 |
0.7112 | 197 | 0.2329 |
0.7148 | 198 | 0.2084 |
0.7184 | 199 | 0.2341 |
0.7220 | 200 | 0.2369 |
0.7256 | 201 | 0.2364 |
0.7292 | 202 | 0.2276 |
0.7329 | 203 | 0.215 |
0.7365 | 204 | 0.2486 |
0.7401 | 205 | 0.2237 |
0.7437 | 206 | 0.218 |
0.7473 | 207 | 0.2444 |
0.7509 | 208 | 0.2276 |
0.7545 | 209 | 0.2127 |
0.7581 | 210 | 0.2283 |
0.7617 | 211 | 0.2234 |
0.7653 | 212 | 0.207 |
0.7690 | 213 | 0.24 |
0.7726 | 214 | 0.2317 |
0.7762 | 215 | 0.2056 |
0.7798 | 216 | 0.2149 |
0.7834 | 217 | 0.2211 |
0.7870 | 218 | 0.2232 |
0.7906 | 219 | 0.2222 |
0.7942 | 220 | 0.2481 |
0.7978 | 221 | 0.227 |
0.8014 | 222 | 0.2305 |
0.8051 | 223 | 0.2091 |
0.8087 | 224 | 0.2278 |
0.8123 | 225 | 0.2123 |
0.8159 | 226 | 0.2233 |
0.8195 | 227 | 0.2365 |
0.8231 | 228 | 0.2165 |
0.8267 | 229 | 0.2192 |
0.8303 | 230 | 0.2145 |
0.8339 | 231 | 0.2382 |
0.8375 | 232 | 0.2232 |
0.8412 | 233 | 0.2273 |
0.8448 | 234 | 0.2296 |
0.8484 | 235 | 0.2229 |
0.8520 | 236 | 0.2213 |
0.8556 | 237 | 0.2343 |
0.8592 | 238 | 0.2208 |
0.8628 | 239 | 0.2315 |
0.8664 | 240 | 0.2137 |
0.8700 | 241 | 0.2201 |
0.8736 | 242 | 0.2185 |
0.8773 | 243 | 0.2337 |
0.8809 | 244 | 0.2153 |
0.8845 | 245 | 0.2369 |
0.8881 | 246 | 0.2216 |
0.8917 | 247 | 0.2338 |
0.8953 | 248 | 0.2241 |
0.8989 | 249 | 0.213 |
0.9025 | 250 | 0.2245 |
0.9061 | 251 | 0.2074 |
0.9097 | 252 | 0.2283 |
0.9134 | 253 | 0.2003 |
0.9170 | 254 | 0.2099 |
0.9206 | 255 | 0.2288 |
0.9242 | 256 | 0.2168 |
0.9278 | 257 | 0.215 |
0.9314 | 258 | 0.2146 |
0.9350 | 259 | 0.2126 |
0.9386 | 260 | 0.2178 |
0.9422 | 261 | 0.2065 |
0.9458 | 262 | 0.2327 |
0.9495 | 263 | 0.2116 |
0.9531 | 264 | 0.2324 |
0.9567 | 265 | 0.2235 |
0.9603 | 266 | 0.2189 |
0.9639 | 267 | 0.2175 |
0.9675 | 268 | 0.2171 |
0.9711 | 269 | 0.1925 |
0.9747 | 270 | 0.225 |
0.9783 | 271 | 0.2149 |
0.9819 | 272 | 0.204 |
0.9856 | 273 | 0.2004 |
0.9892 | 274 | 0.2055 |
0.9928 | 275 | 0.2045 |
0.9964 | 276 | 0.2186 |
1.0 | 277 | 0.2215 |
1.0036 | 278 | 0.1545 |
1.0072 | 279 | 0.169 |
1.0108 | 280 | 0.152 |
1.0144 | 281 | 0.1597 |
1.0181 | 282 | 0.1626 |
1.0217 | 283 | 0.1692 |
1.0253 | 284 | 0.1639 |
1.0289 | 285 | 0.1638 |
1.0325 | 286 | 0.1507 |
1.0361 | 287 | 0.1594 |
1.0397 | 288 | 0.1621 |
1.0433 | 289 | 0.1565 |
1.0469 | 290 | 0.1549 |
1.0505 | 291 | 0.1731 |
1.0542 | 292 | 0.152 |
1.0578 | 293 | 0.1586 |
1.0614 | 294 | 0.1593 |
1.0650 | 295 | 0.1406 |
1.0686 | 296 | 0.1524 |
1.0722 | 297 | 0.1474 |
1.0758 | 298 | 0.158 |
1.0794 | 299 | 0.1743 |
1.0830 | 300 | 0.1485 |
1.0866 | 301 | 0.1648 |
1.0903 | 302 | 0.1337 |
1.0939 | 303 | 0.1554 |
1.0975 | 304 | 0.1434 |
1.1011 | 305 | 0.1642 |
1.1047 | 306 | 0.159 |
1.1083 | 307 | 0.1658 |
1.1119 | 308 | 0.1554 |
1.1155 | 309 | 0.1425 |
1.1191 | 310 | 0.1432 |
1.1227 | 311 | 0.1517 |
1.1264 | 312 | 0.148 |
1.1300 | 313 | 0.1636 |
1.1336 | 314 | 0.1735 |
1.1372 | 315 | 0.151 |
1.1408 | 316 | 0.1423 |
1.1444 | 317 | 0.1501 |
1.1480 | 318 | 0.1537 |
1.1516 | 319 | 0.1554 |
1.1552 | 320 | 0.1553 |
1.1588 | 321 | 0.149 |
1.1625 | 322 | 0.1605 |
1.1661 | 323 | 0.1551 |
1.1697 | 324 | 0.1555 |
1.1733 | 325 | 0.1443 |
1.1769 | 326 | 0.1533 |
1.1805 | 327 | 0.1658 |
1.1841 | 328 | 0.15 |
1.1877 | 329 | 0.1626 |
1.1913 | 330 | 0.172 |
1.1949 | 331 | 0.1542 |
1.1986 | 332 | 0.166 |
1.2022 | 333 | 0.1513 |
1.2058 | 334 | 0.1612 |
1.2094 | 335 | 0.1521 |
1.2130 | 336 | 0.1552 |
1.2166 | 337 | 0.1503 |
1.2202 | 338 | 0.1613 |
1.2238 | 339 | 0.1563 |
1.2274 | 340 | 0.1429 |
1.2310 | 341 | 0.1587 |
1.2347 | 342 | 0.1477 |
1.2383 | 343 | 0.1561 |
1.2419 | 344 | 0.1418 |
1.2455 | 345 | 0.1495 |
1.2491 | 346 | 0.1533 |
1.2527 | 347 | 0.1521 |
1.2563 | 348 | 0.1422 |
1.2599 | 349 | 0.1446 |
1.2635 | 350 | 0.146 |
1.2671 | 351 | 0.1473 |
1.2708 | 352 | 0.1566 |
1.2744 | 353 | 0.1411 |
1.2780 | 354 | 0.1502 |
1.2816 | 355 | 0.1383 |
1.2852 | 356 | 0.1622 |
1.2888 | 357 | 0.1391 |
1.2924 | 358 | 0.1455 |
1.2960 | 359 | 0.1541 |
1.2996 | 360 | 0.1476 |
1.3032 | 361 | 0.1662 |
1.3069 | 362 | 0.1476 |
1.3105 | 363 | 0.1452 |
1.3141 | 364 | 0.1372 |
1.3177 | 365 | 0.1542 |
1.3213 | 366 | 0.1531 |
1.3249 | 367 | 0.1623 |
1.3285 | 368 | 0.1544 |
1.3321 | 369 | 0.1625 |
1.3357 | 370 | 0.1459 |
1.3394 | 371 | 0.1474 |
1.3430 | 372 | 0.1499 |
1.3466 | 373 | 0.1495 |
1.3502 | 374 | 0.1361 |
1.3538 | 375 | 0.1444 |
1.3574 | 376 | 0.1495 |
1.3610 | 377 | 0.1583 |
1.3646 | 378 | 0.1642 |
1.3682 | 379 | 0.1646 |
1.3718 | 380 | 0.1595 |
1.3755 | 381 | 0.149 |
1.3791 | 382 | 0.1448 |
1.3827 | 383 | 0.1603 |
1.3863 | 384 | 0.1269 |
1.3899 | 385 | 0.1491 |
1.3935 | 386 | 0.1367 |
1.3971 | 387 | 0.1501 |
1.4007 | 388 | 0.1414 |
1.4043 | 389 | 0.156 |
1.4079 | 390 | 0.1428 |
1.4116 | 391 | 0.1559 |
1.4152 | 392 | 0.1452 |
1.4188 | 393 | 0.1547 |
1.4224 | 394 | 0.1432 |
1.4260 | 395 | 0.1648 |
1.4296 | 396 | 0.166 |
1.4332 | 397 | 0.1485 |
1.4368 | 398 | 0.1494 |
1.4404 | 399 | 0.1635 |
1.4440 | 400 | 0.1498 |
1.4477 | 401 | 0.1509 |
1.4513 | 402 | 0.1431 |
1.4549 | 403 | 0.1547 |
1.4585 | 404 | 0.1576 |
1.4621 | 405 | 0.1426 |
1.4657 | 406 | 0.132 |
1.4693 | 407 | 0.1511 |
1.4729 | 408 | 0.1551 |
1.4765 | 409 | 0.16 |
1.4801 | 410 | 0.1507 |
1.4838 | 411 | 0.1591 |
1.4874 | 412 | 0.1536 |
1.4910 | 413 | 0.1507 |
1.4946 | 414 | 0.1564 |
1.4982 | 415 | 0.153 |
1.5018 | 416 | 0.1404 |
1.5054 | 417 | 0.1627 |
1.5090 | 418 | 0.1432 |
1.5126 | 419 | 0.1456 |
1.5162 | 420 | 0.1369 |
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.4.1
- Transformers: 4.49.0
- PyTorch: 2.5.1+cu124
- Accelerate: 1.4.0
- Datasets: 3.3.2
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}