metadata
base_model: sentence-transformers/all-MiniLM-L6-v2
language:
  - en
library_name: sentence-transformers
license: apache-2.0
metrics:
  - pearson_cosine
  - spearman_cosine
  - pearson_manhattan
  - spearman_manhattan
  - pearson_euclidean
  - spearman_euclidean
  - pearson_dot
  - spearman_dot
  - pearson_max
  - spearman_max
pipeline_tag: sentence-similarity
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:510287
  - loss:CoSENTLoss
widget:
  - source_sentence: bag
    sentences:
      - bag
      - summer colors bag
      - carry all bag
  - source_sentence: bean bag
    sentences:
      - bag
      - havan bag
      - black yellow shoes
  - source_sentence: pyramid shaped cushion mattress
    sentences:
      - dress
      - silver bag
      - women shoes
  - source_sentence: handcrafted rug
    sentences:
      - amaga  cross bag - white
      - handcrafted boots
      - polyester top
  - source_sentence: bean bag
    sentences:
      - bag
      - v-neck dress
      - bag
model-index:
  - name: all-MiniLM-L6-v2-pair_score
    results:
      - task:
          type: semantic-similarity
          name: Semantic Similarity
        dataset:
          name: sts dev
          type: sts-dev
        metrics:
          - type: pearson_cosine
            value: -0.13726370961372045
            name: Pearson Cosine
          - type: spearman_cosine
            value: -0.16645918619928507
            name: Spearman Cosine
          - type: pearson_manhattan
            value: -0.1405300294713842
            name: Pearson Manhattan
          - type: spearman_manhattan
            value: -0.16334559546016153
            name: Spearman Manhattan
          - type: pearson_euclidean
            value: -0.1432496898556385
            name: Pearson Euclidean
          - type: spearman_euclidean
            value: -0.16645904911745338
            name: Spearman Euclidean
          - type: pearson_dot
            value: -0.13726370008450378
            name: Pearson Dot
          - type: spearman_dot
            value: -0.1664594964294906
            name: Spearman Dot
          - type: pearson_max
            value: -0.13726370008450378
            name: Pearson Max
          - type: spearman_max
            value: -0.16334559546016153
            name: Spearman Max

all-MiniLM-L6-v2-pair_score

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • License: apache-2.0

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
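
The loaded model exposes the numbers in this module list directly. A quick sanity-check sketch, assuming the Hub id youssefkhalil320/all-MiniLM-L6-v2-pair_score inferred from this card:

from sentence_transformers import SentenceTransformer

# Hub id inferred from this card; adjust if the repository lives elsewhere.
model = SentenceTransformer("youssefkhalil320/all-MiniLM-L6-v2-pair_score")

print(model.max_seq_length)                      # 256
print(model.get_sentence_embedding_dimension())  # 384
print(model)                                     # prints the module list shown above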

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("youssefkhalil320/all-MiniLM-L6-v2-pair_score")
# Run inference
sentences = [
    'bean bag',
    'bag',
    'v-neck dress',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
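
Because the embeddings are normalized, they also work directly for retrieval. Continuing from the snippet above (model already loaded), a small semantic-search sketch over product names taken from the widget examples; util.semantic_search ranks the corpus by cosine similarity:

from sentence_transformers import util

corpus = ["bag", "v-neck dress", "handcrafted boots", "silver bag"]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode("bean bag", convert_to_tensor=True)

# Rank all corpus entries against the query and keep the two best matches
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(corpus[hit["corpus_id"]], round(hit["score"], 4))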

Evaluation

Metrics

Semantic Similarity

Metric              Value
pearson_cosine      -0.1373
spearman_cosine     -0.1665
pearson_manhattan   -0.1405
spearman_manhattan  -0.1633
pearson_euclidean   -0.1432
spearman_euclidean  -0.1665
pearson_dot         -0.1373
spearman_dot        -0.1665
pearson_max         -0.1373
spearman_max        -0.1633
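
Scores like these are typically produced with sentence-transformers' EmbeddingSimilarityEvaluator, which correlates model similarities against gold scores. A minimal sketch; the sentence pairs and gold scores below are placeholders, not the actual sts-dev data:

from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["bean bag", "handcrafted rug"],  # placeholder pairs
    sentences2=["bag", "handcrafted boots"],
    scores=[0.9, 0.2],                           # placeholder gold similarities
    name="sts-dev",
)
results = evaluator(model)  # dict of Pearson/Spearman correlations, keyed by metric name
print(results)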

Training Details

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • learning_rate: 2e-05
  • num_train_epochs: 4
  • warmup_ratio: 0.1
  • fp16: True
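
Put together, a run with these non-default values might look like the sketch below, pairing SentenceTransformerTrainer with the CoSENTLoss named in the tags. The two-row inline dataset and output directory are illustrative; the real run used 510,287 scored pairs:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CoSENTLoss

# Illustrative (sentence1, sentence2, score) rows standing in for the real training data
train_dataset = Dataset.from_dict({
    "sentence1": ["bean bag", "handcrafted rug"],
    "sentence2": ["bag", "polyester top"],
    "score": [0.9, 0.1],
})
eval_dataset = train_dataset  # placeholder; a real run holds out separate pairs

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
loss = CoSENTLoss(model)

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v2-pair_score",  # hypothetical output directory
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    learning_rate=2e-5,
    num_train_epochs=4,
    warmup_ratio=0.1,
    fp16=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()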

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 4
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss sts-dev_spearman_cosine
0 0 - - -0.1665
0.0063 100 11.9622 - -
0.0125 200 11.265 - -
0.0188 300 10.5195 - -
0.0251 400 9.4744 - -
0.0314 500 8.4815 8.6217 -
0.0376 600 7.6105 - -
0.0439 700 6.8023 - -
0.0502 800 6.1258 - -
0.0564 900 5.5032 - -
0.0627 1000 5.0397 5.1949 -
0.0690 1100 4.6909 - -
0.0752 1200 4.5716 - -
0.0815 1300 4.3983 - -
0.0878 1400 4.2073 - -
0.0941 1500 4.2164 4.1422 -
0.1003 1600 4.0921 - -
0.1066 1700 4.1785 - -
0.1129 1800 4.0503 - -
0.1191 1900 3.8969 - -
0.1254 2000 3.8538 3.9109 -
0.1317 2100 3.872 - -
0.1380 2200 3.851 - -
0.1442 2300 3.6301 - -
0.1505 2400 3.5202 - -
0.1568 2500 3.6759 3.6389 -
0.1630 2600 3.4106 - -
0.1693 2700 3.69 - -
0.1756 2800 3.6336 - -
0.1819 2900 3.4715 - -
0.1881 3000 3.2166 3.2739 -
0.1944 3100 3.3844 - -
0.2007 3200 3.4449 - -
0.2069 3300 3.0811 - -
0.2132 3400 3.2777 - -
0.2195 3500 2.9505 3.0865 -
0.2257 3600 3.1534 - -
0.2320 3700 2.9669 - -
0.2383 3800 2.9416 - -
0.2446 3900 2.9637 - -
0.2508 4000 2.9322 2.8447 -
0.2571 4100 2.6926 - -
0.2634 4200 2.9353 - -
0.2696 4300 2.635 - -
0.2759 4400 2.5692 - -
0.2822 4500 3.0283 2.9033 -
0.2885 4600 2.5804 - -
0.2947 4700 3.1374 - -
0.3010 4800 2.8479 - -
0.3073 4900 2.6809 - -
0.3135 5000 2.8267 2.6946 -
0.3198 5100 2.7341 - -
0.3261 5200 2.8157 - -
0.3324 5300 2.5867 - -
0.3386 5400 2.8622 - -
0.3449 5500 2.9063 2.6115 -
0.3512 5600 2.1514 - -
0.3574 5700 2.3755 - -
0.3637 5800 2.5055 - -
0.3700 5900 3.3237 - -
0.3762 6000 2.561 2.7512 -
0.3825 6100 2.4351 - -
0.3888 6200 2.8472 - -
0.3951 6300 2.76 - -
0.4013 6400 2.1947 - -
0.4076 6500 2.6409 2.5367 -
0.4139 6600 2.7262 - -
0.4201 6700 2.7781 - -
0.4264 6800 2.4718 - -
0.4327 6900 2.567 - -
0.4390 7000 2.4215 2.3409 -
0.4452 7100 1.9308 - -
0.4515 7200 2.1232 - -
0.4578 7300 2.421 - -
0.4640 7400 2.3232 - -
0.4703 7500 2.8543 2.3706 -
0.4766 7600 2.4276 - -
0.4828 7700 2.4507 - -
0.4891 7800 2.1963 - -
0.4954 7900 2.4247 - -
0.5017 8000 2.1948 2.5729 -
0.5079 8100 2.4069 - -
0.5142 8200 2.4328 - -
0.5205 8300 2.2198 - -
0.5267 8400 2.1746 - -
0.5330 8500 2.2618 2.3459 -
0.5393 8600 2.3909 - -
0.5456 8700 2.035 - -
0.5518 8800 2.2626 - -
0.5581 8900 2.1541 - -
0.5644 9000 1.9424 2.1625 -
0.5706 9100 2.5152 - -
0.5769 9200 2.0462 - -
0.5832 9300 1.6124 - -
0.5895 9400 2.2236 - -
0.5957 9500 2.4706 2.0569 -
0.6020 9600 2.4612 - -
0.6083 9700 2.2784 - -
0.6145 9800 1.9335 - -
0.6208 9900 2.3779 - -
0.6271 10000 1.6778 2.1123 -
0.6333 10100 2.4721 - -
0.6396 10200 1.7822 - -
0.6459 10300 2.077 - -
0.6522 10400 1.9223 - -
0.6584 10500 2.3513 1.8403 -
0.6647 10600 2.1387 - -
0.6710 10700 2.1853 - -
0.6772 10800 1.8715 - -
0.6835 10900 1.8581 - -
0.6898 11000 2.0076 2.0063 -
0.6961 11100 2.3144 - -
0.7023 11200 2.0942 - -
0.7086 11300 1.9117 - -
0.7149 11400 2.2214 - -
0.7211 11500 1.9678 1.9029 -
0.7274 11600 1.7459 - -
0.7337 11700 2.0616 - -
0.7400 11800 1.6169 - -
0.7462 11900 1.5674 - -
0.7525 12000 1.4956 1.8267 -
0.7588 12100 2.3816 - -
0.7650 12200 2.2387 - -
0.7713 12300 1.4625 - -
0.7776 12400 2.028 - -
0.7838 12500 2.151 1.7581 -
0.7901 12600 1.6896 - -
0.7964 12700 1.8526 - -
0.8027 12800 1.9745 - -
0.8089 12900 2.1042 - -
0.8152 13000 1.83 1.5667 -
0.8215 13100 1.7451 - -
0.8277 13200 1.568 - -
0.8340 13300 1.4432 - -
0.8403 13400 1.9172 - -
0.8466 13500 1.9438 1.6055 -
0.8528 13600 1.6488 - -
0.8591 13700 1.8166 - -
0.8654 13800 1.5929 - -
0.8716 13900 1.2476 - -
0.8779 14000 1.5236 1.8921 -
0.8842 14100 1.6538 - -
0.8904 14200 1.8689 - -
0.8967 14300 1.0831 - -
0.9030 14400 1.7765 - -
0.9093 14500 1.3548 1.6683 -
0.9155 14600 1.7792 - -
0.9218 14700 1.73 - -
0.9281 14800 1.5979 - -
0.9343 14900 1.3678 - -
0.9406 15000 2.0664 1.5161 -
0.9469 15100 1.4472 - -
0.9532 15200 1.447 - -
0.9594 15300 1.7261 - -
0.9657 15400 1.4881 - -
0.9720 15500 1.313 1.6227 -
0.9782 15600 1.4587 - -
0.9845 15700 2.0982 - -
0.9908 15800 1.4854 - -
0.9971 15900 1.343 - -
1.0033 16000 1.1795 1.5639 -
1.0096 16100 1.4001 - -
1.0159 16200 1.3867 - -
1.0221 16300 1.5191 - -
1.0284 16400 1.4693 - -
1.0347 16500 1.628 1.4716 -
1.0409 16600 1.0041 - -
1.0472 16700 1.7728 - -
1.0535 16800 1.5586 - -
1.0598 16900 1.7229 - -
1.0660 17000 1.5556 1.4676 -
1.0723 17100 1.2529 - -
1.0786 17200 1.4787 - -
1.0848 17300 1.1947 - -
1.0911 17400 1.3014 - -
1.0974 17500 1.3743 1.4624 -
1.1037 17600 1.3397 - -
1.1099 17700 1.3062 - -
1.1162 17800 1.3288 - -
1.1225 17900 2.0002 - -
1.1287 18000 2.0294 1.4185 -
1.1350 18100 1.5053 - -
1.1413 18200 1.3657 - -
1.1476 18300 1.3877 - -
1.1538 18400 1.9034 - -
1.1601 18500 1.4001 1.3813 -
1.1664 18600 1.7503 - -
1.1726 18700 1.1482 - -
1.1789 18800 1.0958 - -
1.1852 18900 1.2657 - -
1.1914 19000 1.3721 1.4702 -
1.1977 19100 1.2361 - -
1.2040 19200 1.003 - -
1.2103 19300 1.3677 - -
1.2165 19400 1.668 - -
1.2228 19500 1.2026 1.3641 -
1.2291 19600 1.1754 - -
1.2353 19700 1.3196 - -
1.2416 19800 1.4766 - -
1.2479 19900 1.389 - -
1.2542 20000 1.6974 1.3344 -
1.2604 20100 1.5036 - -
1.2667 20200 1.1728 - -
1.2730 20300 1.6058 - -
1.2792 20400 1.5191 - -
1.2855 20500 1.4516 1.3210 -
1.2918 20600 1.3485 - -
1.2980 20700 1.2598 - -
1.3043 20800 1.5871 - -
1.3106 20900 1.1965 - -
1.3169 21000 1.3983 1.2517 -
1.3231 21100 1.2605 - -
1.3294 21200 1.5629 - -
1.3357 21300 1.0668 - -
1.3419 21400 1.1879 - -
1.3482 21500 1.132 1.3881 -
1.3545 21600 1.7231 - -
1.3608 21700 1.7636 - -
1.3670 21800 1.1193 - -
1.3733 21900 1.4662 - -
1.3796 22000 2.0394 1.1927 -
1.3858 22100 1.1535 - -
1.3921 22200 1.4592 - -
1.3984 22300 1.276 - -
1.4047 22400 1.2984 - -
1.4109 22500 0.9741 1.2707 -
1.4172 22600 1.4253 - -
1.4235 22700 1.0769 - -
1.4297 22800 0.8276 - -
1.4360 22900 1.2689 - -
1.4423 23000 1.4817 1.2095 -
1.4485 23100 1.1522 - -
1.4548 23200 0.8978 - -
1.4611 23300 1.015 - -
1.4674 23400 1.0351 - -
1.4736 23500 1.3959 1.1969 -
1.4799 23600 1.2879 - -
1.4862 23700 1.0651 - -
1.4924 23800 1.1601 - -
1.4987 23900 1.0034 - -
1.5050 24000 1.3386 1.1590 -
1.5113 24100 1.142 - -
1.5175 24200 1.3495 - -
1.5238 24300 0.9993 - -
1.5301 24400 0.9363 - -
1.5363 24500 1.4402 1.2178 -
1.5426 24600 1.0648 - -
1.5489 24700 1.5102 - -
1.5552 24800 1.3415 - -
1.5614 24900 0.7441 - -
1.5677 25000 0.901 1.1982 -
1.5740 25100 1.3147 - -
1.5802 25200 0.971 - -
1.5865 25300 0.9988 - -
1.5928 25400 1.1445 - -
1.5990 25500 1.1018 1.1423 -
1.6053 25600 1.0902 - -
1.6116 25700 1.2577 - -
1.6179 25800 1.2005 - -
1.6241 25900 1.2839 - -
1.6304 26000 1.4122 1.1125 -
1.6367 26100 0.7832 - -
1.6429 26200 1.3278 - -
1.6492 26300 1.2055 - -
1.6555 26400 1.5814 - -
1.6618 26500 1.0393 1.0946 -
1.6680 26600 1.4531 - -
1.6743 26700 1.4162 - -
1.6806 26800 0.8498 - -
1.6868 26900 1.1318 - -
1.6931 27000 1.3287 1.0439 -
1.6994 27100 1.0886 - -
1.7056 27200 0.8991 - -
1.7119 27300 0.7563 - -
1.7182 27400 0.9284 - -
1.7245 27500 1.3388 1.0940 -
1.7307 27600 1.2951 - -
1.7370 27700 0.9789 - -
1.7433 27800 1.2898 - -
1.7495 27900 0.9915 - -
1.7558 28000 1.5349 1.0266 -
1.7621 28100 1.124 - -
1.7684 28200 0.809 - -
1.7746 28300 0.9617 - -
1.7809 28400 1.3061 - -
1.7872 28500 1.1323 1.0488 -
1.7934 28600 1.2991 - -
1.7997 28700 0.8708 - -
1.8060 28800 0.7493 - -
1.8123 28900 1.004 - -
1.8185 29000 1.1477 1.0206 -
1.8248 29100 1.1826 - -
1.8311 29200 1.0961 - -
1.8373 29300 1.4743 - -
1.8436 29400 0.8413 - -
1.8499 29500 1.2623 1.0047 -
1.8561 29600 0.8486 - -
1.8624 29700 1.4481 - -
1.8687 29800 1.2704 - -
1.8750 29900 1.1913 - -
1.8812 30000 0.9369 1.0277 -
1.8875 30100 1.2427 - -
1.8938 30200 1.0576 - -
1.9000 30300 0.9188 - -
1.9063 30400 1.3227 - -
1.9126 30500 1.4614 1.0550 -
1.9189 30600 1.2316 - -
1.9251 30700 0.9487 - -
1.9314 30800 1.1651 - -
1.9377 30900 1.1622 - -
1.9439 31000 1.1801 0.9981 -
1.9502 31100 0.8798 - -
1.9565 31200 0.7196 - -
1.9628 31300 1.2003 - -
1.9690 31400 1.1823 - -
1.9753 31500 1.1453 1.0320 -
1.9816 31600 1.4751 - -
1.9878 31700 0.8502 - -
1.9941 31800 0.8757 - -
2.0004 31900 1.0489 - -
2.0066 32000 1.4672 1.0571 -
2.0129 32100 0.9474 - -
2.0192 32200 0.8037 - -
2.0255 32300 0.9782 - -
2.0317 32400 0.6943 - -
2.0380 32500 1.0097 0.9797 -
2.0443 32600 0.9067 - -
2.0505 32700 1.09 - -
2.0568 32800 0.8464 - -
2.0631 32900 0.9359 - -
2.0694 33000 0.813 0.9907 -
2.0756 33100 0.8738 - -
2.0819 33200 0.8178 - -
2.0882 33300 1.1704 - -
2.0944 33400 1.0073 - -
2.1007 33500 1.1849 0.9582 -
2.1070 33600 0.7795 - -
2.1133 33700 0.7688 - -
2.1195 33800 0.9465 - -
2.1258 33900 1.0883 - -
2.1321 34000 0.7711 0.9557 -
2.1383 34100 0.9767 - -
2.1446 34200 0.6702 - -
2.1509 34300 0.9444 - -
2.1571 34400 0.8741 - -
2.1634 34500 1.0717 0.9526 -
2.1697 34600 0.8584 - -
2.1760 34700 0.8926 - -
2.1822 34800 0.8567 - -
2.1885 34900 0.71 - -
2.1948 35000 1.1285 0.9589 -
2.2010 35100 0.8999 - -
2.2073 35200 0.8459 - -
2.2136 35300 1.0608 - -
2.2199 35400 0.6115 - -
2.2261 35500 1.2468 0.9769 -
2.2324 35600 0.9987 - -
2.2387 35700 0.9186 - -
2.2449 35800 1.0505 - -
2.2512 35900 0.6253 - -
2.2575 36000 0.6523 0.9501 -
2.2637 36100 0.8252 - -
2.2700 36200 0.9793 - -
2.2763 36300 0.8845 - -
2.2826 36400 1.0121 - -
2.2888 36500 0.9849 0.9245 -
2.2951 36600 1.2937 - -
2.3014 36700 1.0484 - -
2.3076 36800 0.8801 - -
2.3139 36900 0.7552 - -
2.3202 37000 0.7641 0.9280 -
2.3265 37100 0.883 - -
2.3327 37200 0.77 - -
2.3390 37300 1.2699 - -
2.3453 37400 0.8766 - -
2.3515 37500 1.1154 0.9623 -
2.3578 37600 1.0634 - -
2.3641 37700 0.8822 - -
2.3704 37800 0.839 - -
2.3766 37900 0.684 - -
2.3829 38000 0.8051 0.9198 -
2.3892 38100 0.9585 - -
2.3954 38200 0.7156 - -
2.4017 38300 0.5271 - -
2.4080 38400 0.805 - -
2.4142 38500 0.7898 0.8785 -
2.4205 38600 0.6935 - -
2.4268 38700 0.8011 - -
2.4331 38800 0.9812 - -
2.4393 38900 0.4427 - -
2.4456 39000 0.492 0.9313 -
2.4519 39100 0.47 - -
2.4581 39200 1.1876 - -
2.4644 39300 0.5778 - -
2.4707 39400 0.6763 - -
2.4770 39500 0.6896 0.8978 -
2.4832 39600 0.8905 - -
2.4895 39700 0.7845 - -
2.4958 39800 0.8691 - -
2.5020 39900 0.55 - -
2.5083 40000 0.6978 0.9054 -
2.5146 40100 0.6378 - -
2.5209 40200 0.895 - -
2.5271 40300 0.9683 - -
2.5334 40400 0.9373 - -
2.5397 40500 0.7406 0.9128 -
2.5459 40600 0.8917 - -
2.5522 40700 1.0552 - -
2.5585 40800 0.5281 - -
2.5647 40900 0.9064 - -
2.5710 41000 0.6886 0.9049 -
2.5773 41100 0.7166 - -
2.5836 41200 0.8343 - -
2.5898 41300 0.9468 - -
2.5961 41400 0.8529 - -
2.6024 41500 0.8092 0.8954 -
2.6086 41600 0.8501 - -
2.6149 41700 0.9877 - -
2.6212 41800 0.8592 - -
2.6275 41900 0.8632 - -
2.6337 42000 0.6766 0.8707 -
2.6400 42100 0.7587 - -
2.6463 42200 0.8949 - -
2.6525 42300 0.4173 - -
2.6588 42400 0.5995 - -
2.6651 42500 0.8157 0.8681 -
2.6713 42600 0.92 - -
2.6776 42700 0.9118 - -
2.6839 42800 0.7446 - -
2.6902 42900 0.6835 - -
2.6964 43000 0.6157 0.8691 -
2.7027 43100 0.5423 - -
2.7090 43200 0.8098 - -
2.7152 43300 0.8908 - -
2.7215 43400 1.1275 - -
2.7278 43500 1.0345 0.8884 -
2.7341 43600 0.6198 - -
2.7403 43700 0.8315 - -
2.7466 43800 0.9317 - -
2.7529 43900 0.516 - -
2.7591 44000 0.8229 0.8659 -
2.7654 44100 0.7989 - -
2.7717 44200 0.9291 - -
2.7780 44300 0.5954 - -
2.7842 44400 0.8537 - -
2.7905 44500 0.9506 0.8657 -
2.7968 44600 0.5789 - -
2.8030 44700 0.4861 - -
2.8093 44800 0.9614 - -
2.8156 44900 1.0069 - -
2.8218 45000 0.5599 0.8619 -
2.8281 45100 1.3747 - -
2.8344 45200 0.5638 - -
2.8407 45300 1.2095 - -
2.8469 45400 0.7364 - -
2.8532 45500 0.5692 0.8818 -
2.8595 45600 0.8848 - -
2.8657 45700 0.9063 - -
2.8720 45800 0.8675 - -
2.8783 45900 0.9703 - -
2.8846 46000 0.6657 0.8424 -
2.8908 46100 0.6564 - -
2.8971 46200 0.7945 - -
2.9034 46300 0.6341 - -
2.9096 46400 1.042 - -
2.9159 46500 1.0812 0.8510 -
2.9222 46600 0.9787 - -
2.9285 46700 0.8732 - -
2.9347 46800 1.1872 - -
2.9410 46900 0.989 - -
2.9473 47000 0.874 0.8215 -
2.9535 47100 1.0229 - -
2.9598 47200 0.9888 - -
2.9661 47300 0.4883 - -
2.9723 47400 0.7474 - -
2.9786 47500 0.7615 0.8218 -
2.9849 47600 0.6208 - -
2.9912 47700 0.8332 - -
2.9974 47800 0.6734 - -
3.0037 47900 0.5095 - -
3.0100 48000 0.7709 0.8220 -
3.0162 48100 0.5449 - -
3.0225 48200 0.772 - -
3.0288 48300 0.8582 - -
3.0351 48400 0.5742 - -
3.0413 48500 0.5584 0.8493 -
3.0476 48600 0.9766 - -
3.0539 48700 0.6473 - -
3.0601 48800 0.5861 - -
3.0664 48900 0.6377 - -
3.0727 49000 0.8393 0.8430 -
3.0789 49100 0.8385 - -
3.0852 49200 0.5523 - -
3.0915 49300 0.6217 - -
3.0978 49400 0.5515 - -
3.1040 49500 0.851 0.8000 -
3.1103 49600 0.9247 - -
3.1166 49700 0.655 - -
3.1228 49800 0.4979 - -
3.1291 49900 0.7521 - -
3.1354 50000 0.53 0.8105 -
3.1417 50100 0.5943 - -
3.1479 50200 0.4659 - -
3.1542 50300 0.4843 - -
3.1605 50400 0.7577 - -
3.1667 50500 0.3448 0.8055 -
3.1730 50600 0.8392 - -
3.1793 50700 0.75 - -
3.1856 50800 0.5195 - -
3.1918 50900 0.617 - -
3.1981 51000 0.6892 0.8293 -
3.2044 51100 0.497 - -
3.2106 51200 0.6793 - -
3.2169 51300 0.7251 - -
3.2232 51400 0.6471 - -
3.2294 51500 0.775 0.8013 -
3.2357 51600 0.7289 - -
3.2420 51700 0.6894 - -
3.2483 51800 0.5677 - -
3.2545 51900 0.317 - -
3.2608 52000 0.5376 0.7853 -
3.2671 52100 0.4582 - -
3.2733 52200 0.8505 - -
3.2796 52300 0.6236 - -
3.2859 52400 0.7388 - -
3.2922 52500 0.7061 0.7863 -
3.2984 52600 0.5411 - -
3.3047 52700 0.9511 - -
3.3110 52800 0.5364 - -
3.3172 52900 0.5795 - -
3.3235 53000 0.5305 0.7876 -
3.3298 53100 0.8051 - -
3.3361 53200 0.5342 - -
3.3423 53300 0.4567 - -
3.3486 53400 0.9751 - -
3.3549 53500 0.4413 0.8008 -
3.3611 53600 0.6011 - -
3.3674 53700 0.4708 - -
3.3737 53800 0.6167 - -
3.3799 53900 0.7653 - -
3.3862 54000 0.7781 0.7897 -
3.3925 54100 0.9323 - -
3.3988 54200 0.6003 - -
3.4050 54300 0.5268 - -
3.4113 54400 0.6639 - -
3.4176 54500 0.388 0.7855 -
3.4238 54600 0.7258 - -
3.4301 54700 0.6475 - -
3.4364 54800 0.795 - -
3.4427 54900 0.4978 - -
3.4489 55000 0.6259 0.7705 -
3.4552 55100 0.791 - -
3.4615 55200 0.7602 - -
3.4677 55300 0.2236 - -
3.4740 55400 0.5577 - -
3.4803 55500 0.4214 0.7683 -
3.4865 55600 0.7335 - -
3.4928 55700 0.7536 - -
3.4991 55800 0.4577 - -
3.5054 55900 0.5869 - -
3.5116 56000 0.8563 0.7587 -
3.5179 56100 0.9291 - -
3.5242 56200 0.4387 - -
3.5304 56300 0.4491 - -
3.5367 56400 0.506 - -
3.5430 56500 0.6626 0.7634 -
3.5493 56600 0.8654 - -
3.5555 56700 0.4455 - -
3.5618 56800 0.4593 - -
3.5681 56900 0.878 - -
3.5743 57000 0.3737 0.7617 -
3.5806 57100 0.377 - -
3.5869 57200 0.6894 - -
3.5932 57300 0.6635 - -
3.5994 57400 0.9224 - -
3.6057 57500 0.635 0.7669 -
3.6120 57600 0.6797 - -
3.6182 57700 0.9814 - -
3.6245 57800 0.9893 - -
3.6308 57900 0.6753 - -
3.6370 58000 0.8349 0.7501 -
3.6433 58100 0.8523 - -
3.6496 58200 0.2962 - -
3.6559 58300 0.6585 - -
3.6621 58400 1.0247 - -
3.6684 58500 0.8638 0.7577 -
3.6747 58600 0.9456 - -
3.6809 58700 0.5401 - -
3.6872 58800 0.6602 - -
3.6935 58900 0.7543 - -
3.6998 59000 0.7893 0.7600 -
3.7060 59100 0.7746 - -
3.7123 59200 0.6539 - -
3.7186 59300 0.8083 - -
3.7248 59400 0.3429 - -
3.7311 59500 0.5005 0.7445 -
3.7374 59600 0.6238 - -
3.7437 59700 0.4343 - -
3.7499 59800 0.8189 - -
3.7562 59900 0.6272 - -
3.7625 60000 0.2982 0.7597 -
3.7687 60100 0.7028 - -
3.7750 60200 0.9447 - -
3.7813 60300 0.6175 - -
3.7875 60400 0.5856 - -
3.7938 60500 0.8249 0.7505 -
3.8001 60600 0.6617 - -
3.8064 60700 0.5767 - -
3.8126 60800 1.0094 - -
3.8189 60900 0.471 - -
3.8252 61000 0.6313 0.7489 -
3.8314 61100 0.6545 - -
3.8377 61200 0.699 - -
3.8440 61300 0.6272 - -
3.8503 61400 0.7375 - -
3.8565 61500 0.4213 0.7490 -
3.8628 61600 0.6631 - -
3.8691 61700 0.552 - -
3.8753 61800 0.7041 - -
3.8816 61900 0.8457 - -
3.8879 62000 0.8104 0.7477 -
3.8941 62100 0.4494 - -
3.9004 62200 0.6947 - -
3.9067 62300 0.8061 - -
3.9130 62400 0.416 - -
3.9192 62500 0.7359 0.7468 -
3.9255 62600 0.7408 - -
3.9318 62700 0.6255 - -
3.9380 62800 0.7865 - -
3.9443 62900 0.4879 - -
3.9506 63000 0.5196 0.7485 -
3.9569 63100 0.5683 - -
3.9631 63200 0.5141 - -
3.9694 63300 0.6068 - -
3.9757 63400 0.5929 - -
3.9819 63500 0.7513 0.7482 -
3.9882 63600 0.5053 - -
3.9945 63700 0.5707 - -

Framework Versions

  • Python: 3.8.10
  • Sentence Transformers: 3.1.1
  • Transformers: 4.45.1
  • PyTorch: 2.4.0+cu121
  • Accelerate: 0.34.2
  • Datasets: 3.0.1
  • Tokenizers: 0.20.0
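
To reproduce this environment, pinned installs matching the versions above should suffice (the exact PyTorch CUDA wheel depends on your platform):

pip install sentence-transformers==3.1.1 transformers==4.45.1 accelerate==0.34.2 datasets==3.0.1 tokenizers==0.20.0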

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}