---
language:
  - en
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:3012496
  - loss:CachedMultipleNegativesRankingLoss
base_model: answerdotai/ModernBERT-base
widget:
  - source_sentence: how much is a car title transfer in minnesota?
    sentences:
      - >-
        This complex is a larger molecule than the original crystal violet stain
        and iodine and is insoluble in water. ... Conversely, the the outer
        membrane of Gram negative bacteria is degraded and the thinner
        peptidoglycan layer of Gram negative cells is unable to retain the
        crystal violet-iodine complex and the color is lost.
      - >-
        Get insurance on the car and provide proof. Bring this information
        (including the title) to the Minnesota DVS office, as well as $10 for
        the filing fee and $7.25 for the titling fee. There is also a $10
        transfer tax, as well as a 6.5% sales tax on the purchase price.
      - >-
        One of the risks of DNP is that it accelerates the metabolism to a
        dangerously fast level. Our metabolic system operates at the rate it
        does for a reason – it is safe. Speeding up the metabolism may help burn
        off fat, but it can also trigger a number of potentially dangerous side
        effects, such as: fever.
  - source_sentence: what is the difference between 18 and 20 inch tires?
    sentences:
      - >-
        The only real difference is a 20" rim would be more likely to be
        damaged, as you pointed out. Beyond looks, there is zero benefit for the
        20" rim. Also, just the availability of tires will likely be much more
        limited for the larger rim. ... Tire selection is better for 18" wheels
        than 20" wheels.
      - >-
        ['Open your Outlook app on your mobile device and click on the Settings
        gear icon.', 'Under Settings, click on the Signature option.', 'Enter
        either a generic signature that could be used for all email accounts
        tied to your Outlook app, or a specific signature, Per Account
        Signature, for each email account.']
      - >-
        The average normal body temperature is around 98.6 degrees Fahrenheit,
        or 37 degrees Celsius. If your body temperature drops to just a few
        degrees lower than this, your blood vessels in your hands, feet, arms,
        and legs start to get narrower.
  - source_sentence: whom the bell tolls meaning?
    sentences:
      - >-
        Answer: Humans are depicted in Hindu art often in sensuous and erotic
        postures.
      - >-
        The phrase "For whom the bell tolls" refers to the church bells that are
        rung when a person dies. Hence, the author is suggesting that we should
        not be curious as to for whom the church bell is tolling for. It is for
        all of us.
      - '[''Automatically.'', ''When connected to car Bluetooth and,'', ''Manually.'']'
  - source_sentence: how long before chlamydia symptoms appear?
    sentences:
      - >-
        Most people who have chlamydia don't notice any symptoms. If you do get
        symptoms, these usually appear between 1 and 3 weeks after having
        unprotected sex with an infected person. For some people they don't
        develop until many months later. Sometimes the symptoms can disappear
        after a few days.
      - >-
        ['Open the My Verizon app . ... ', 'Tap the Menu icon. ... ', 'Tap
        Manage device for the appropriate mobile number. ... ', 'Tap Transfer
        content between phones. ... ', 'Tap Start Transfer.']
      - >-
        Psychiatrist vs Psychologist A psychiatrist is classed as a medical
        doctor, they include a physical examination of symptoms in their
        assessment and are able to prescribe medicine: a psychologist is also a
        doctor by virtue of their PHD level qualification, but is not medically
        trained and cannot prescribe.
  - source_sentence: are you human korean novela?
    sentences:
      - >-
        Many cysts heal on their own, which means that conservative treatments
        like rest and anti-inflammatory painkillers can often be enough to get
        rid of them. However, in some cases, routine drainage of the sac may be
        necessary to reduce symptoms.
      - >-
        A relative of European pear varieties like Bartlett and Anjou, the Asian
        pear is great used in recipes or simply eaten out of hand. It retains a
        crispness that works well in slaws and salads, and it holds its shape
        better than European pears when baked and cooked.
      - >-
        Are You Human? (Korean: 너도 인간이니; RR: Neodo Inganini; lit. Are You Human
        Too?) is a 2018 South Korean television series starring Seo Kang-jun and
        Gong Seung-yeon. It aired on KBS2's Mondays and Tuesdays at 22:00 (KST)
        time slot, from June 4 to August 7, 2018.
datasets:
  - sentence-transformers/gooaq
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
model-index:
  - name: SentenceTransformer based on answerdotai/ModernBERT-base
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: NanoNQ
          type: NanoNQ
        metrics:
          - type: cosine_accuracy@1
            value: 0.38
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.64
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.7
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.8
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.38
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.22
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.14400000000000002
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.08199999999999999
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.36
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.62
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.67
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.74
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.5673854489333459
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.5237460317460316
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.5116785860647901
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: NanoMSMARCO
          type: NanoMSMARCO
        metrics:
          - type: cosine_accuracy@1
            value: 0.32
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.56
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.66
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.82
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.32
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.18666666666666665
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.132
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.08199999999999999
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.32
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.56
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.66
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.82
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.555381357077638
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.47249206349206346
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.4797949229011178
            name: Cosine Map@100
      - task:
          type: nano-beir
          name: Nano BEIR
        dataset:
          name: NanoBEIR mean
          type: NanoBEIR_mean
        metrics:
          - type: cosine_accuracy@1
            value: 0.35
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.6000000000000001
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.6799999999999999
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.81
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.35
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2033333333333333
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.138
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.08199999999999999
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.33999999999999997
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.5900000000000001
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.665
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.78
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.5613834030054919
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.4981190476190476
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.49573675448295396
            name: Cosine Map@100
---

# SentenceTransformer based on answerdotai/ModernBERT-base

This is a sentence-transformers model finetuned from answerdotai/ModernBERT-base on the gooaq dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

This model was finetuned with `train_st_gooaq.py` on an RTX 3090, although only 10GB of VRAM was used.

## Model Details

### Model Description

  • Model Type: Sentence Transformer
  • Base model: answerdotai/ModernBERT-base
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset: gooaq
  • Language: en

### Model Sources

  • Documentation: [Sentence Transformers Documentation](https://sbert.net)
  • Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: ModernBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
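
The same configuration can be inspected on the loaded model; a minimal sketch using standard SentenceTransformer attributes (nothing model-specific beyond the repository name on this card):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("tomaarsen/ModernBERT-base-gooaq")
print(model)                                     # the Transformer + Pooling modules shown above
print(model.max_seq_length)                      # 512
print(model.get_sentence_embedding_dimension())  # 768
```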

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("tomaarsen/ModernBERT-base-gooaq")
# Run inference
sentences = [
    'are you human korean novela?',
    "Are You Human? (Korean: 너도 인간이니; RR: Neodo Inganini; lit. Are You Human Too?) is a 2018 South Korean television series starring Seo Kang-jun and Gong Seung-yeon. It aired on KBS2's Mondays and Tuesdays at 22:00 (KST) time slot, from June 4 to August 7, 2018.",
    'A relative of European pear varieties like Bartlett and Anjou, the Asian pear is great used in recipes or simply eaten out of hand. It retains a crispness that works well in slaws and salads, and it holds its shape better than European pears when baked and cooked.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
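
The same embeddings can be used for retrieval: encode the query and candidate answers separately, then rank the candidates by cosine similarity. A minimal sketch reusing the texts above (shortened here for brevity):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("tomaarsen/ModernBERT-base-gooaq")

query = "are you human korean novela?"
documents = [
    "Are You Human? (Korean: 너도 인간이니) is a 2018 South Korean television series.",
    "A relative of European pear varieties like Bartlett and Anjou, the Asian pear is great in recipes.",
]

# Encode the query and the candidate documents separately
query_embedding = model.encode([query])
document_embeddings = model.encode(documents)

# Score the query against every document and print them best-first
scores = model.similarity(query_embedding, document_embeddings)[0]
for score, document in sorted(zip(scores.tolist(), documents), reverse=True):
    print(f"{score:.4f}  {document}")
```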

## Evaluation

### Metrics

#### Information Retrieval

| Metric              | NanoNQ | NanoMSMARCO |
|:--------------------|:-------|:------------|
| cosine_accuracy@1   | 0.38   | 0.32        |
| cosine_accuracy@3   | 0.64   | 0.56        |
| cosine_accuracy@5   | 0.7    | 0.66        |
| cosine_accuracy@10  | 0.8    | 0.82        |
| cosine_precision@1  | 0.38   | 0.32        |
| cosine_precision@3  | 0.22   | 0.1867      |
| cosine_precision@5  | 0.144  | 0.132       |
| cosine_precision@10 | 0.082  | 0.082       |
| cosine_recall@1     | 0.36   | 0.32        |
| cosine_recall@3     | 0.62   | 0.56        |
| cosine_recall@5     | 0.67   | 0.66        |
| cosine_recall@10    | 0.74   | 0.82        |
| cosine_ndcg@10      | 0.5674 | 0.5554      |
| cosine_mrr@10       | 0.5237 | 0.4725      |
| cosine_map@100      | 0.5117 | 0.4798      |

#### Nano BEIR

| Metric              | Value  |
|:--------------------|:-------|
| cosine_accuracy@1   | 0.35   |
| cosine_accuracy@3   | 0.6    |
| cosine_accuracy@5   | 0.68   |
| cosine_accuracy@10  | 0.81   |
| cosine_precision@1  | 0.35   |
| cosine_precision@3  | 0.2033 |
| cosine_precision@5  | 0.138  |
| cosine_precision@10 | 0.082  |
| cosine_recall@1     | 0.34   |
| cosine_recall@3     | 0.59   |
| cosine_recall@5     | 0.665  |
| cosine_recall@10    | 0.78   |
| cosine_ndcg@10      | 0.5614 |
| cosine_mrr@10       | 0.4981 |
| cosine_map@100      | 0.4957 |
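
A comparable evaluation can be run with sentence-transformers' `InformationRetrievalEvaluator`. Below is a minimal sketch with a made-up toy corpus (the queries, documents, and IDs are illustrative only, not the NanoNQ/NanoMSMARCO data):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("tomaarsen/ModernBERT-base-gooaq")

# Toy retrieval setup: each query maps to the set of relevant document IDs
queries = {"q1": "how long before chlamydia symptoms appear?"}
corpus = {
    "d1": "Most people who have chlamydia don't notice any symptoms. If you do get symptoms, "
          "these usually appear between 1 and 3 weeks after unprotected sex with an infected person.",
    "d2": "The average normal body temperature is around 98.6 degrees Fahrenheit, or 37 degrees Celsius.",
    "d3": "The phrase 'For whom the bell tolls' refers to the church bells that are rung when a person dies.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="toy-ir",
)
results = evaluator(model)
print(results)  # dict of accuracy@k, precision@k, recall@k, ndcg@10, mrr@10, map@100
```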

## Training Details

### Training Dataset

#### gooaq

  • Dataset: gooaq at b089f72
  • Size: 3,012,496 training samples
  • Columns: question and answer
  • Approximate statistics based on the first 1000 samples:

  |         | question                                         | answer                                              |
  |:--------|:-------------------------------------------------|:----------------------------------------------------|
  | type    | string                                            | string                                               |
  | details | min: 8 tokens, mean: 12.0 tokens, max: 21 tokens  | min: 15 tokens, mean: 58.17 tokens, max: 190 tokens  |

  • Samples:

  | question | answer |
  |:---------|:-------|
  | what is the difference between clay and mud mask? | The main difference between the two is that mud is a skin-healing agent, while clay is a cosmetic, drying agent. Clay masks are most useful for someone who has oily skin and is prone to breakouts of acne and blemishes. |
  | myki how much on card? | A full fare myki card costs $6 and a concession, seniors or child myki costs $3. For more information about how to use your myki, visit ptv.vic.gov.au or call 1800 800 007. |
  | how to find out if someone blocked your phone number on iphone? | If you get a notification like "Message Not Delivered" or you get no notification at all, that's a sign of a potential block. Next, you could try calling the person. If the call goes right to voicemail or rings once (or a half ring) then goes to voicemail, that's further evidence you may have been blocked. |

  • Loss: CachedMultipleNegativesRankingLoss with these parameters:

  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

### Evaluation Dataset

#### gooaq

  • Dataset: gooaq at b089f72
  • Size: 3,012,496 evaluation samples
  • Columns: question and answer
  • Approximate statistics based on the first 1000 samples:

  |         | question                                          | answer                                              |
  |:--------|:--------------------------------------------------|:----------------------------------------------------|
  | type    | string                                             | string                                               |
  | details | min: 8 tokens, mean: 12.05 tokens, max: 21 tokens  | min: 13 tokens, mean: 59.08 tokens, max: 116 tokens  |

  • Samples:

  | question | answer |
  |:---------|:-------|
  | how do i program my directv remote with my tv? | ['Press MENU on your remote.', 'Select Settings & Help > Settings > Remote Control > Program Remote.', 'Choose the device (TV, audio, DVD) you wish to program. ... ', 'Follow the on-screen prompts to complete programming.'] |
  | are rodrigues fruit bats nocturnal? | Before its numbers were threatened by habitat destruction, storms, and hunting, some of those groups could number 500 or more members. Sunrise, sunset. Rodrigues fruit bats are most active at dawn, at dusk, and at night. |
  | why does your heart rate increase during exercise bbc bitesize? | During exercise there is an increase in physical activity and muscle cells respire more than they do when the body is at rest. The heart rate increases during exercise. The rate and depth of breathing increases - this makes sure that more oxygen is absorbed into the blood, and more carbon dioxide is removed from it. |

  • Loss: CachedMultipleNegativesRankingLoss with these parameters:

  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

### Training Hyperparameters

#### Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 2048
  • per_device_eval_batch_size: 2048
  • learning_rate: 8e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.05
  • bf16: True
  • batch_sampler: no_duplicates
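
These settings map directly onto the sentence-transformers v3 training API. The following is a minimal sketch of such a run, not the exact `train_st_gooaq.py`; the output directory, evaluation split size, seed, and `mini_batch_size` are illustrative assumptions:

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

# Base model and the GooAQ (question, answer) pairs
model = SentenceTransformer("answerdotai/ModernBERT-base")
dataset = load_dataset("sentence-transformers/gooaq", split="train")
dataset = dataset.train_test_split(test_size=10_000, seed=12)  # eval split size and seed are placeholders
train_dataset, eval_dataset = dataset["train"], dataset["test"]

# CachedMNRL makes the large 2048 batch size feasible by chunking it into mini-batches
loss = CachedMultipleNegativesRankingLoss(model, scale=20.0, mini_batch_size=32)

args = SentenceTransformerTrainingArguments(
    output_dir="models/ModernBERT-base-gooaq",  # placeholder
    num_train_epochs=1,
    per_device_train_batch_size=2048,
    per_device_eval_batch_size=2048,
    learning_rate=8e-5,
    warmup_ratio=0.05,
    bf16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate texts within a batch (in-batch negatives)
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
```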

#### All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 2048
  • per_device_eval_batch_size: 2048
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 8e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.05
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

### Training Logs

Epoch Step Training Loss Validation Loss NanoNQ_cosine_ndcg@10 NanoMSMARCO_cosine_ndcg@10 NanoBEIR_mean_cosine_ndcg@10
0 0 - - 0.0388 0.0785 0.0587
0.0068 10 6.9066 - - - -
0.0136 20 4.853 - - - -
0.0204 30 2.5305 - - - -
0.0272 40 1.3877 - - - -
0.0340 50 0.871 0.3358 0.4385 0.4897 0.4641
0.0408 60 0.6463 - - - -
0.0476 70 0.5336 - - - -
0.0544 80 0.4601 - - - -
0.0612 90 0.4057 - - - -
0.0680 100 0.366 0.1523 0.5100 0.4477 0.4789
0.0748 110 0.3498 - - - -
0.0816 120 0.3297 - - - -
0.0884 130 0.3038 - - - -
0.0952 140 0.3062 - - - -
0.1020 150 0.2976 0.1176 0.5550 0.4742 0.5146
0.1088 160 0.2843 - - - -
0.1156 170 0.2732 - - - -
0.1224 180 0.2549 - - - -
0.1292 190 0.2584 - - - -
0.1360 200 0.2451 0.1018 0.5313 0.4846 0.5079
0.1428 210 0.2521 - - - -
0.1496 220 0.2451 - - - -
0.1564 230 0.2367 - - - -
0.1632 240 0.2359 - - - -
0.1700 250 0.2343 0.0947 0.5489 0.4823 0.5156
0.1768 260 0.2263 - - - -
0.1835 270 0.2225 - - - -
0.1903 280 0.2219 - - - -
0.1971 290 0.2136 - - - -
0.2039 300 0.2202 0.0932 0.5165 0.4674 0.4920
0.2107 310 0.2198 - - - -
0.2175 320 0.21 - - - -
0.2243 330 0.207 - - - -
0.2311 340 0.1972 - - - -
0.2379 350 0.2037 0.0877 0.5231 0.5039 0.5135
0.2447 360 0.2054 - - - -
0.2515 370 0.197 - - - -
0.2583 380 0.1922 - - - -
0.2651 390 0.1965 - - - -
0.2719 400 0.1962 0.0843 0.5409 0.4746 0.5078
0.2787 410 0.186 - - - -
0.2855 420 0.1911 - - - -
0.2923 430 0.1969 - - - -
0.2991 440 0.193 - - - -
0.3059 450 0.1912 0.0763 0.5398 0.5083 0.5241
0.3127 460 0.1819 - - - -
0.3195 470 0.1873 - - - -
0.3263 480 0.1899 - - - -
0.3331 490 0.1764 - - - -
0.3399 500 0.1828 0.0728 0.5439 0.5176 0.5308
0.3467 510 0.1753 - - - -
0.3535 520 0.1725 - - - -
0.3603 530 0.1758 - - - -
0.3671 540 0.183 - - - -
0.3739 550 0.1789 0.0733 0.5437 0.5185 0.5311
0.3807 560 0.1773 - - - -
0.3875 570 0.1764 - - - -
0.3943 580 0.1638 - - - -
0.4011 590 0.1809 - - - -
0.4079 600 0.1727 0.0700 0.5550 0.5021 0.5286
0.4147 610 0.1664 - - - -
0.4215 620 0.1683 - - - -
0.4283 630 0.1622 - - - -
0.4351 640 0.1592 - - - -
0.4419 650 0.168 0.0662 0.5576 0.4843 0.5210
0.4487 660 0.1696 - - - -
0.4555 670 0.1609 - - - -
0.4623 680 0.1644 - - - -
0.4691 690 0.1643 - - - -
0.4759 700 0.1604 0.0660 0.5605 0.5042 0.5323
0.4827 710 0.1634 - - - -
0.4895 720 0.1515 - - - -
0.4963 730 0.1592 - - - -
0.5031 740 0.1597 - - - -
0.5099 750 0.1617 0.0643 0.5576 0.4830 0.5203
0.5167 760 0.1512 - - - -
0.5235 770 0.1563 - - - -
0.5303 780 0.1529 - - - -
0.5370 790 0.1547 - - - -
0.5438 800 0.1548 0.0620 0.5538 0.5271 0.5405
0.5506 810 0.1533 - - - -
0.5574 820 0.1504 - - - -
0.5642 830 0.1489 - - - -
0.5710 840 0.1534 - - - -
0.5778 850 0.1507 0.0611 0.5697 0.5095 0.5396
0.5846 860 0.1475 - - - -
0.5914 870 0.1474 - - - -
0.5982 880 0.1499 - - - -
0.6050 890 0.1454 - - - -
0.6118 900 0.1419 0.0620 0.5586 0.5229 0.5407
0.6186 910 0.1465 - - - -
0.6254 920 0.1436 - - - -
0.6322 930 0.1464 - - - -
0.6390 940 0.1418 - - - -
0.6458 950 0.1443 0.0565 0.5627 0.5458 0.5543
0.6526 960 0.1458 - - - -
0.6594 970 0.1431 - - - -
0.6662 980 0.1417 - - - -
0.6730 990 0.1402 - - - -
0.6798 1000 0.1431 0.0563 0.5499 0.5366 0.5432
0.6866 1010 0.1386 - - - -
0.6934 1020 0.1413 - - - -
0.7002 1030 0.1381 - - - -
0.7070 1040 0.1364 - - - -
0.7138 1050 0.1346 0.0545 0.5574 0.5416 0.5495
0.7206 1060 0.1338 - - - -
0.7274 1070 0.1378 - - - -
0.7342 1080 0.135 - - - -
0.7410 1090 0.1336 - - - -
0.7478 1100 0.1393 0.0541 0.5776 0.5362 0.5569
0.7546 1110 0.1427 - - - -
0.7614 1120 0.1378 - - - -
0.7682 1130 0.1346 - - - -
0.7750 1140 0.1423 - - - -
0.7818 1150 0.1368 0.0525 0.5681 0.5237 0.5459
0.7886 1160 0.1392 - - - -
0.7954 1170 0.1321 - - - -
0.8022 1180 0.1387 - - - -
0.8090 1190 0.134 - - - -
0.8158 1200 0.1369 0.0515 0.5613 0.5416 0.5514
0.8226 1210 0.1358 - - - -
0.8294 1220 0.1401 - - - -
0.8362 1230 0.1334 - - - -
0.8430 1240 0.1331 - - - -
0.8498 1250 0.1324 0.0510 0.5463 0.5546 0.5505
0.8566 1260 0.135 - - - -
0.8634 1270 0.1367 - - - -
0.8702 1280 0.1356 - - - -
0.8770 1290 0.1291 - - - -
0.8838 1300 0.1313 0.0498 0.5787 0.5552 0.5670
0.8906 1310 0.1334 - - - -
0.8973 1320 0.1389 - - - -
0.9041 1330 0.1302 - - - -
0.9109 1340 0.1319 - - - -
0.9177 1350 0.1276 0.0504 0.5757 0.5575 0.5666
0.9245 1360 0.1355 - - - -
0.9313 1370 0.1289 - - - -
0.9381 1380 0.1335 - - - -
0.9449 1390 0.1298 - - - -
0.9517 1400 0.1279 0.0497 0.5743 0.5567 0.5655
0.9585 1410 0.1324 - - - -
0.9653 1420 0.1306 - - - -
0.9721 1430 0.1313 - - - -
0.9789 1440 0.135 - - - -
0.9857 1450 0.1293 0.0493 0.5671 0.5554 0.5612
0.9925 1460 0.133 - - - -
0.9993 1470 0.1213 - - - -
1.0 1471 - - 0.5674 0.5554 0.5614

### Framework Versions

  • Python: 3.11.10
  • Sentence Transformers: 3.3.1
  • Transformers: 4.48.0.dev0
  • PyTorch: 2.6.0.dev20241112+cu121
  • Accelerate: 1.2.0
  • Datasets: 3.2.0
  • Tokenizers: 0.21.0

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### CachedMultipleNegativesRankingLoss

```bibtex
@misc{gao2021scaling,
    title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
    author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
    year={2021},
    eprint={2101.06983},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```