Tags: Sentence Similarity · sentence-transformers · Safetensors · mpnet · feature-extraction · Generated from Trainer · dataset_size:500431 · loss:SoftmaxLoss · loss:MultipleNegativesRankingLoss · Inference Endpoints

SentenceTransformer based on sentence-transformers/all-mpnet-base-v2

This is a sentence-transformers model fine-tuned from sentence-transformers/all-mpnet-base-v2 on datasets we created ourselves (linked under Training Details below). It maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 384, 'do_lower_case': False}) with Transformer model: MPNetModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
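
The Pooling module mean-pools the token embeddings produced by the MPNet encoder, and the Normalize module L2-normalizes the result so every sentence embedding has unit length. A minimal sketch of the equivalent computation, assuming you already have the token embeddings and attention mask from the encoder:

import torch

def mean_pool_and_normalize(token_embeddings: torch.Tensor,
                            attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: (batch, seq_len, 768) from the MPNet encoder
    # attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    mask = attention_mask.unsqueeze(-1).float()      # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)    # sum over real tokens only
    counts = mask.sum(dim=1).clamp(min=1e-9)         # number of real tokens
    pooled = summed / counts                         # mean pooling
    return torch.nn.functional.normalize(pooled, p=2, dim=1)  # unit-length vectors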

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("MasterControlAIML/finetuned-ecfr-embeddings")
# Run inference
sentences = [
    'The coordinator serves as the main point of contact between DEA registered locations and the CSOS Certification Authority for issues pertaining to issuance, revocation, and changes to digital certificates.',
    'The role of a CSOS coordinator involves issuing and distributing new digital certificates to DEA registered locations.',
    'The rules only apply to paper orders and prescriptions, not electronic ones.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
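
Because the embeddings are normalized, cosine similarity reduces to a dot product, which makes the model well suited to semantic search. A short sketch using the library's util.semantic_search helper (the corpus and query strings here are illustrative):

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("MasterControlAIML/finetuned-ecfr-embeddings")

# Illustrative corpus and query; substitute your own documents.
corpus = [
    "The coordinator is the point of contact for CSOS digital certificate issues.",
    "The rules only apply to paper orders and prescriptions, not electronic ones.",
]
query = "Who handles digital certificate issues for DEA registered locations?"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Returns the top_k most similar corpus entries for each query.
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)
for hit in hits[0]:
    print(f"{hit['score']:.4f}  {corpus[hit['corpus_id']]}")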

Training Details

Training Dataset

https://huggingface.co./datasets/MasterControlAIML/triplet_finetuning_embeddings.csv

https://huggingface.co./datasets/MasterControlAIML/pair_class_finetuning_embeddings.csv

https://huggingface.co./datasets/MasterControlAIML/question_answer_finetuning_embeddings.csv

  • Dataset: MasterControlAIML/pair_class_finetuning_embeddings.csv
  • Size: 300,000 training samples
  • Columns: sentence_a, sentence_b, and label
  • Approximate statistics based on the first 1000 samples:
    • sentence_a: string; min: 9 tokens, mean: 29.84 tokens, max: 164 tokens
    • sentence_b: string; min: 4 tokens, mean: 25.59 tokens, max: 73 tokens
    • label: int; 0: ~33.30%, 1: ~33.30%, 2: ~33.40%
  • Samples:
    • sentence_a: "The purpose is to ensure consistency in the interpretation of terms used across different parts of the Federal Food, Drug, and Cosmetic Act and its regulations."
      sentence_b: "This provision aims to maintain uniformity in how terms are understood throughout the legislation and related rules."
      label: 0
    • sentence_a: "The purpose is to ensure consistency in the interpretation of terms used across different parts of the Federal Food, Drug, and Cosmetic Act and its regulations."
      sentence_b: "The goal is to make sure that different interpretations of terminology vary widely across the Federal Food, Drug, and Cosmetic Act and its regulations."
      label: 2
    • sentence_a: "The purpose is to ensure consistency in the interpretation of terms used across different parts of the Federal Food, Drug, and Cosmetic Act and its regulations."
      sentence_b: "The intention behind this is to guarantee that similar terms are interpreted consistently within the statute and its implementing regulations."
      label: 1
  • Loss: SoftmaxLoss
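
SoftmaxLoss treats each (sentence_a, sentence_b) pair as a classification problem over the three labels above. A minimal sketch of how this loss is typically attached to the model in Sentence Transformers (illustrative, not the exact training script used here):

from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

# SoftmaxLoss concatenates the two sentence embeddings (and their difference)
# and feeds the result to a linear classifier over num_labels classes.
loss = losses.SoftmaxLoss(
    model=model,
    sentence_embedding_dimension=model.get_sentence_embedding_dimension(),
    num_labels=3,  # labels 0, 1, 2 as in the statistics above
)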

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 3e-05
  • weight_decay: 0.01
  • warmup_ratio: 0.1
  • bf16: True
  • batch_sampler: no_duplicates
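
In Sentence Transformers 3.x these settings map onto SentenceTransformerTrainingArguments; a hedged sketch of the equivalent configuration (output_dir is a placeholder):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output/finetuned-ecfr-embeddings",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=3e-5,
    weight_decay=0.01,
    warmup_ratio=0.1,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)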

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 3e-05
  • weight_decay: 0.01
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Triplet Loss Natural-Questions Loss Pair-Class Loss
(evaluation losses were logged every 1000 steps; a dash means no evaluation was run at that step)
0.0026 10 1.1707 - - -
0.0051 20 1.4398 - - -
0.0077 30 1.1177 - - -
0.0102 40 1.337 - - -
0.0128 50 1.3094 - - -
0.0153 60 1.3527 - - -
0.0179 70 1.0938 - - -
0.0205 80 1.1795 - - -
0.0230 90 1.1525 - - -
0.0256 100 1.1584 - - -
0.0281 110 1.0561 - - -
0.0307 120 1.0497 - - -
0.0332 130 1.0765 - - -
0.0358 140 1.2655 - - -
0.0384 150 1.0642 - - -
0.0409 160 1.2373 - - -
0.0435 170 1.1048 - - -
0.0460 180 1.0555 - - -
0.0486 190 1.0665 - - -
0.0511 200 1.0416 - - -
0.0537 210 1.1101 - - -
0.0563 220 0.9804 - - -
0.0588 230 1.0349 - - -
0.0614 240 1.0572 - - -
0.0639 250 0.9677 - - -
0.0665 260 1.3367 - - -
0.0690 270 1.1872 - - -
0.0716 280 1.3214 - - -
0.0741 290 1.1236 - - -
0.0767 300 0.9422 - - -
0.0793 310 1.2298 - - -
0.0818 320 1.0567 - - -
0.0844 330 1.1267 - - -
0.0869 340 1.1045 - - -
0.0895 350 1.1986 - - -
0.0920 360 1.2155 - - -
0.0946 370 1.1971 - - -
0.0972 380 1.2324 - - -
0.0997 390 1.1244 - - -
0.1023 400 1.1597 - - -
0.1048 410 1.0963 - - -
0.1074 420 1.0303 - - -
0.1099 430 1.0667 - - -
0.1125 440 1.0421 - - -
0.1151 450 1.2969 - - -
0.1176 460 1.0174 - - -
0.1202 470 1.1774 - - -
0.1227 480 1.1024 - - -
0.1253 490 1.2135 - - -
0.1278 500 1.0365 - - -
0.1304 510 1.2741 - - -
0.1330 520 1.0621 - - -
0.1355 530 1.1155 - - -
0.1381 540 0.9648 - - -
0.1406 550 1.2931 - - -
0.1432 560 1.0685 - - -
0.1457 570 1.0907 - - -
0.1483 580 1.4771 - - -
0.1509 590 1.2467 - - -
0.1534 600 1.3106 - - -
0.1560 610 1.0369 - - -
0.1585 620 1.257 - - -
0.1611 630 1.5544 - - -
0.1636 640 1.2573 - - -
0.1662 650 1.2558 - - -
0.1688 660 1.4195 - - -
0.1713 670 1.2734 - - -
0.1739 680 1.3665 - - -
0.1764 690 1.2314 - - -
0.1790 700 1.6973 - - -
0.1815 710 1.5824 - - -
0.1841 720 1.3263 - - -
0.1867 730 1.18 - - -
0.1892 740 1.2197 - - -
0.1918 750 2.0825 - - -
0.1943 760 1.146 - - -
0.1969 770 1.0494 - - -
0.1994 780 1.6212 - - -
0.2020 790 1.2458 - - -
0.2046 800 0.9586 - - -
0.2071 810 1.3012 - - -
0.2097 820 1.3227 - - -
0.2122 830 1.2995 - - -
0.2148 840 1.4382 - - -
0.2173 850 0.9914 - - -
0.2199 860 1.3789 - - -
0.2224 870 0.78 - - -
0.2250 880 2.0291 - - -
0.2276 890 1.2744 - - -
0.2301 900 1.5524 - - -
0.2327 910 1.1003 - - -
0.2352 920 1.2761 - - -
0.2378 930 1.0052 - - -
0.2403 940 1.4054 - - -
0.2429 950 1.3834 - - -
0.2455 960 1.1076 - - -
0.2480 970 1.3723 - - -
0.2506 980 1.1703 - - -
0.2531 990 1.2798 - - -
0.2557 1000 1.0384 1.9672 1.5738 0.7390
0.2582 1010 1.0659 - - -
0.2608 1020 0.786 - - -
0.2634 1030 1.1735 - - -
0.2659 1040 1.6015 - - -
0.2685 1050 1.1714 - - -
0.2710 1060 0.9593 - - -
0.2736 1070 0.9219 - - -
0.2761 1080 1.2385 - - -
0.2787 1090 1.4025 - - -
0.2813 1100 1.0528 - - -
0.2838 1110 1.0912 - - -
0.2864 1120 1.0755 - - -
0.2889 1130 0.8722 - - -
0.2915 1140 0.8613 - - -
0.2940 1150 0.714 - - -
0.2966 1160 0.9849 - - -
0.2992 1170 1.5536 - - -
0.3017 1180 0.8425 - - -
0.3043 1190 1.0021 - - -
0.3068 1200 0.7547 - - -
0.3094 1210 1.1071 - - -
0.3119 1220 0.8453 - - -
0.3145 1230 0.8416 - - -
0.3171 1240 1.0041 - - -
0.3196 1250 1.1317 - - -
0.3222 1260 0.9105 - - -
0.3247 1270 0.8718 - - -
0.3273 1280 1.0268 - - -
0.3298 1290 1.0069 - - -
0.3324 1300 1.0736 - - -
0.3350 1310 0.8386 - - -
0.3375 1320 0.7839 - - -
0.3401 1330 0.8844 - - -
0.3426 1340 1.2852 - - -
0.3452 1350 0.8269 - - -
0.3477 1360 0.8158 - - -
0.3503 1370 0.8811 - - -
0.3529 1380 0.8127 - - -
0.3554 1390 0.9863 - - -
0.3580 1400 0.9983 - - -
0.3605 1410 1.0913 - - -
0.3631 1420 0.8338 - - -
0.3656 1430 0.7213 - - -
0.3682 1440 1.0293 - - -
0.3707 1450 0.8337 - - -
0.3733 1460 0.9256 - - -
0.3759 1470 1.0214 - - -
0.3784 1480 0.8967 - - -
0.3810 1490 0.8052 - - -
0.3835 1500 0.9588 - - -
0.3861 1510 0.9167 - - -
0.3886 1520 1.098 - - -
0.3912 1530 0.8574 - - -
0.3938 1540 0.7859 - - -
0.3963 1550 0.7916 - - -
0.3989 1560 0.7334 - - -
0.4014 1570 0.7245 - - -
0.4040 1580 0.6274 - - -
0.4065 1590 0.98 - - -
0.4091 1600 0.8322 - - -
0.4117 1610 0.7733 - - -
0.4142 1620 0.9476 - - -
0.4168 1630 0.9175 - - -
0.4193 1640 0.9388 - - -
0.4219 1650 0.8695 - - -
0.4244 1660 0.7446 - - -
0.4270 1670 0.7037 - - -
0.4296 1680 0.857 - - -
0.4321 1690 0.8691 - - -
0.4347 1700 1.2859 - - -
0.4372 1710 0.8383 - - -
0.4398 1720 1.0063 - - -
0.4423 1730 0.9503 - - -
0.4449 1740 0.9286 - - -
0.4475 1750 0.7624 - - -
0.4500 1760 1.0289 - - -
0.4526 1770 0.8778 - - -
0.4551 1780 1.0507 - - -
0.4577 1790 0.8792 - - -
0.4602 1800 0.564 - - -
0.4628 1810 0.8385 - - -
0.4654 1820 0.7852 - - -
0.4679 1830 1.096 - - -
0.4705 1840 1.059 - - -
0.4730 1850 0.7789 - - -
0.4756 1860 0.7415 - - -
0.4781 1870 0.726 - - -
0.4807 1880 0.7858 - - -
0.4833 1890 0.6238 - - -
0.4858 1900 1.0395 - - -
0.4884 1910 0.7295 - - -
0.4909 1920 0.9215 - - -
0.4935 1930 0.7803 - - -
0.4960 1940 0.6647 - - -
0.4986 1950 0.8575 - - -
0.5012 1960 0.7408 - - -
0.5037 1970 0.772 - - -
0.5063 1980 0.6131 - - -
0.5088 1990 0.8394 - - -
0.5114 2000 0.7548 1.2322 1.2839 0.5386
0.5139 2010 1.0089 - - -
0.5165 2020 0.8726 - - -
0.5190 2030 1.0068 - - -
0.5216 2040 0.6911 - - -
0.5242 2050 0.6863 - - -
0.5267 2060 0.684 - - -
0.5293 2070 0.6317 - - -
0.5318 2080 0.8134 - - -
0.5344 2090 0.7953 - - -
0.5369 2100 0.7435 - - -
0.5395 2110 0.7763 - - -
0.5421 2120 0.7817 - - -
0.5446 2130 0.7257 - - -
0.5472 2140 0.7699 - - -
0.5497 2150 0.7071 - - -
0.5523 2160 0.6143 - - -
0.5548 2170 0.8661 - - -
0.5574 2180 0.7011 - - -
0.5600 2190 0.6527 - - -
0.5625 2200 0.7136 - - -
0.5651 2210 0.7573 - - -
0.5676 2220 0.7446 - - -
0.5702 2230 0.7281 - - -
0.5727 2240 0.6477 - - -
0.5753 2250 0.6089 - - -
0.5779 2260 0.9791 - - -
0.5804 2270 0.6604 - - -
0.5830 2280 0.616 - - -
0.5855 2290 0.7537 - - -
0.5881 2300 0.8455 - - -
0.5906 2310 0.7813 - - -
0.5932 2320 0.7018 - - -
0.5958 2330 0.8648 - - -
0.5983 2340 0.6803 - - -
0.6009 2350 0.6801 - - -
0.6034 2360 0.8162 - - -
0.6060 2370 0.8988 - - -
0.6085 2380 0.7448 - - -
0.6111 2390 0.6624 - - -
0.6137 2400 0.7872 - - -
0.6162 2410 0.5589 - - -
0.6188 2420 0.8243 - - -
0.6213 2430 0.6335 - - -
0.6239 2440 0.7356 - - -
0.6264 2450 0.5682 - - -
0.6290 2460 0.662 - - -
0.6316 2470 0.6618 - - -
0.6341 2480 0.6632 - - -
0.6367 2490 0.6072 - - -
0.6392 2500 0.6689 - - -
0.6418 2510 0.49 - - -
0.6443 2520 0.5977 - - -
0.6469 2530 0.831 - - -
0.6495 2540 0.8773 - - -
0.6520 2550 0.6686 - - -
0.6546 2560 0.5799 - - -
0.6571 2570 0.6255 - - -
0.6597 2580 0.6272 - - -
0.6622 2590 0.672 - - -
0.6648 2600 0.7736 - - -
0.6673 2610 0.561 - - -
0.6699 2620 0.4959 - - -
0.6725 2630 0.6237 - - -
0.6750 2640 0.6684 - - -
0.6776 2650 0.6777 - - -
0.6801 2660 0.5853 - - -
0.6827 2670 0.6338 - - -
0.6852 2680 0.4402 - - -
0.6878 2690 0.7279 - - -
0.6904 2700 0.6405 - - -
0.6929 2710 0.6669 - - -
0.6955 2720 0.6239 - - -
0.6980 2730 0.6889 - - -
0.7006 2740 0.6653 - - -
0.7031 2750 0.5996 - - -
0.7057 2760 0.6477 - - -
0.7083 2770 0.6088 - - -
0.7108 2780 0.5685 - - -
0.7134 2790 0.7533 - - -
0.7159 2800 0.7903 - - -
0.7185 2810 0.4748 - - -
0.7210 2820 0.5296 - - -
0.7236 2830 0.6541 - - -
0.7262 2840 0.5332 - - -
0.7287 2850 0.4696 - - -
0.7313 2860 0.5556 - - -
0.7338 2870 0.5828 - - -
0.7364 2880 0.5641 - - -
0.7389 2890 0.5416 - - -
0.7415 2900 0.4942 - - -
0.7441 2910 0.7297 - - -
0.7466 2920 0.5595 - - -
0.7492 2930 0.6382 - - -
0.7517 2940 0.6296 - - -
0.7543 2950 0.5557 - - -
0.7568 2960 0.5127 - - -
0.7594 2970 0.6284 - - -
0.7620 2980 0.5141 - - -
0.7645 2990 0.6558 - - -
0.7671 3000 0.5548 0.9845 1.2546 0.5065
0.7696 3010 0.5806 - - -
0.7722 3020 0.7214 - - -
0.7747 3030 0.5564 - - -
0.7773 3040 0.4904 - - -
0.7799 3050 0.6085 - - -
0.7824 3060 0.5571 - - -
0.7850 3070 0.5072 - - -
0.7875 3080 0.5178 - - -
0.7901 3090 0.5979 - - -
0.7926 3100 0.5441 - - -
0.7952 3110 0.5605 - - -
0.7977 3120 0.5547 - - -
0.8003 3130 0.6522 - - -
0.8029 3140 0.6491 - - -
0.8054 3150 0.6327 - - -
0.8080 3160 0.4649 - - -
0.8105 3170 0.5041 - - -
0.8131 3180 0.5694 - - -
0.8156 3190 0.5865 - - -
0.8182 3200 0.6403 - - -
0.8208 3210 0.6079 - - -
0.8233 3220 0.5725 - - -
0.8259 3230 0.5179 - - -
0.8284 3240 0.5256 - - -
0.8310 3250 0.4747 - - -
0.8335 3260 0.5917 - - -
0.8361 3270 0.5964 - - -
0.8387 3280 0.446 - - -
0.8412 3290 0.7388 - - -
0.8438 3300 0.7815 - - -
0.8463 3310 0.5636 - - -
0.8489 3320 0.67 - - -
0.8514 3330 0.5708 - - -
0.8540 3340 0.657 - - -
0.8566 3350 0.6734 - - -
0.8591 3360 0.6354 - - -
0.8617 3370 0.6066 - - -
0.8642 3380 0.6637 - - -
0.8668 3390 0.8015 - - -
0.8693 3400 0.7256 - - -
0.8719 3410 0.5928 - - -
0.8745 3420 0.6384 - - -
0.8770 3430 0.6484 - - -
0.8796 3440 0.616 - - -
0.8821 3450 0.8358 - - -
0.8847 3460 0.5322 - - -
0.8872 3470 0.7502 - - -
0.8898 3480 0.5876 - - -
0.8924 3490 0.747 - - -
0.8949 3500 0.954 - - -
0.8975 3510 0.4548 - - -
0.9000 3520 0.5293 - - -
0.9026 3530 0.5781 - - -
0.9051 3540 0.6266 - - -
0.9077 3550 0.5068 - - -
0.9103 3560 0.6532 - - -
0.9128 3570 0.521 - - -
0.9154 3580 0.5661 - - -
0.9179 3590 0.5974 - - -
0.9205 3600 0.936 - - -
0.9230 3610 0.6881 - - -
0.9256 3620 0.648 - - -
0.9282 3630 0.5713 - - -
0.9307 3640 0.7123 - - -
0.9333 3650 0.5412 - - -
0.9358 3660 0.6557 - - -
0.9384 3670 0.6858 - - -
0.9409 3680 0.5731 - - -
0.9435 3690 0.6859 - - -
0.9460 3700 0.5787 - - -
0.9486 3710 0.6479 - - -
0.9512 3720 0.809 - - -
0.9537 3730 0.6127 - - -
0.9563 3740 0.8344 - - -
0.9588 3750 0.6894 - - -
0.9614 3760 0.5469 - - -
0.9639 3770 0.6216 - - -
0.9665 3780 0.6415 - - -
0.9691 3790 0.6896 - - -
0.9716 3800 0.7863 - - -
0.9742 3810 1.0274 - - -
0.9767 3820 0.6791 - - -
0.9793 3830 0.442 - - -
0.9818 3840 0.8335 - - -
0.9844 3850 0.5654 - - -
0.9870 3860 0.6092 - - -
0.9895 3870 0.6547 - - -
0.9921 3880 0.6271 - - -
0.9946 3890 0.7524 - - -
0.9972 3900 0.7714 - - -
0.9997 3910 0.5588 - - -
1.0023 3920 0.6197 - - -
1.0049 3930 0.6721 - - -
1.0074 3940 0.4887 - - -
1.0100 3950 0.5616 - - -
1.0125 3960 0.6064 - - -
1.0151 3970 0.5886 - - -
1.0176 3980 0.4971 - - -
1.0202 3990 0.451 - - -
1.0228 4000 0.4847 0.7547 1.1616 0.4379
1.0253 4010 0.4582 - - -
1.0279 4020 0.4161 - - -
1.0304 4030 0.4513 - - -
1.0330 4040 0.4289 - - -
1.0355 4050 0.5468 - - -
1.0381 4060 0.4495 - - -
1.0407 4070 0.5929 - - -
1.0432 4080 0.4748 - - -
1.0458 4090 0.4239 - - -
1.0483 4100 0.5096 - - -
1.0509 4110 0.4209 - - -
1.0534 4120 0.5346 - - -
1.0560 4130 0.54 - - -
1.0586 4140 0.4417 - - -
1.0611 4150 0.5158 - - -
1.0637 4160 0.4592 - - -
1.0662 4170 0.6933 - - -
1.0688 4180 0.5689 - - -
1.0713 4190 0.6421 - - -
1.0739 4200 0.546 - - -
1.0765 4210 0.4952 - - -
1.0790 4220 0.6514 - - -
1.0816 4230 0.6293 - - -
1.0841 4240 0.5825 - - -
1.0867 4250 0.584 - - -
1.0892 4260 0.6204 - - -
1.0918 4270 0.6902 - - -
1.0943 4280 0.6685 - - -
1.0969 4290 0.6813 - - -
1.0995 4300 0.6038 - - -
1.1020 4310 0.6772 - - -
1.1046 4320 0.6142 - - -
1.1071 4330 0.5894 - - -
1.1097 4340 0.5789 - - -
1.1122 4350 0.4491 - - -
1.1148 4360 0.5892 - - -
1.1174 4370 0.5034 - - -
1.1199 4380 0.5833 - - -
1.1225 4390 0.6176 - - -
1.1250 4400 0.6472 - - -
1.1276 4410 0.5534 - - -
1.1301 4420 0.7378 - - -
1.1327 4430 0.5563 - - -
1.1353 4440 0.6564 - - -
1.1378 4450 0.5604 - - -
1.1404 4460 0.5094 - - -
1.1429 4470 0.5543 - - -
1.1455 4480 0.4229 - - -
1.1480 4490 0.695 - - -
1.1506 4500 0.5899 - - -
1.1532 4510 0.6987 - - -
1.1557 4520 0.5547 - - -
1.1583 4530 0.5025 - - -
1.1608 4540 0.9436 - - -
1.1634 4550 0.7775 - - -
1.1659 4560 0.6142 - - -
1.1685 4570 0.8614 - - -
1.1711 4580 0.6176 - - -
1.1736 4590 0.6838 - - -
1.1762 4600 0.6569 - - -
1.1787 4610 0.8706 - - -
1.1813 4620 0.8677 - - -
1.1838 4630 0.5774 - - -
1.1864 4640 0.4048 - - -
1.1890 4650 0.4929 - - -
1.1915 4660 0.5701 - - -
1.1941 4670 0.5625 - - -
1.1966 4680 0.415 - - -
1.1992 4690 0.5585 - - -
1.2017 4700 0.7766 - - -
1.2043 4710 0.4176 - - -
1.2069 4720 0.5283 - - -
1.2094 4730 0.5598 - - -
1.2120 4740 0.6946 - - -
1.2145 4750 0.4879 - - -
1.2171 4760 0.5872 - - -
1.2196 4770 0.6603 - - -
1.2222 4780 0.4019 - - -
1.2248 4790 0.5007 - - -
1.2273 4800 0.4738 - - -
1.2299 4810 0.7269 - - -
1.2324 4820 0.6355 - - -
1.2350 4830 0.6972 - - -
1.2375 4840 0.4684 - - -
1.2401 4850 0.674 - - -
1.2426 4860 0.6338 - - -
1.2452 4870 0.6925 - - -
1.2478 4880 0.7255 - - -
1.2503 4890 0.6693 - - -
1.2529 4900 0.5574 - - -
1.2554 4910 0.5629 - - -
1.2580 4920 0.5522 - - -
1.2605 4930 0.3723 - - -
1.2631 4940 0.4743 - - -
1.2657 4950 0.4633 - - -
1.2682 4960 0.6679 - - -
1.2708 4970 0.4972 - - -
1.2733 4980 0.4702 - - -
1.2759 4990 0.6623 - - -
1.2784 5000 0.6882 0.7464 1.0944 0.4265
1.2810 5010 0.6142 - - -
1.2836 5020 0.5163 - - -
1.2861 5030 0.66 - - -
1.2887 5040 0.5061 - - -
1.2912 5050 0.4545 - - -
1.2938 5060 0.4068 - - -
1.2963 5070 0.4095 - - -
1.2989 5080 0.7697 - - -
1.3015 5090 0.3916 - - -
1.3040 5100 0.5554 - - -
1.3066 5110 0.4539 - - -
1.3091 5120 0.5223 - - -
1.3117 5130 0.4381 - - -
1.3142 5140 0.3755 - - -
1.3168 5150 0.5873 - - -
1.3194 5160 0.6344 - - -
1.3219 5170 0.4914 - - -
1.3245 5180 0.4993 - - -
1.3270 5190 0.3972 - - -
1.3296 5200 0.5226 - - -
1.3321 5210 0.6332 - - -
1.3347 5220 0.4855 - - -
1.3373 5230 0.4543 - - -
1.3398 5240 0.4119 - - -
1.3424 5250 0.4983 - - -
1.3449 5260 0.5639 - - -
1.3475 5270 0.3813 - - -
1.3500 5280 0.4136 - - -
1.3526 5290 0.5028 - - -
1.3552 5300 0.5798 - - -
1.3577 5310 0.5341 - - -
1.3603 5320 0.6453 - - -
1.3628 5330 0.5209 - - -
1.3654 5340 0.466 - - -
1.3679 5350 0.5716 - - -
1.3705 5360 0.4509 - - -
1.3731 5370 0.4874 - - -
1.3756 5380 0.5475 - - -
1.3782 5390 0.5323 - - -
1.3807 5400 0.4873 - - -
1.3833 5410 0.5773 - - -
1.3858 5420 0.4543 - - -
1.3884 5430 0.5811 - - -
1.3909 5440 0.5467 - - -
1.3935 5450 0.4542 - - -
1.3961 5460 0.4626 - - -
1.3986 5470 0.4366 - - -
1.4012 5480 0.4025 - - -
1.4037 5490 0.432 - - -
1.4063 5500 0.5241 - - -
1.4088 5510 0.527 - - -
1.4114 5520 0.4462 - - -
1.4140 5530 0.6555 - - -
1.4165 5540 0.5221 - - -
1.4191 5550 0.658 - - -
1.4216 5560 0.5444 - - -
1.4242 5570 0.4359 - - -
1.4267 5580 0.4461 - - -
1.4293 5590 0.4948 - - -
1.4319 5600 0.5505 - - -
1.4344 5610 0.7858 - - -
1.4370 5620 0.6252 - - -
1.4395 5630 0.6586 - - -
1.4421 5640 0.518 - - -
1.4446 5650 0.5964 - - -
1.4472 5660 0.5859 - - -
1.4498 5670 0.7199 - - -
1.4523 5680 0.4844 - - -
1.4549 5690 0.7428 - - -
1.4574 5700 0.6263 - - -
1.4600 5710 0.3484 - - -
1.4625 5720 0.4461 - - -
1.4651 5730 0.5068 - - -
1.4677 5740 0.7243 - - -
1.4702 5750 0.7431 - - -
1.4728 5760 0.4467 - - -
1.4753 5770 0.4034 - - -
1.4779 5780 0.3927 - - -
1.4804 5790 0.6301 - - -
1.4830 5800 0.4042 - - -
1.4856 5810 0.5382 - - -
1.4881 5820 0.4954 - - -
1.4907 5830 0.4905 - - -
1.4932 5840 0.4893 - - -
1.4958 5850 0.41 - - -
1.4983 5860 0.4864 - - -
1.5009 5870 0.4895 - - -
1.5035 5880 0.4969 - - -
1.5060 5890 0.3876 - - -
1.5086 5900 0.4411 - - -
1.5111 5910 0.5686 - - -
1.5137 5920 0.5737 - - -
1.5162 5930 0.547 - - -
1.5188 5940 0.7115 - - -
1.5214 5950 0.4751 - - -
1.5239 5960 0.4127 - - -
1.5265 5970 0.4453 - - -
1.5290 5980 0.4253 - - -
1.5316 5990 0.5254 - - -
1.5341 6000 0.5436 0.6390 1.0603 0.3939
1.5367 6010 0.4197 - - -
1.5392 6020 0.534 - - -
1.5418 6030 0.4965 - - -
1.5444 6040 0.4878 - - -
1.5469 6050 0.4294 - - -
1.5495 6060 0.4437 - - -
1.5520 6070 0.3917 - - -
1.5546 6080 0.564 - - -
1.5571 6090 0.4542 - - -
1.5597 6100 0.421 - - -
1.5623 6110 0.4846 - - -
1.5648 6120 0.4944 - - -
1.5674 6130 0.4726 - - -
1.5699 6140 0.496 - - -
1.5725 6150 0.4234 - - -
1.5750 6160 0.3752 - - -
1.5776 6170 0.6108 - - -
1.5802 6180 0.4439 - - -
1.5827 6190 0.4217 - - -
1.5853 6200 0.4975 - - -
1.5878 6210 0.5268 - - -
1.5904 6220 0.5703 - - -
1.5929 6230 0.4639 - - -
1.5955 6240 0.5542 - - -
1.5981 6250 0.4998 - - -
1.6006 6260 0.44 - - -
1.6032 6270 0.5566 - - -
1.6057 6280 0.5598 - - -
1.6083 6290 0.4352 - - -
1.6108 6300 0.4595 - - -
1.6134 6310 0.5487 - - -
1.6160 6320 0.3698 - - -
1.6185 6330 0.567 - - -
1.6211 6340 0.4251 - - -
1.6236 6350 0.4386 - - -
1.6262 6360 0.3881 - - -
1.6287 6370 0.4287 - - -
1.6313 6380 0.407 - - -
1.6339 6390 0.4974 - - -
1.6364 6400 0.4002 - - -
1.6390 6410 0.4267 - - -
1.6415 6420 0.323 - - -
1.6441 6430 0.3948 - - -
1.6466 6440 0.5634 - - -
1.6492 6450 0.5853 - - -
1.6518 6460 0.4512 - - -
1.6543 6470 0.4345 - - -
1.6569 6480 0.3606 - - -
1.6594 6490 0.4674 - - -
1.6620 6500 0.4098 - - -
1.6645 6510 0.6015 - - -
1.6671 6520 0.3807 - - -
1.6696 6530 0.3343 - - -
1.6722 6540 0.4308 - - -
1.6748 6550 0.3871 - - -
1.6773 6560 0.4281 - - -
1.6799 6570 0.4097 - - -
1.6824 6580 0.4772 - - -
1.6850 6590 0.3188 - - -
1.6875 6600 0.449 - - -
1.6901 6610 0.3407 - - -
1.6927 6620 0.5092 - - -
1.6952 6630 0.4524 - - -
1.6978 6640 0.4458 - - -
1.7003 6650 0.4102 - - -
1.7029 6660 0.4368 - - -
1.7054 6670 0.417 - - -
1.7080 6680 0.4124 - - -
1.7106 6690 0.4035 - - -
1.7131 6700 0.4384 - - -
1.7157 6710 0.5228 - - -
1.7182 6720 0.3778 - - -
1.7208 6730 0.368 - - -
1.7233 6740 0.3979 - - -
1.7259 6750 0.3965 - - -
1.7285 6760 0.3225 - - -
1.7310 6770 0.3861 - - -
1.7336 6780 0.3819 - - -
1.7361 6790 0.3794 - - -
1.7387 6800 0.3851 - - -
1.7412 6810 0.3215 - - -
1.7438 6820 0.4943 - - -
1.7464 6830 0.3747 - - -
1.7489 6840 0.4361 - - -
1.7515 6850 0.4372 - - -
1.7540 6860 0.3805 - - -
1.7566 6870 0.3455 - - -
1.7591 6880 0.4322 - - -
1.7617 6890 0.3251 - - -
1.7643 6900 0.4783 - - -
1.7668 6910 0.3606 - - -
1.7694 6920 0.4204 - - -
1.7719 6930 0.4985 - - -
1.7745 6940 0.3783 - - -
1.7770 6950 0.3711 - - -
1.7796 6960 0.4081 - - -
1.7822 6970 0.4105 - - -
1.7847 6980 0.3585 - - -
1.7873 6990 0.3371 - - -
1.7898 7000 0.384 0.6367 1.1045 0.4125
1.7924 7010 0.4205 - - -
1.7949 7020 0.3591 - - -
1.7975 7030 0.4315 - - -
1.8001 7040 0.4607 - - -
1.8026 7050 0.4628 - - -
1.8052 7060 0.407 - - -
1.8077 7070 0.3853 - - -
1.8103 7080 0.3309 - - -
1.8128 7090 0.4583 - - -
1.8154 7100 0.3469 - - -
1.8179 7110 0.4405 - - -
1.8205 7120 0.488 - - -
1.8231 7130 0.3955 - - -
1.8256 7140 0.4108 - - -
1.8282 7150 0.3444 - - -
1.8307 7160 0.3279 - - -
1.8333 7170 0.4007 - - -
1.8358 7180 0.4486 - - -
1.8384 7190 0.3288 - - -
1.8410 7200 0.5559 - - -
1.8435 7210 0.5456 - - -
1.8461 7220 0.3915 - - -
1.8486 7230 0.4971 - - -
1.8512 7240 0.363 - - -
1.8537 7250 0.4358 - - -
1.8563 7260 0.5693 - - -
1.8589 7270 0.4443 - - -
1.8614 7280 0.4407 - - -
1.8640 7290 0.4424 - - -
1.8665 7300 0.5502 - - -
1.8691 7310 0.4879 - - -
1.8716 7320 0.5395 - - -
1.8742 7330 0.4694 - - -
1.8768 7340 0.4322 - - -
1.8793 7350 0.4352 - - -
1.8819 7360 0.5727 - - -
1.8844 7370 0.4332 - - -
1.8870 7380 0.4912 - - -
1.8895 7390 0.4102 - - -
1.8921 7400 0.5104 - - -
1.8947 7410 0.7786 - - -
1.8972 7420 0.3641 - - -
1.8998 7430 0.3758 - - -
1.9023 7440 0.4227 - - -
1.9049 7450 0.4766 - - -
1.9074 7460 0.377 - - -
1.9100 7470 0.4256 - - -
1.9126 7480 0.3535 - - -
1.9151 7490 0.3973 - - -
1.9177 7500 0.4426 - - -
1.9202 7510 0.6638 - - -
1.9228 7520 0.4755 - - -
1.9253 7530 0.4941 - - -
1.9279 7540 0.4356 - - -
1.9305 7550 0.517 - - -
1.9330 7560 0.3946 - - -
1.9356 7570 0.5079 - - -
1.9381 7580 0.4621 - - -
1.9407 7590 0.4411 - - -
1.9432 7600 0.4606 - - -
1.9458 7610 0.4993 - - -
1.9484 7620 0.4515 - - -
1.9509 7630 0.5787 - - -
1.9535 7640 0.5409 - - -
1.9560 7650 0.5766 - - -
1.9586 7660 0.5337 - - -
1.9611 7670 0.4367 - - -
1.9637 7680 0.4695 - - -
1.9662 7690 0.4955 - - -
1.9688 7700 0.4655 - - -
1.9714 7710 0.6728 - - -
1.9739 7720 0.7257 - - -
1.9765 7730 0.6987 - - -
1.9790 7740 0.3387 - - -
1.9816 7750 0.6056 - - -
1.9841 7760 0.4411 - - -
1.9867 7770 0.4848 - - -
1.9893 7780 0.4467 - - -
1.9918 7790 0.5241 - - -
1.9944 7800 0.5792 - - -
1.9969 7810 0.5933 - - -
1.9995 7820 0.4172 - - -
2.0020 7830 0.4658 - - -
2.0046 7840 0.4805 - - -
2.0072 7850 0.3559 - - -
2.0097 7860 0.4412 - - -
2.0123 7870 0.4594 - - -
2.0148 7880 0.42 - - -
2.0174 7890 0.4097 - - -
2.0199 7900 0.349 - - -
2.0225 7910 0.3457 - - -
2.0251 7920 0.3435 - - -
2.0276 7930 0.3162 - - -
2.0302 7940 0.2859 - - -
2.0327 7950 0.3443 - - -
2.0353 7960 0.4115 - - -
2.0378 7970 0.3435 - - -
2.0404 7980 0.4581 - - -
2.0430 7990 0.3455 - - -
2.0455 8000 0.3615 0.5995 0.9967 0.3965
2.0481 8010 0.371 - - -
2.0506 8020 0.3077 - - -
2.0532 8030 0.4178 - - -
2.0557 8040 0.3805 - - -
2.0583 8050 0.3606 - - -
2.0609 8060 0.3909 - - -
2.0634 8070 0.3757 - - -
2.0660 8080 0.5127 - - -
2.0685 8090 0.405 - - -
2.0711 8100 0.521 - - -
2.0736 8110 0.4128 - - -
2.0762 8120 0.3774 - - -
2.0788 8130 0.482 - - -
2.0813 8140 0.4701 - - -
2.0839 8150 0.404 - - -
2.0864 8160 0.4262 - - -
2.0890 8170 0.523 - - -
2.0915 8180 0.5242 - - -
2.0941 8190 0.4973 - - -
2.0967 8200 0.4716 - - -
2.0992 8210 0.5007 - - -
2.1018 8220 0.5008 - - -
2.1043 8230 0.4949 - - -
2.1069 8240 0.445 - - -
2.1094 8250 0.4623 - - -
2.1120 8260 0.3374 - - -
2.1145 8270 0.3606 - - -
2.1171 8280 0.4374 - - -
2.1197 8290 0.4613 - - -
2.1222 8300 0.4755 - - -
2.1248 8310 0.4307 - - -
2.1273 8320 0.4603 - - -
2.1299 8330 0.5282 - - -
2.1324 8340 0.4861 - - -
2.1350 8350 0.5212 - - -
2.1376 8360 0.4177 - - -
2.1401 8370 0.3925 - - -
2.1427 8380 0.3921 - - -
2.1452 8390 0.4105 - - -
2.1478 8400 0.4798 - - -
2.1503 8410 0.5346 - - -
2.1529 8420 0.5542 - - -
2.1555 8430 0.4403 - - -
2.1580 8440 0.4076 - - -
2.1606 8450 0.681 - - -
2.1631 8460 0.6889 - - -
2.1657 8470 0.4838 - - -
2.1682 8480 0.6886 - - -
2.1708 8490 0.4808 - - -
2.1734 8500 0.5123 - - -
2.1759 8510 0.5152 - - -
2.1785 8520 0.7062 - - -
2.1810 8530 0.7101 - - -
2.1836 8540 0.3942 - - -
2.1861 8550 0.318 - - -
2.1887 8560 0.3881 - - -
2.1913 8570 0.4325 - - -
2.1938 8580 0.4413 - - -
2.1964 8590 0.3022 - - -
2.1989 8600 0.3592 - - -
2.2015 8610 0.5815 - - -
2.2040 8620 0.4044 - - -
2.2066 8630 0.413 - - -
2.2092 8640 0.4562 - - -
2.2117 8650 0.568 - - -
2.2143 8660 0.3666 - - -
2.2168 8670 0.4769 - - -
2.2194 8680 0.5072 - - -
2.2219 8690 0.3284 - - -
2.2245 8700 0.3666 - - -
2.2271 8710 0.3458 - - -
2.2296 8720 0.5223 - - -
2.2322 8730 0.5417 - - -
2.2347 8740 0.5569 - - -
2.2373 8750 0.3708 - - -
2.2398 8760 0.5413 - - -
2.2424 8770 0.4693 - - -
2.2450 8780 0.511 - - -
2.2475 8790 0.5534 - - -
2.2501 8800 0.5845 - - -
2.2526 8810 0.4561 - - -
2.2552 8820 0.4203 - - -
2.2577 8830 0.4074 - - -
2.2603 8840 0.3462 - - -
2.2628 8850 0.3109 - - -
2.2654 8860 0.3785 - - -
2.2680 8870 0.4591 - - -
2.2705 8880 0.4477 - - -
2.2731 8890 0.3638 - - -
2.2756 8900 0.5596 - - -
2.2782 8910 0.5428 - - -
2.2807 8920 0.504 - - -
2.2833 8930 0.3959 - - -
2.2859 8940 0.5195 - - -
2.2884 8950 0.378 - - -
2.2910 8960 0.3662 - - -
2.2935 8970 0.3245 - - -
2.2961 8980 0.344 - - -
2.2986 8990 0.5692 - - -
2.3012 9000 0.3626 0.5766 1.0086 0.3859
2.3038 9010 0.415 - - -
2.3063 9020 0.3807 - - -
2.3089 9030 0.3681 - - -
2.3114 9040 0.3982 - - -
2.3140 9050 0.2995 - - -
2.3165 9060 0.4793 - - -
2.3191 9070 0.4809 - - -
2.3217 9080 0.3722 - - -
2.3242 9090 0.414 - - -
2.3268 9100 0.285 - - -
2.3293 9110 0.4171 - - -
2.3319 9120 0.4916 - - -
2.3344 9130 0.3866 - - -
2.3370 9140 0.3745 - - -
2.3396 9150 0.3166 - - -
2.3421 9160 0.3759 - - -
2.3447 9170 0.4616 - - -
2.3472 9180 0.3187 - - -
2.3498 9190 0.3177 - - -
2.3523 9200 0.3989 - - -
2.3549 9210 0.4651 - - -
2.3575 9220 0.4467 - - -
2.3600 9230 0.5328 - - -
2.3626 9240 0.4087 - - -
2.3651 9250 0.3977 - - -
2.3677 9260 0.461 - - -
2.3702 9270 0.3607 - - -
2.3728 9280 0.3894 - - -
2.3754 9290 0.4252 - - -
2.3779 9300 0.3879 - - -
2.3805 9310 0.4413 - - -
2.3830 9320 0.4101 - - -
2.3856 9330 0.4103 - - -
2.3881 9340 0.4215 - - -
2.3907 9350 0.4526 - - -
2.3932 9360 0.4038 - - -
2.3958 9370 0.3704 - - -
2.3984 9380 0.3339 - - -
2.4009 9390 0.3188 - - -
2.4035 9400 0.3621 - - -
2.4060 9410 0.3884 - - -
2.4086 9420 0.4302 - - -
2.4111 9430 0.3692 - - -
2.4137 9440 0.4398 - - -
2.4163 9450 0.5208 - - -
2.4188 9460 0.4851 - - -
2.4214 9470 0.5485 - - -
2.4239 9480 0.3561 - - -
2.4265 9490 0.3313 - - -
2.4290 9500 0.3709 - - -
2.4316 9510 0.5086 - - -
2.4342 9520 0.6388 - - -
2.4367 9530 0.5319 - - -
2.4393 9540 0.5017 - - -
2.4418 9550 0.5122 - - -
2.4444 9560 0.4663 - - -
2.4469 9570 0.5166 - - -
2.4495 9580 0.5754 - - -
2.4521 9590 0.4125 - - -
2.4546 9600 0.5505 - - -
2.4572 9610 0.608 - - -
2.4597 9620 0.2824 - - -
2.4623 9630 0.3705 - - -
2.4648 9640 0.4262 - - -
2.4674 9650 0.5177 - - -
2.4700 9660 0.7475 - - -
2.4725 9670 0.3571 - - -
2.4751 9680 0.3018 - - -
2.4776 9690 0.3029 - - -
2.4802 9700 0.5481 - - -
2.4827 9710 0.3215 - - -
2.4853 9720 0.4501 - - -
2.4879 9730 0.368 - - -
2.4904 9740 0.3965 - - -
2.4930 9750 0.3616 - - -
2.4955 9760 0.3481 - - -
2.4981 9770 0.3987 - - -
2.5006 9780 0.4095 - - -
2.5032 9790 0.3823 - - -
2.5058 9800 0.32 - - -
2.5083 9810 0.3646 - - -
2.5109 9820 0.4748 - - -
2.5134 9830 0.448 - - -
2.5160 9840 0.4181 - - -
2.5185 9850 0.6087 - - -
2.5211 9860 0.4105 - - -
2.5237 9870 0.3461 - - -
2.5262 9880 0.3705 - - -
2.5288 9890 0.3495 - - -
2.5313 9900 0.3762 - - -
2.5339 9910 0.4793 - - -
2.5364 9920 0.321 - - -
2.5390 9930 0.4528 - - -
2.5415 9940 0.3923 - - -
2.5441 9950 0.4133 - - -
2.5467 9960 0.3466 - - -
2.5492 9970 0.3543 - - -
2.5518 9980 0.3105 - - -
2.5543 9990 0.4383 - - -
2.5569 10000 0.3853 0.5576 0.9768 0.3844
2.5594 10010 0.3365 - - -
2.5620 10020 0.4235 - - -
2.5646 10030 0.3991 - - -
2.5671 10040 0.368 - - -
2.5697 10050 0.3741 - - -
2.5722 10060 0.3664 - - -
2.5748 10070 0.3373 - - -
2.5773 10080 0.4494 - - -
2.5799 10090 0.3406 - - -
2.5825 10100 0.3924 - - -
2.5850 10110 0.4106 - - -
2.5876 10120 0.4308 - - -
2.5901 10130 0.4663 - - -
2.5927 10140 0.397 - - -
2.5952 10150 0.4721 - - -
2.5978 10160 0.3843 - - -
2.6004 10170 0.4077 - - -
2.6029 10180 0.4349 - - -
2.6055 10190 0.403 - - -
2.6080 10200 0.3812 - - -
2.6106 10210 0.4672 - - -
2.6131 10220 0.4537 - - -
2.6157 10230 0.3054 - - -
2.6183 10240 0.4981 - - -
2.6208 10250 0.2706 - - -
2.6234 10260 0.4327 - - -
2.6259 10270 0.3054 - - -
2.6285 10280 0.3542 - - -
2.6310 10290 0.3326 - - -
2.6336 10300 0.3583 - - -
2.6362 10310 0.3705 - - -
2.6387 10320 0.3033 - - -
2.6413 10330 0.316 - - -
2.6438 10340 0.3221 - - -
2.6464 10350 0.463 - - -
2.6489 10360 0.423 - - -
2.6515 10370 0.4502 - - -
2.6541 10380 0.3719 - - -
2.6566 10390 0.2892 - - -
2.6592 10400 0.3709 - - -
2.6617 10410 0.3599 - - -
2.6643 10420 0.4832 - - -
2.6668 10430 0.3146 - - -
2.6694 10440 0.2736 - - -
2.6720 10450 0.3365 - - -
2.6745 10460 0.3415 - - -
2.6771 10470 0.3137 - - -
2.6796 10480 0.3426 - - -
2.6822 10490 0.4208 - - -
2.6847 10500 0.2482 - - -
2.6873 10510 0.3525 - - -
2.6898 10520 0.3096 - - -
2.6924 10530 0.4376 - - -
2.6950 10540 0.3658 - - -
2.6975 10550 0.3332 - - -
2.7001 10560 0.3587 - - -
2.7026 10570 0.3608 - - -
2.7052 10580 0.3369 - - -
2.7077 10590 0.352 - - -
2.7103 10600 0.3093 - - -
2.7129 10610 0.3219 - - -
2.7154 10620 0.4263 - - -
2.7180 10630 0.3825 - - -
2.7205 10640 0.3072 - - -
2.7231 10650 0.3383 - - -
2.7256 10660 0.3173 - - -
2.7282 10670 0.2722 - - -
2.7308 10680 0.3333 - - -
2.7333 10690 0.319 - - -
2.7359 10700 0.2862 - - -
2.7384 10710 0.2929 - - -
2.7410 10720 0.3193 - - -
2.7435 10730 0.3658 - - -
2.7461 10740 0.3537 - - -
2.7487 10750 0.3628 - - -
2.7512 10760 0.3907 - - -
2.7538 10770 0.2873 - - -
2.7563 10780 0.3095 - - -
2.7589 10790 0.3332 - - -
2.7614 10800 0.3009 - - -
2.7640 10810 0.4039 - - -
2.7666 10820 0.2979 - - -
2.7691 10830 0.356 - - -
2.7717 10840 0.4112 - - -
2.7742 10850 0.3208 - - -
2.7768 10860 0.3217 - - -
2.7793 10870 0.3529 - - -
2.7819 10880 0.3316 - - -
2.7845 10890 0.297 - - -
2.7870 10900 0.2915 - - -
2.7896 10910 0.3027 - - -
2.7921 10920 0.3648 - - -
2.7947 10930 0.2733 - - -
2.7972 10940 0.3859 - - -
2.7998 10950 0.3752 - - -
2.8024 10960 0.4194 - - -
2.8049 10970 0.3408 - - -
2.8075 10980 0.3383 - - -
2.8100 10990 0.2718 - - -
2.8126 11000 0.4014 0.5844 1.0047 0.3911
2.8151 11010 0.2805 - - -
2.8177 11020 0.391 - - -
2.8203 11030 0.4106 - - -
2.8228 11040 0.3353 - - -
2.8254 11050 0.3658 - - -
2.8279 11060 0.2835 - - -
2.8305 11070 0.2463 - - -
2.8330 11080 0.3884 - - -
2.8356 11090 0.3813 - - -
2.8381 11100 0.29 - - -
2.8407 11110 0.445 - - -
2.8433 11120 0.4506 - - -
2.8458 11130 0.3571 - - -
2.8484 11140 0.413 - - -
2.8509 11150 0.3169 - - -
2.8535 11160 0.3587 - - -
2.8560 11170 0.4717 - - -
2.8586 11180 0.3758 - - -
2.8612 11190 0.384 - - -
2.8637 11200 0.2987 - - -
2.8663 11210 0.4913 - - -
2.8688 11220 0.474 - - -
2.8714 11230 0.4519 - - -
2.8739 11240 0.3698 - - -
2.8765 11250 0.389 - - -
2.8791 11260 0.3621 - - -
2.8816 11270 0.5046 - - -
2.8842 11280 0.3724 - - -
2.8867 11290 0.4003 - - -
2.8893 11300 0.3711 - - -
2.8918 11310 0.4393 - - -
2.8944 11320 0.6715 - - -
2.8970 11330 0.3232 - - -
2.8995 11340 0.3302 - - -
2.9021 11350 0.3334 - - -
2.9046 11360 0.4568 - - -
2.9072 11370 0.3423 - - -
2.9097 11380 0.3241 - - -
2.9123 11390 0.3032 - - -
2.9149 11400 0.3338 - - -
2.9174 11410 0.3642 - - -
2.9200 11420 0.5081 - - -
2.9225 11430 0.4581 - - -
2.9251 11440 0.4019 - - -
2.9276 11450 0.4075 - - -
2.9302 11460 0.4445 - - -
2.9328 11470 0.3452 - - -
2.9353 11480 0.4069 - - -
2.9379 11490 0.419 - - -
2.9404 11500 0.37 - - -
2.9430 11510 0.4321 - - -
2.9455 11520 0.3993 - - -
2.9481 11530 0.4222 - - -
2.9507 11540 0.4761 - - -
2.9532 11550 0.4414 - - -
2.9558 11560 0.5994 - - -
2.9583 11570 0.4572 - - -
2.9609 11580 0.3808 - - -
2.9634 11590 0.4309 - - -
2.9660 11600 0.4366 - - -
2.9686 11610 0.4015 - - -
2.9711 11620 0.5918 - - -
2.9737 11630 0.6513 - - -
2.9762 11640 0.6191 - - -
2.9788 11650 0.3044 - - -
2.9813 11660 0.4612 - - -
2.9839 11670 0.4446 - - -
2.9864 11680 0.4184 - - -
2.9890 11690 0.383 - - -
2.9916 11700 0.4387 - - -
2.9941 11710 0.529 - - -
2.9967 11720 0.5059 - - -
2.9992 11730 0.4155 - - -

Framework Versions

  • Python: 3.8.10
  • Sentence Transformers: 3.1.1
  • Transformers: 4.45.2
  • PyTorch: 2.4.1+cu121
  • Accelerate: 1.0.1
  • Datasets: 3.1.0
  • Tokenizers: 0.20.3
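
To reproduce this environment, the same versions can be pinned at install time (a sketch; pick the PyTorch build matching your CUDA version):

pip install sentence-transformers==3.1.1 transformers==4.45.2 accelerate==1.0.1 datasets==3.1.0 tokenizers==0.20.3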

Citation

BibTeX

Sentence Transformers and SoftmaxLoss

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}