SentenceTransformer based on nomic-ai/nomic-embed-text-v1.5
This is a sentence-transformers model finetuned from nomic-ai/nomic-embed-text-v1.5. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: nomic-ai/nomic-embed-text-v1.5
- Maximum Sequence Length: 8192 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
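These properties can be verified once the model is loaded; a minimal check, reusing the same model-id placeholder that appears in the Usage section below:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence_transformers_model_id")
print(model.get_sentence_embedding_dimension())  # 768
print(model.max_seq_length)                      # 8192
```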
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NomicBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
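The Pooling module above is configured for mean pooling (pooling_mode_mean_tokens: True): the sentence embedding is the average of the token embeddings, with padding masked out. A minimal sketch of that computation in plain PyTorch (the function name is illustrative, not a library API):

```python
import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: [batch, seq_len, 768], attention_mask: [batch, seq_len]
    mask = attention_mask.unsqueeze(-1).float()       # [batch, seq_len, 1]
    summed = (token_embeddings * mask).sum(dim=1)     # sum over non-padding tokens
    counts = mask.sum(dim=1).clamp(min=1e-9)          # number of real tokens per sentence
    return summed / counts                            # [batch, 768] sentence embeddings
```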
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")

# Run inference
sentences = [
    '九月辛未太祖曾孫舒國公從式進封安定郡王',
    '九月初二太祖曾孫舒國公從式進封安定郡王',
    '楊難當在漢中大肆燒殺搶劫然後率眾離開了漢中向西返回仇池留下趙溫據守梁州又派他的魏興太守薛健屯駐黃金山',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
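Since the model is configured with cosine similarity, the same encode/similarity pair also covers simple retrieval. A hedged sketch reusing the example sentences above as a one-query corpus search (model.similarity returns a [queries, corpus] score matrix):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence_transformers_model_id")

# Rank modern-Chinese candidates against a classical-Chinese query
query_embedding = model.encode(['九月辛未太祖曾孫舒國公從式進封安定郡王'])
corpus_embeddings = model.encode([
    '九月初二太祖曾孫舒國公從式進封安定郡王',
    '楊難當在漢中大肆燒殺搶劫然後率眾離開了漢中向西返回仇池留下趙溫據守梁州又派他的魏興太守薛健屯駐黃金山',
])

scores = model.similarity(query_embedding, corpus_embeddings)  # shape: [1, 2]
best = scores.argmax().item()
print(best, scores[0, best].item())  # index and cosine score of the closest corpus sentence
```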
Training Details
Training Dataset
Unnamed Dataset
- Size: 756,057 training samples
- Columns: anchor and positive
- Approximate statistics based on the first 1000 samples:

| | anchor | positive |
|---|---|---|
| type | string | string |
| details | min: 4 tokens, mean: 20.76 tokens, max: 199 tokens | min: 4 tokens, mean: 31.48 tokens, max: 602 tokens |
- Samples:

| anchor | positive |
|---|---|
| 虜懷兼弱之威挾廣地之計強兵大眾親自凌殄旍鼓彌年矢石不息 | 魏人懷有兼併弱小的威嚴胸藏拓展土地的計謀強人的軍隊親自出徵侵逼消滅旌旗戰鼓連年出動戰事不停息 |
| 孟子曰 以善服人者未有能服人者也以善養人然後能服天下 | 孟子說 用自己的善良使人們服從的人沒有能使人服從的用善良影響教導人們才能使天下的人們都信服 |
| 開慶初大元兵渡江理宗議遷都平江慶元后諫不可恐搖動民心乃止 | 開慶初年大元朝部隊渡過長江理宗打算遷都到平江慶元皇后勸諫不可遷都深恐動搖民心理宗才作罷 |
- Loss: MultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
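With these parameters, MultipleNegativesRankingLoss is the standard in-batch-negatives cross-entropy: for a batch of n (anchor, positive) pairs, every other positive in the batch acts as a negative for a given anchor. A sketch of the objective, with sim denoting cosine similarity and s = 20 the scale:

```latex
\mathcal{L} = -\frac{1}{n} \sum_{i=1}^{n}
  \log \frac{\exp\big(s \cdot \mathrm{sim}(a_i, p_i)\big)}
            {\sum_{j=1}^{n} \exp\big(s \cdot \mathrm{sim}(a_i, p_j)\big)}
```

This also motivates the no_duplicates batch sampler listed under the training hyperparameters: a duplicated pair inside a batch would turn a genuine positive into a spurious negative.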
Evaluation Dataset
Unnamed Dataset
- Size: 84,007 evaluation samples
- Columns: anchor and positive
- Approximate statistics based on the first 1000 samples:

| | anchor | positive |
|---|---|---|
| type | string | string |
| details | min: 4 tokens, mean: 20.23 tokens, max: 138 tokens | min: 4 tokens, mean: 31.45 tokens, max: 415 tokens |
- Samples:

| anchor | positive |
|---|---|
| 雒陽戶五萬二千八百三十九 | 雒陽有五萬二千八百三十九戶 |
| 拜南青州刺史在任有政績 | 任南青州刺史很有政績 |
| 第六品以下加不得服金釒奠綾錦錦繡七緣綺貂豽裘金叉環鉺及以金校飾器物張絳帳 | 官位在第六品以下的官員再增加不得穿用金鈿綾錦錦繡七緣綺貂鈉皮衣金叉繯餌以及用金裝飾的器物張絳帳等衣服物品 |
- Loss: MultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- num_train_epochs: 1
- warmup_ratio: 0.1
- fp16: True
- load_best_model_at_end: True
- batch_sampler: no_duplicates
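Taken together, these non-default values correspond roughly to the training setup sketched below. This is a reconstruction, not the original training script: the base-model loading (including trust_remote_code=True for NomicBertModel), the tiny placeholder datasets, and the eval/save/logging cadence (inferred from the training logs below) are assumptions, since the actual 756,057-pair dataset is unnamed in this card.

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

# NomicBertModel ships custom modeling code, hence trust_remote_code=True
model = SentenceTransformer("nomic-ai/nomic-embed-text-v1.5", trust_remote_code=True)

# Placeholder (anchor, positive) pairs standing in for the unnamed dataset
train_dataset = Dataset.from_dict({
    "anchor": ["雒陽戶五萬二千八百三十九"],
    "positive": ["雒陽有五萬二千八百三十九戶"],
})
eval_dataset = Dataset.from_dict({
    "anchor": ["拜南青州刺史在任有政績"],
    "positive": ["任南青州刺史很有政績"],
})

loss = MultipleNegativesRankingLoss(model, scale=20.0)

args = SentenceTransformerTrainingArguments(
    output_dir="output",
    num_train_epochs=1,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",
    eval_steps=5000,    # eval cadence inferred from the training logs below
    save_steps=5000,    # must align with eval_steps for load_best_model_at_end
    logging_steps=100,  # logging cadence inferred from the training logs below
    load_best_model_at_end=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no repeated texts within a batch
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
```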
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: True
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: False
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
Training Logs
Epoch | Step | Training Loss | Validation Loss |
---|---|---|---|
0.0021 | 100 | 0.4574 | - |
0.0042 | 200 | 0.4089 | - |
0.0063 | 300 | 0.2872 | - |
0.0085 | 400 | 0.2909 | - |
0.0106 | 500 | 0.3076 | - |
0.0127 | 600 | 0.2958 | - |
0.0148 | 700 | 0.2953 | - |
0.0169 | 800 | 0.31 | - |
0.0190 | 900 | 0.3031 | - |
0.0212 | 1000 | 0.263 | - |
0.0233 | 1100 | 0.27 | - |
0.0254 | 1200 | 0.3107 | - |
0.0275 | 1300 | 0.2453 | - |
0.0296 | 1400 | 0.2487 | - |
0.0317 | 1500 | 0.2332 | - |
0.0339 | 1600 | 0.2708 | - |
0.0360 | 1700 | 0.2731 | - |
0.0381 | 1800 | 0.3102 | - |
0.0402 | 1900 | 0.3385 | - |
0.0423 | 2000 | 0.2802 | - |
0.0444 | 2100 | 0.3348 | - |
0.0466 | 2200 | 0.2527 | - |
0.0487 | 2300 | 0.2916 | - |
0.0508 | 2400 | 0.2671 | - |
0.0529 | 2500 | 0.2187 | - |
0.0550 | 2600 | 0.2624 | - |
0.0571 | 2700 | 0.3061 | - |
0.0593 | 2800 | 0.2439 | - |
0.0614 | 2900 | 0.2831 | - |
0.0635 | 3000 | 0.2948 | - |
0.0656 | 3100 | 0.2828 | - |
0.0677 | 3200 | 0.3079 | - |
0.0698 | 3300 | 0.3194 | - |
0.0720 | 3400 | 0.2768 | - |
0.0741 | 3500 | 0.304 | - |
0.0762 | 3600 | 0.3056 | - |
0.0783 | 3700 | 0.2562 | - |
0.0804 | 3800 | 0.3138 | - |
0.0825 | 3900 | 0.3081 | - |
0.0846 | 4000 | 0.2733 | - |
0.0868 | 4100 | 0.3065 | - |
0.0889 | 4200 | 0.25 | - |
0.0910 | 4300 | 0.3076 | - |
0.0931 | 4400 | 0.2935 | - |
0.0952 | 4500 | 0.2644 | - |
0.0973 | 4600 | 0.2943 | - |
0.0995 | 4700 | 0.316 | - |
0.1016 | 4800 | 0.2616 | - |
0.1037 | 4900 | 0.2985 | - |
0.1058 | 5000 | 0.2962 | 0.2798 |
0.1079 | 5100 | 0.2872 | - |
0.1100 | 5200 | 0.2963 | - |
0.1122 | 5300 | 0.2968 | - |
0.1143 | 5400 | 0.2738 | - |
0.1164 | 5500 | 0.3198 | - |
0.1185 | 5600 | 0.294 | - |
0.1206 | 5700 | 0.3296 | - |
0.1227 | 5800 | 0.2605 | - |
0.1249 | 5900 | 0.3187 | - |
0.1270 | 6000 | 0.2657 | - |
0.1291 | 6100 | 0.3267 | - |
0.1312 | 6200 | 0.3839 | - |
0.1333 | 6300 | 0.3077 | - |
0.1354 | 6400 | 0.205 | - |
0.1376 | 6500 | 0.2839 | - |
0.1397 | 6600 | 0.3037 | - |
0.1418 | 6700 | 0.2694 | - |
0.1439 | 6800 | 0.2956 | - |
0.1460 | 6900 | 0.261 | - |
0.1481 | 7000 | 0.3173 | - |
0.1503 | 7100 | 0.2492 | - |
0.1524 | 7200 | 0.2885 | - |
0.1545 | 7300 | 0.3059 | - |
0.1566 | 7400 | 0.2883 | - |
0.1587 | 7500 | 0.2465 | - |
0.1608 | 7600 | 0.2926 | - |
0.1629 | 7700 | 0.2776 | - |
0.1651 | 7800 | 0.2769 | - |
0.1672 | 7900 | 0.2644 | - |
0.1693 | 8000 | 0.2416 | - |
0.1714 | 8100 | 0.254 | - |
0.1735 | 8200 | 0.2485 | - |
0.1756 | 8300 | 0.3029 | - |
0.1778 | 8400 | 0.2938 | - |
0.1799 | 8500 | 0.2936 | - |
0.1820 | 8600 | 0.2804 | - |
0.1841 | 8700 | 0.2408 | - |
0.1862 | 8800 | 0.2849 | - |
0.1883 | 8900 | 0.2954 | - |
0.1905 | 9000 | 0.2902 | - |
0.1926 | 9100 | 0.2845 | - |
0.1947 | 9200 | 0.3143 | - |
0.1968 | 9300 | 0.2514 | - |
0.1989 | 9400 | 0.2508 | - |
0.2010 | 9500 | 0.2782 | - |
0.2032 | 9600 | 0.291 | - |
0.2053 | 9700 | 0.2464 | - |
0.2074 | 9800 | 0.323 | - |
0.2095 | 9900 | 0.2332 | - |
0.2116 | 10000 | 0.2231 | 0.2521 |
0.2137 | 10100 | 0.245 | - |
0.2159 | 10200 | 0.2883 | - |
0.2180 | 10300 | 0.3097 | - |
0.2201 | 10400 | 0.2303 | - |
0.2222 | 10500 | 0.3194 | - |
0.2243 | 10600 | 0.2836 | - |
0.2264 | 10700 | 0.2727 | - |
0.2286 | 10800 | 0.2542 | - |
0.2307 | 10900 | 0.2708 | - |
0.2328 | 11000 | 0.263 | - |
0.2349 | 11100 | 0.3063 | - |
0.2370 | 11200 | 0.2667 | - |
0.2391 | 11300 | 0.2575 | - |
0.2412 | 11400 | 0.2487 | - |
0.2434 | 11500 | 0.2552 | - |
0.2455 | 11600 | 0.2669 | - |
0.2476 | 11700 | 0.2241 | - |
0.2497 | 11800 | 0.3029 | - |
0.2518 | 11900 | 0.2443 | - |
0.2539 | 12000 | 0.2961 | - |
0.2561 | 12100 | 0.2561 | - |
0.2582 | 12200 | 0.2436 | - |
0.2603 | 12300 | 0.2601 | - |
0.2624 | 12400 | 0.2553 | - |
0.2645 | 12500 | 0.2617 | - |
0.2666 | 12600 | 0.2581 | - |
0.2688 | 12700 | 0.2452 | - |
0.2709 | 12800 | 0.2227 | - |
0.2730 | 12900 | 0.2455 | - |
0.2751 | 13000 | 0.2469 | - |
0.2772 | 13100 | 0.2197 | - |
0.2793 | 13200 | 0.3086 | - |
0.2815 | 13300 | 0.2379 | - |
0.2836 | 13400 | 0.2441 | - |
0.2857 | 13500 | 0.2854 | - |
0.2878 | 13600 | 0.2405 | - |
0.2899 | 13700 | 0.2681 | - |
0.2920 | 13800 | 0.2405 | - |
0.2942 | 13900 | 0.251 | - |
0.2963 | 14000 | 0.2477 | - |
0.2984 | 14100 | 0.231 | - |
0.3005 | 14200 | 0.26 | - |
0.3026 | 14300 | 0.2395 | - |
0.3047 | 14400 | 0.2296 | - |
0.3069 | 14500 | 0.2554 | - |
0.3090 | 14600 | 0.2434 | - |
0.3111 | 14700 | 0.2247 | - |
0.3132 | 14800 | 0.267 | - |
0.3153 | 14900 | 0.2212 | - |
0.3174 | 15000 | 0.2744 | 0.2352 |
0.3195 | 15100 | 0.2168 | - |
0.3217 | 15200 | 0.2042 | - |
0.3238 | 15300 | 0.2187 | - |
0.3259 | 15400 | 0.2368 | - |
0.3280 | 15500 | 0.2693 | - |
0.3301 | 15600 | 0.255 | - |
0.3322 | 15700 | 0.2398 | - |
0.3344 | 15800 | 0.247 | - |
0.3365 | 15900 | 0.2431 | - |
0.3386 | 16000 | 0.2349 | - |
0.3407 | 16100 | 0.212 | - |
0.3428 | 16200 | 0.2875 | - |
0.3449 | 16300 | 0.2571 | - |
0.3471 | 16400 | 0.2513 | - |
0.3492 | 16500 | 0.2729 | - |
0.3513 | 16600 | 0.2755 | - |
0.3534 | 16700 | 0.2079 | - |
0.3555 | 16800 | 0.1997 | - |
0.3576 | 16900 | 0.2217 | - |
0.3598 | 17000 | 0.1887 | - |
0.3619 | 17100 | 0.2623 | - |
0.3640 | 17200 | 0.2049 | - |
0.3661 | 17300 | 0.2 | - |
0.3682 | 17400 | 0.2367 | - |
0.3703 | 17500 | 0.2368 | - |
0.3725 | 17600 | 0.2311 | - |
0.3746 | 17700 | 0.2359 | - |
0.3767 | 17800 | 0.2586 | - |
0.3788 | 17900 | 0.2222 | - |
0.3809 | 18000 | 0.2561 | - |
0.3830 | 18100 | 0.2246 | - |
0.3852 | 18200 | 0.1871 | - |
0.3873 | 18300 | 0.2147 | - |
0.3894 | 18400 | 0.2741 | - |
0.3915 | 18500 | 0.2079 | - |
0.3936 | 18600 | 0.2399 | - |
0.3957 | 18700 | 0.2375 | - |
0.3978 | 18800 | 0.2502 | - |
0.4000 | 18900 | 0.2385 | - |
0.4021 | 19000 | 0.2647 | - |
0.4042 | 19100 | 0.1847 | - |
0.4063 | 19200 | 0.2367 | - |
0.4084 | 19300 | 0.2148 | - |
0.4105 | 19400 | 0.1826 | - |
0.4127 | 19500 | 0.225 | - |
0.4148 | 19600 | 0.2415 | - |
0.4169 | 19700 | 0.2998 | - |
0.4190 | 19800 | 0.2435 | - |
0.4211 | 19900 | 0.2283 | - |
0.4232 | 20000 | 0.2782 | 0.2263 |
0.4254 | 20100 | 0.2786 | - |
0.4275 | 20200 | 0.2695 | - |
0.4296 | 20300 | 0.2112 | - |
0.4317 | 20400 | 0.2006 | - |
0.4338 | 20500 | 0.2031 | - |
0.4359 | 20600 | 0.2335 | - |
0.4381 | 20700 | 0.2154 | - |
0.4402 | 20800 | 0.2225 | - |
0.4423 | 20900 | 0.2234 | - |
0.4444 | 21000 | 0.2233 | - |
0.4465 | 21100 | 0.1851 | - |
0.4486 | 21200 | 0.2009 | - |
0.4508 | 21300 | 0.2337 | - |
0.4529 | 21400 | 0.2175 | - |
0.4550 | 21500 | 0.2564 | - |
0.4571 | 21600 | 0.205 | - |
0.4592 | 21700 | 0.233 | - |
0.4613 | 21800 | 0.2027 | - |
0.4635 | 21900 | 0.209 | - |
0.4656 | 22000 | 0.261 | - |
0.4677 | 22100 | 0.1755 | - |
0.4698 | 22200 | 0.2219 | - |
0.4719 | 22300 | 0.2108 | - |
0.4740 | 22400 | 0.212 | - |
0.4762 | 22500 | 0.2676 | - |
0.4783 | 22600 | 0.2314 | - |
0.4804 | 22700 | 0.1838 | - |
0.4825 | 22800 | 0.1967 | - |
0.4846 | 22900 | 0.2412 | - |
0.4867 | 23000 | 0.2203 | - |
0.4888 | 23100 | 0.2183 | - |
0.4910 | 23200 | 0.239 | - |
0.4931 | 23300 | 0.2273 | - |
0.4952 | 23400 | 0.2335 | - |
0.4973 | 23500 | 0.202 | - |
0.4994 | 23600 | 0.2176 | - |
0.5015 | 23700 | 0.2331 | - |
0.5037 | 23800 | 0.1949 | - |
0.5058 | 23900 | 0.2321 | - |
0.5079 | 24000 | 0.2046 | - |
0.5100 | 24100 | 0.2092 | - |
0.5121 | 24200 | 0.2195 | - |
0.5142 | 24300 | 0.2069 | - |
0.5164 | 24400 | 0.2049 | - |
0.5185 | 24500 | 0.2955 | - |
0.5206 | 24600 | 0.2101 | - |
0.5227 | 24700 | 0.2036 | - |
0.5248 | 24800 | 0.2507 | - |
0.5269 | 24900 | 0.2343 | - |
0.5291 | 25000 | 0.2026 | 0.2072 |
0.5312 | 25100 | 0.2288 | - |
0.5333 | 25200 | 0.2208 | - |
0.5354 | 25300 | 0.1914 | - |
0.5375 | 25400 | 0.1903 | - |
0.5396 | 25500 | 0.2156 | - |
0.5418 | 25600 | 0.216 | - |
0.5439 | 25700 | 0.1909 | - |
0.5460 | 25800 | 0.2265 | - |
0.5481 | 25900 | 0.2447 | - |
0.5502 | 26000 | 0.1879 | - |
0.5523 | 26100 | 0.204 | - |
0.5545 | 26200 | 0.2262 | - |
0.5566 | 26300 | 0.2448 | - |
0.5587 | 26400 | 0.1758 | - |
0.5608 | 26500 | 0.2102 | - |
0.5629 | 26600 | 0.2175 | - |
0.5650 | 26700 | 0.2109 | - |
0.5671 | 26800 | 0.202 | - |
0.5693 | 26900 | 0.2075 | - |
0.5714 | 27000 | 0.2021 | - |
0.5735 | 27100 | 0.1799 | - |
0.5756 | 27200 | 0.2084 | - |
0.5777 | 27300 | 0.2114 | - |
0.5798 | 27400 | 0.1851 | - |
0.5820 | 27500 | 0.22 | - |
0.5841 | 27600 | 0.181 | - |
0.5862 | 27700 | 0.2276 | - |
0.5883 | 27800 | 0.1944 | - |
0.5904 | 27900 | 0.1907 | - |
0.5925 | 28000 | 0.2176 | - |
0.5947 | 28100 | 0.2243 | - |
0.5968 | 28200 | 0.2191 | - |
0.5989 | 28300 | 0.2215 | - |
0.6010 | 28400 | 0.1769 | - |
0.6031 | 28500 | 0.1971 | - |
0.6052 | 28600 | 0.179 | - |
0.6074 | 28700 | 0.2308 | - |
0.6095 | 28800 | 0.2453 | - |
0.6116 | 28900 | 0.2293 | - |
0.6137 | 29000 | 0.2191 | - |
0.6158 | 29100 | 0.1988 | - |
0.6179 | 29200 | 0.1878 | - |
0.6201 | 29300 | 0.2215 | - |
0.6222 | 29400 | 0.2188 | - |
0.6243 | 29500 | 0.1821 | - |
0.6264 | 29600 | 0.1856 | - |
0.6285 | 29700 | 0.1907 | - |
0.6306 | 29800 | 0.1999 | - |
0.6328 | 29900 | 0.1803 | - |
0.6349 | 30000 | 0.201 | 0.1948 |
0.6370 | 30100 | 0.179 | - |
0.6391 | 30200 | 0.2073 | - |
0.6412 | 30300 | 0.2676 | - |
0.6433 | 30400 | 0.1824 | - |
0.6454 | 30500 | 0.1995 | - |
0.6476 | 30600 | 0.2097 | - |
0.6497 | 30700 | 0.2421 | - |
0.6518 | 30800 | 0.1745 | - |
0.6539 | 30900 | 0.2682 | - |
0.6560 | 31000 | 0.1892 | - |
0.6581 | 31100 | 0.2054 | - |
0.6603 | 31200 | 0.23 | - |
0.6624 | 31300 | 0.1711 | - |
0.6645 | 31400 | 0.2163 | - |
0.6666 | 31500 | 0.196 | - |
0.6687 | 31600 | 0.1746 | - |
0.6708 | 31700 | 0.2402 | - |
0.6730 | 31800 | 0.2096 | - |
0.6751 | 31900 | 0.1934 | - |
0.6772 | 32000 | 0.2021 | - |
0.6793 | 32100 | 0.1942 | - |
0.6814 | 32200 | 0.2076 | - |
0.6835 | 32300 | 0.1662 | - |
0.6857 | 32400 | 0.1777 | - |
0.6878 | 32500 | 0.1899 | - |
0.6899 | 32600 | 0.2253 | - |
0.6920 | 32700 | 0.221 | - |
0.6941 | 32800 | 0.1797 | - |
0.6962 | 32900 | 0.1884 | - |
0.6984 | 33000 | 0.2185 | - |
0.7005 | 33100 | 0.193 | - |
0.7026 | 33200 | 0.1975 | - |
0.7047 | 33300 | 0.1774 | - |
0.7068 | 33400 | 0.1709 | - |
0.7089 | 33500 | 0.1753 | - |
0.7111 | 33600 | 0.1834 | - |
0.7132 | 33700 | 0.1853 | - |
0.7153 | 33800 | 0.2155 | - |
0.7174 | 33900 | 0.1837 | - |
0.7195 | 34000 | 0.1655 | - |
0.7216 | 34100 | 0.212 | - |
0.7237 | 34200 | 0.2203 | - |
0.7259 | 34300 | 0.2267 | - |
0.7280 | 34400 | 0.208 | - |
0.7301 | 34500 | 0.1545 | - |
0.7322 | 34600 | 0.2003 | - |
0.7343 | 34700 | 0.2058 | - |
0.7364 | 34800 | 0.1837 | - |
0.7386 | 34900 | 0.2199 | - |
0.7407 | 35000 | 0.1931 | 0.1848 |
0.7428 | 35100 | 0.2456 | - |
0.7449 | 35200 | 0.1996 | - |
0.7470 | 35300 | 0.2145 | - |
0.7491 | 35400 | 0.1915 | - |
0.7513 | 35500 | 0.1734 | - |
0.7534 | 35600 | 0.19 | - |
0.7555 | 35700 | 0.182 | - |
0.7576 | 35800 | 0.1808 | - |
0.7597 | 35900 | 0.1625 | - |
0.7618 | 36000 | 0.1813 | - |
0.7640 | 36100 | 0.1412 | - |
0.7661 | 36200 | 0.2279 | - |
0.7682 | 36300 | 0.2444 | - |
0.7703 | 36400 | 0.1882 | - |
0.7724 | 36500 | 0.1731 | - |
0.7745 | 36600 | 0.1794 | - |
0.7767 | 36700 | 0.2577 | - |
0.7788 | 36800 | 0.169 | - |
0.7809 | 36900 | 0.1725 | - |
0.7830 | 37000 | 0.1788 | - |
0.7851 | 37100 | 0.1783 | - |
0.7872 | 37200 | 0.1764 | - |
0.7894 | 37300 | 0.1616 | - |
0.7915 | 37400 | 0.21 | - |
0.7936 | 37500 | 0.2091 | - |
0.7957 | 37600 | 0.1107 | - |
0.7978 | 37700 | 0.1773 | - |
0.7999 | 37800 | 0.1801 | - |
0.8020 | 37900 | 0.1621 | - |
0.8042 | 38000 | 0.189 | - |
0.8063 | 38100 | 0.182 | - |
0.8084 | 38200 | 0.1912 | - |
0.8105 | 38300 | 0.1731 | - |
0.8126 | 38400 | 0.1646 | - |
0.8147 | 38500 | 0.2037 | - |
0.8169 | 38600 | 0.1418 | - |
0.8190 | 38700 | 0.1485 | - |
0.8211 | 38800 | 0.2221 | - |
0.8232 | 38900 | 0.1886 | - |
0.8253 | 39000 | 0.2082 | - |
0.8274 | 39100 | 0.1742 | - |
0.8296 | 39200 | 0.1589 | - |
0.8317 | 39300 | 0.1959 | - |
0.8338 | 39400 | 0.1517 | - |
0.8359 | 39500 | 0.2049 | - |
0.8380 | 39600 | 0.2187 | - |
0.8401 | 39700 | 0.1801 | - |
0.8423 | 39800 | 0.1735 | - |
0.8444 | 39900 | 0.1881 | - |
0.8465 | 40000 | 0.1778 | 0.1787 |
0.8486 | 40100 | 0.1898 | - |
0.8507 | 40200 | 0.2021 | - |
0.8528 | 40300 | 0.1972 | - |
0.8550 | 40400 | 0.156 | - |
0.8571 | 40500 | 0.1791 | - |
0.8592 | 40600 | 0.188 | - |
0.8613 | 40700 | 0.2177 | - |
0.8634 | 40800 | 0.1287 | - |
0.8655 | 40900 | 0.1797 | - |
0.8677 | 41000 | 0.1533 | - |
0.8698 | 41100 | 0.1668 | - |
0.8719 | 41200 | 0.2047 | - |
0.8740 | 41300 | 0.1619 | - |
0.8761 | 41400 | 0.165 | - |
0.8782 | 41500 | 0.1781 | - |
0.8803 | 41600 | 0.2221 | - |
0.8825 | 41700 | 0.2031 | - |
0.8846 | 41800 | 0.1732 | - |
0.8867 | 41900 | 0.1599 | - |
0.8888 | 42000 | 0.1865 | - |
0.8909 | 42100 | 0.1367 | - |
0.8930 | 42200 | 0.1469 | - |
0.8952 | 42300 | 0.1777 | - |
0.8973 | 42400 | 0.1833 | - |
0.8994 | 42500 | 0.2102 | - |
0.9015 | 42600 | 0.164 | - |
0.9036 | 42700 | 0.1752 | - |
0.9057 | 42800 | 0.2186 | - |
0.9079 | 42900 | 0.1824 | - |
0.9100 | 43000 | 0.1796 | - |
0.9121 | 43100 | 0.1626 | - |
0.9142 | 43200 | 0.1623 | - |
0.9163 | 43300 | 0.2036 | - |
0.9184 | 43400 | 0.1365 | - |
0.9206 | 43500 | 0.1792 | - |
0.9227 | 43600 | 0.1583 | - |
0.9248 | 43700 | 0.1943 | - |
0.9269 | 43800 | 0.1931 | - |
0.9290 | 43900 | 0.1777 | - |
0.9311 | 44000 | 0.1633 | - |
0.9333 | 44100 | 0.1841 | - |
0.9354 | 44200 | 0.1674 | - |
0.9375 | 44300 | 0.1958 | - |
0.9396 | 44400 | 0.1831 | - |
0.9417 | 44500 | 0.1899 | - |
0.9438 | 44600 | 0.177 | - |
0.9460 | 44700 | 0.1881 | - |
0.9481 | 44800 | 0.1643 | - |
0.9502 | 44900 | 0.1462 | - |
**0.9523** | **45000** | **0.2118** | **0.1719** |
0.9544 | 45100 | 0.1655 | - |
0.9565 | 45200 | 0.1567 | - |
0.9586 | 45300 | 0.1429 | - |
0.9608 | 45400 | 0.1718 | - |
0.9629 | 45500 | 0.1549 | - |
0.9650 | 45600 | 0.1556 | - |
0.9671 | 45700 | 0.1323 | - |
0.9692 | 45800 | 0.1988 | - |
0.9713 | 45900 | 0.15 | - |
0.9735 | 46000 | 0.1546 | - |
0.9756 | 46100 | 0.1472 | - |
0.9777 | 46200 | 0.196 | - |
0.9798 | 46300 | 0.1913 | - |
0.9819 | 46400 | 0.2261 | - |
0.9840 | 46500 | 0.1842 | - |
0.9862 | 46600 | 0.172 | - |
0.9883 | 46700 | 0.1925 | - |
0.9904 | 46800 | 0.1928 | - |
0.9925 | 46900 | 0.1698 | - |
0.9946 | 47000 | 0.1778 | - |
0.9967 | 47100 | 0.1497 | - |
0.9989 | 47200 | 0.1506 | - |
- The bold row denotes the saved checkpoint.
Framework Versions
- Python: 3.12.4
- Sentence Transformers: 3.1.0.dev0
- Transformers: 4.42.4
- PyTorch: 2.3.1+cpu
- Accelerate: 0.32.1
- Datasets: 2.20.0
- Tokenizers: 0.19.1
Citation
BibTeX
Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```