SentenceTransformer based on seongil-dn/unsupervised_20m_2600
This is a sentence-transformers model finetuned from seongil-dn/unsupervised_20m_2600. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: seongil-dn/unsupervised_20m_2600
- Maximum Sequence Length: 1024 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: [Sentence Transformers Documentation](https://sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
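Because pooling takes the CLS token and the trailing Normalize() module rescales every embedding to unit length, cosine similarity and dot product coincide for this model. A quick check (a minimal sketch, assuming the checkpoint downloads as in the usage section below):

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("seongil-dn/bge-m3-2600-steps-672")
emb = model.encode(["an example sentence"])[0]

print(emb.shape)            # (1024,) -- the output dimensionality listed above
print(np.linalg.norm(emb))  # ~1.0, thanks to the trailing Normalize() module
```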
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("seongil-dn/bge-m3-2600-steps-672")

# Run inference
sentences = [
    'united nations high commissioner for human rights address',
    "Office of the United Nations High Commissioner for Human Rights The office is headed by the High Commissioner for Human Rights, who co-ordinates human rights activities throughout the UN System and supervises the Human Rights Council in Geneva, Switzerland. As of 1 September 2014, the current High Commissioner is Prince Zeid bin Ra'ad.[4] The General Assembly approved on 16 June 2014 his appointment by the United Nations Secretary-General. He is the seventh individual to lead the OHCHR and the first Asian, Muslim, Arab, and prince to do so.",
    'The Universal Declaration of Human Rights was drafted from early 1947 to late 1948 by Drafting Committee the first United Nations Commission on Human Rights. Further discussion and amendments were made by the Commission on Human Rights, the Economic and Social Council and the General Assembly of the United Nations.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 1024)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
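The same embeddings also support the other tasks listed above, such as semantic search. A minimal sketch using the library's built-in helper (the corpus and query here are illustrative):

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("seongil-dn/bge-m3-2600-steps-672")

corpus = [
    "Office of the United Nations High Commissioner for Human Rights ...",
    "The Universal Declaration of Human Rights was drafted from early 1947 to late 1948 ...",
]
query = "united nations high commissioner for human rights address"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank corpus entries against the query by cosine similarity
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(round(hit["score"], 4), corpus[hit["corpus_id"]])
```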
Training Details
Training Dataset
Unnamed Dataset
- Size: 1,138,596 training samples
- Columns: anchor, positive, negative, negative_2, negative_3, negative_4, and negative_5
- Approximate statistics based on the first 1000 samples:

| | anchor | positive | negative | negative_2 | negative_3 | negative_4 | negative_5 |
|:--|:--|:--|:--|:--|:--|:--|:--|
| type | string | string | string | string | string | string | string |
| details | min: 5 tokens<br>mean: 35.82 tokens<br>max: 849 tokens | min: 17 tokens<br>mean: 274.93 tokens<br>max: 1024 tokens | min: 5 tokens<br>mean: 332.84 tokens<br>max: 1024 tokens | min: 6 tokens<br>mean: 336.51 tokens<br>max: 1024 tokens | min: 5 tokens<br>mean: 333.56 tokens<br>max: 1024 tokens | min: 6 tokens<br>mean: 335.66 tokens<br>max: 1024 tokens | min: 4 tokens<br>mean: 331.88 tokens<br>max: 1024 tokens |
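This seven-column layout is what the training loss consumes: one anchor, one positive, and five hard negatives per row. A toy illustration of building such a dataset with the datasets library (the actual 1,138,596-row dataset is not published with this card, so the values below are placeholders):

```python
from datasets import Dataset

# One toy row with the same seven columns used for training
train_dataset = Dataset.from_dict({
    "anchor":     ["why do some cultures eat bugs"],
    "positive":   ["Despite the fact that 80% of the world's cultures eat insects ..."],
    "negative":   ["Here are some modern examples of countries that eat bugs. ..."],
    "negative_2": ["Many cultures already ... eat insects ..."],
    "negative_3": ["Most cultures in the world not only eat insects ..."],
    "negative_4": ["In the U.S. and most of the industrialized world ..."],
    "negative_5": ["Indigenous people in many countries eat insects ..."],
})
print(train_dataset.column_names)  # ['anchor', 'positive', 'negative', ..., 'negative_5']
```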
- Samples:

  **Sample 1**
  - anchor: why do some cultures eat bugs
  - positive: Despite the fact that 80% of the world's cultures eat insects (that's right: the US is in the minority here) most people in our culture consider insects simply to be pests. But when you consider the logic of bugs as food, from an ecological, financial, and global perspective, they start to seem a lot more palatable.nd the number one reason to eat insects is…. 1. Insects are a great, inexpensive, green source of the protein desperately needed by starving peoples. If we can help create a market and funding for it, there is the potential to help spread nourishment throughout the planet.
  - negative: Here are some modern examples of countries that eat bugs. In Japan, they say, you can buy cans of baby bees and grasshoppers. In China, try the water beetles in ginger and soy sauce. Some Australians fancy sautéed grubs and naturally sweet honeypot ants.
  - negative_2: Many cultures already elnicud insects 8. eat insects out of coheci 9. eating insects is sdgtsungii 10. the nefetsib of eating bugs 11. the New tntiSicse magazine 12. insects are fsattniac Answers – Synonym Match 1. c 2. h 3. j 4. f 5.d 6. 7. i 8. a 9. e 10.any cultures already elnicud insects 8. eat insects out of coheci 9. eating insects is sdgtsungii 10. the nefetsib of eating bugs 11. the New tntiSicse magazine 12. insects are fsattniac Answers – Synonym Match 1. c 2. h 3. j 4. f 5.
  - negative_3: Most cultures in the world not only eat insects, but in many cases find them to be a delicacy. 5. If insects themselves were deemed a food crop, imagine how much we could cut down on pesticide use, and its associated environmental damage.4.nd the number one reason to eat insects is…. 1. Insects are a great, inexpensive, green source of the protein desperately needed by starving peoples. If we can help create a market and funding for it, there is the potential to help spread nourishment throughout the planet.
  - negative_4: In the U.S. and most of the industrialized world, no one eats insects, but in at least 113 countries people eat and relish bugs. This practice is not likely to catch on in Europe and the U.S., except for the sea-going equivalent of insects, arthropods such as crab, lobster, and shrimp, which are delicacies.
  - negative_5: Indigenous people in many countries eat insects, including consuming the larvae and pupae of bees, mostly stingless bees. They also gather bee brood (the larvae, pupae and surrounding cells) for consumption.

  **Sample 2**
  - anchor: How is anything alive at all if everything is just made of atoms? If someone created an exact replica of all their molecules would that thing be alive?
  - positive: you might try /r/philosophy. the long and short of it is that life is a complex chemical process, and things are alive while that chemical process is functioning. if someone created an exact copy of all of your molecules, and started them functioning exactly as before, then the new copy would "live", regardless of if it was you or not. Life is a verb that chemistry does, just as mind is a verb brain does. this is much more complicated then i am making it seem, and the problem of the transporter still applies (URL_0), but hopefully this is a starting point for you.
  - negative: In addition to the other good answers already here, there's also the fact that not only do we not know how life did originate, we don't yet have any idea how it could have originated. We know that commonplace chemical reactions can create simple molecules called amino acids. These are found in nature all the time, even way out in space. But life isn't made of amino acids. Life is made of proteins, which are vast molecules made up of many, many amino acids meticulously tinker-toyed together. We know — to an extent — how proteins are made now. They're made inside cells … in carefully controlled environments, assembled by hugely complex chains of chemical reactions involving special-purpose molecules which themselves must be synthesized through complex chains of reactions in carefully controlled environments. Figuring out how any of this could have happened "in the wild," so to speak, is one of the great unsolved mysteries.
  - negative_2: Most of our DNA and cell structure is carbon, yes. In each molecule of bio-material, a majority of what you are is carbon by weight. And yes, it is a theory that life could also exist based on other elements like silicon because it creates the same number of bonds in a molecule as carbon does, and so it would behave in a similar way
  - negative_3: matter and anti-matter have the exact same characteristics. if everything around us was made of anti-matter instead of matter, there would be no difference whatsoever
  - negative_4: As far as I’m aware, all DNA for living things is made from ATCG on Earth. There are viruses and such that use single and double stranded RNA which contains uracil instead of thymine, but those are generally not considered to be living things.
  - negative_5: Bacteria, as every other living being, are made of proteins, lipids, nucleic acids and carbohydrates. When the bacteria is alive, they're organized in a "design" (please note the quote marks before attacking me, atheist crowd) that makes them work in a way that the bacteria is able to do all the organic processes involved in "living". The death of the bacteria is simply the irreversible disorganization of the "design" that leads to them not being able to keep living. So yeah, the lipids, carbohydrates, proteins and nucleic acids (aka bacteria corpses) are left behind.
  **Sample 3** (translated from Korean)
  - anchor: In the proposal for the fintech company-support performance-analysis service, what does the "acceptance of requested items" entry under the proposal overview present?
  - positive: Request for proposal (RFP) for the commissioned service "Fintech Company Support Performance Analysis Survey". Section IV, proposal writing and general matters (section / content to write): 1. Proposer profile: general status (state the proposer's main history briefly and clearly); organization and staffing (describe the proposer's organization and personnel). 2. Proposal overview: purpose and scope (state the purpose and scope of the proposed work); participating personnel and CVs (summarize the participants' backgrounds in an overview table and give detailed CVs for each); acceptance of requested items (state whether each requested item is accepted, and propose alternatives for any that are not); the proposer's distinctive strengths (state strengths unique to the proposer). 3. Project execution and management: project content and scope (specify an implementation plan for each work item; specify the execution and reporting structure); staffing and division of work (present the organization, the assigned personnel, resident or non-resident, and each person's responsibilities); schedule plan (detail the work against the project timeline, covering everything mentioned in the RFP, and present countermeasures for delays); deliverable quality management (describe the types, contents, and delivery dates of deliverables; present plans for deliverables, quality management, and schedule management). 4. Other: other suggestions (items beyond the requested ones ...)
  - negative: RFP for the "Non-face-to-face Service Voucher Program Performance Survey" service, 창업진흥원 (Korea Institute of Startup and Entrepreneurship Development) <... Evaluation criteria (category / item / item points / total points): quantitative evaluation (10 points): corporate credit rating, 10. Qualitative evaluation (70 points): general status (15 points): expertise and suitability for this task, 5; research and analysis know-how in the relevant field, 5; organization and capability, 5. Understanding of the task (10 points): conformance of the proposal to the RFP, 5; overall understanding of the purpose and content of the task, 5. Proposal content (10 points): thoroughness and proactiveness of the proposal, 5; logic and specificity of the content, 5. Execution capability (25 points): expertise of the personnel (relevant research and analysis skills and experience), 5; ease of securing task-related information and data, 10; facilities, equipment, and other resources for the task, 5; suitability and efficiency for this task based on similar-project experience, 5.
  - negative_2: RFP, service title "2022 Sports Industry Loan (튼튼론) Business Support Service". Scoring grid (review item / evaluation items / points / grades A-E): business-operation capability (operating capability and the proposer's distinctive strengths): 5 points, graded 5/4/3/2/1; understanding of the task (conformance to the RFP; understanding of the purpose and necessity of the task; soundness of objectives and operating strategy): 20 points, graded 20/17/14/11/8; execution strategy (systematic strategy for achieving the task objectives; rationality of the strategy at each stage of the task): 20 points, graded 20/17/14/11/8; staffing (capability of personnel matching the task purpose and plan; team structure and similar-work experience): 15 points, graded 15/13/11/9/7; information-security management (adequacy of personal-data protection and prevention of deliverable leaks; adequacy of system-security reinforcement): 15 points, graded 15/13/11/9/7; fit with policy objectives (the proposer's distinctive strengths in carrying out this task).
  - negative_3: RFP for the "Digital Transformation Service Satisfaction Survey" project. Evaluation (area / indicator / points): general (20 points): does the proposal accurately cover the requested content? / 10 points; are the execution plan and strategy suited to achieving the project objectives? / 10 points. Execution (30 points): are the detailed activities and methods for each task appropriate? / 15 points; are the approach and schedule clearly presented? / 15 points. Operating capability (30 points): does the proposer have survey capability and an overall understanding of the project? / 20 points; is the plan for deploying available personnel sound? / 10 points. Support (20 points): is there sufficient commitment and proactiveness toward the project? / 20 points. Total: / 100 points.
  - negative_4: RFP for the "Study on Introducing and Promoting the Use of Regional Digital Services". Table of contents (chapter / content): 1. Proposer profile: history and general status (organization, main lines of business, etc.); similar service records and the strengths and weaknesses of the proposal. 2. Scope of service (proposal overview): background and purpose of the proposal; scope of the task, expected effects, and plans for using the results. 3. Execution plan: strategy and methods for the task; team composition and each member's responsibilities; detailed plan (including schedule and procedures). 4. Participating personnel: education, career, certifications, and relevant track record of each participant, in detail. 5. Other notes: confidentiality of information and data and other security-compliance measures; countermeasures for project delays; matters to request of the contracting agency for smooth execution.
  - negative_5: RFP, project title "Rural Groundwater Management System Function Development and Maintenance Service". Rating scale (grade / criteria / weight): A: every requirement is met, and the relevant technology and experience are substantiated with objective evidence; the proposal shows a complete understanding of the requirements and a very high probability of success; there are no flaws or weaknesses; 100%. B: the requirements are mostly met and mostly substantiated with objective evidence; the proposal shows a sound understanding of the requirements and a high probability of success; there are no flaws or weaknesses that cannot be remedied through consultation; 90%. C: the requirements are basically met, but objective evidence of the relevant technology and experience is insufficient; the proposal understands the basics of the requirements and offers a moderate probability of success; flaws and weaknesses exist but can be remedied through consultation; 80%. D: only some of the requirements are met, and no objective evidence of the relevant technology and experience is provided; the proposal shows a low level of understanding, so the probability of success is low; it is uncertain whether the flaws and weaknesses can be remedied; 70%. E: the requirements are not met; the proposal falls far short of the requirements and does not provide the elements needed for success; there are fundamental flaws and weaknesses that cannot be remedied in the current proposal; 60%. F: fails to submit materials proving relevant qualifications, track record, degrees, career, etc. ...
- Loss: CachedGISTEmbedLoss with these parameters:
```
{
    'guide': SentenceTransformer(
        (0): Transformer({'max_seq_length': 1024, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
        (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
        (2): Normalize()
    ),
    'temperature': 0.01
}
```
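In code, a loss configured with these parameters would be constructed roughly as follows (a minimal sketch; the guide checkpoint is not named in this card, so "BAAI/bge-m3" is an assumption based on the guide's BGE-M3-like architecture):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CachedGISTEmbedLoss

model = SentenceTransformer("seongil-dn/unsupervised_20m_2600")  # model being finetuned
guide = SentenceTransformer("BAAI/bge-m3")  # assumed guide; the card only shows its architecture

# GIST-style loss: the guide model filters out in-batch negatives that look like
# positives, and the cached variant embeds in mini-batches so the large (1024)
# batch size fits in memory without holding all activations at once.
loss = CachedGISTEmbedLoss(model, guide=guide, temperature=0.01)
```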
Training Hyperparameters
Non-Default Hyperparameters
- per_device_train_batch_size: 1024
- learning_rate: 3e-05
- weight_decay: 0.01
- warmup_ratio: 0.05
- bf16: True
- batch_sampler: no_duplicates
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: no
- prediction_loss_only: True
- per_device_train_batch_size: 1024
- per_device_eval_batch_size: 8
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 3e-05
- weight_decay: 0.01
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 3
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.05
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: True
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
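Taken together with the non-default values above, the run could be reproduced with something like the following (a minimal sketch, not the author's actual script; the output directory, dataset, and guide model are stand-ins):

```python
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CachedGISTEmbedLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("seongil-dn/unsupervised_20m_2600")
guide = SentenceTransformer("BAAI/bge-m3")  # assumed guide model (see Loss above)
loss = CachedGISTEmbedLoss(model, guide=guide, temperature=0.01)

args = SentenceTransformerTrainingArguments(
    output_dir="bge-m3-finetuned",          # stand-in path
    num_train_epochs=3,
    per_device_train_batch_size=1024,
    learning_rate=3e-5,
    weight_decay=0.01,
    warmup_ratio=0.05,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # matches batch_sampler: no_duplicates
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # the anchor/positive/negative_* dataset described above
    loss=loss,
)
trainer.train()
```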
Training Logs
Epoch Step Training Loss 0.0036 1 1.0035 0.0072 2 1.0013 0.0108 3 0.9976 0.0144 4 0.9315 0.0181 5 0.8939 0.0217 6 0.925 0.0253 7 0.8758 0.0289 8 0.7911 0.0325 9 0.7999 0.0361 10 0.8074 0.0397 11 0.7831 0.0433 12 0.6904 0.0469 13 0.7032 0.0505 14 0.6597 0.0542 15 0.6446 0.0578 16 0.5892 0.0614 17 0.6108 0.0650 18 0.5843 0.0686 19 0.5928 0.0722 20 0.5435 0.0758 21 0.5387 0.0794 22 0.5645 0.0830 23 0.516 0.0866 24 0.5727 0.0903 25 0.4964 0.0939 26 0.4938 0.0975 27 0.4561 0.1011 28 0.5118 0.1047 29 0.4615 0.1083 30 0.4565 0.1119 31 0.4723 0.1155 32 0.4305 0.1191 33 0.469 0.1227 34 0.4461 0.1264 35 0.4107 0.1300 36 0.4047 0.1336 37 0.4119 0.1372 38 0.4117 0.1408 39 0.4128 0.1444 40 0.3741 0.1480 41 0.4091 0.1516 42 0.3774 0.1552 43 0.3672 0.1588 44 0.3985 0.1625 45 0.3741 0.1661 46 0.3719 0.1697 47 0.3751 0.1733 48 0.3333 0.1769 49 0.3561 0.1805 50 0.3728 0.1841 51 0.3578 0.1877 52 0.3539 0.1913 53 0.3651 0.1949 54 0.3664 0.1986 55 0.3641 0.2022 56 0.3522 0.2058 57 0.3687 0.2094 58 0.3326 0.2130 59 0.3449 0.2166 60 0.3513 0.2202 61 0.3443 0.2238 62 0.3323 0.2274 63 0.325 0.2310 64 0.3063 0.2347 65 0.3194 0.2383 66 0.3148 0.2419 67 0.3394 0.2455 68 0.3245 0.2491 69 0.3182 0.2527 70 0.3137 0.2563 71 0.3357 0.2599 72 0.3388 0.2635 73 0.2944 0.2671 74 0.3074 0.2708 75 0.301 0.2744 76 0.3048 0.2780 77 0.3255 0.2816 78 0.3255 0.2852 79 0.3262 0.2888 80 0.2935 0.2924 81 0.3031 0.2960 82 0.3035 0.2996 83 0.2941 0.3032 84 0.2819 0.3069 85 0.2848 0.3105 86 0.2981 0.3141 87 0.2948 0.3177 88 0.285 0.3213 89 0.2966 0.3249 90 0.2837 0.3285 91 0.3058 0.3321 92 0.2734 0.3357 93 0.2805 0.3394 94 0.2784 0.3430 95 0.2771 0.3466 96 0.2908 0.3502 97 0.2713 0.3538 98 0.264 0.3574 99 0.3089 0.3610 100 0.292 0.3646 101 0.2696 0.3682 102 0.2873 0.3718 103 0.2865 0.3755 104 0.2784 0.3791 105 0.278 0.3827 106 0.2622 0.3863 107 0.279 0.3899 108 0.2762 0.3935 109 0.2783 0.3971 110 0.2719 0.4007 111 0.2812 0.4043 112 0.2635 0.4079 113 0.2675 0.4116 114 0.2736 0.4152 115 0.2606 0.4188 116 0.274 0.4224 117 0.2831 0.4260 118 0.2701 0.4296 119 0.2802 0.4332 120 0.2795 0.4368 121 0.2555 0.4404 122 0.2824 0.4440 123 0.2699 0.4477 124 0.2777 0.4513 125 0.2766 0.4549 126 0.2548 0.4585 127 0.2842 0.4621 128 0.2557 0.4657 129 0.2655 0.4693 130 0.2653 0.4729 131 0.2556 0.4765 132 0.2687 0.4801 133 0.2547 0.4838 134 0.2541 0.4874 135 0.2527 0.4910 136 0.284 0.4946 137 0.2388 0.4982 138 0.2652 0.5018 139 0.2317 0.5054 140 0.2479 0.5090 141 0.2522 0.5126 142 0.2521 0.5162 143 0.2387 0.5199 144 0.2453 0.5235 145 0.2598 0.5271 146 0.2695 0.5307 147 0.2508 0.5343 148 0.2282 0.5379 149 0.251 0.5415 150 0.2337 0.5451 151 0.2514 0.5487 152 0.2514 0.5523 153 0.2492 0.5560 154 0.245 0.5596 155 0.2721 0.5632 156 0.2573 0.5668 157 0.2353 0.5704 158 0.2591 0.5740 159 0.258 0.5776 160 0.2484 0.5812 161 0.2378 0.5848 162 0.2407 0.5884 163 0.2441 0.5921 164 0.2474 0.5957 165 0.2614 0.5993 166 0.2399 0.6029 167 0.2624 0.6065 168 0.2364 0.6101 169 0.2429 0.6137 170 0.2456 0.6173 171 0.216 0.6209 172 0.2554 0.6245 173 0.2438 0.6282 174 0.2592 0.6318 175 0.2357 0.6354 176 0.245 0.6390 177 0.2455 0.6426 178 0.2622 0.6462 179 0.2372 0.6498 180 0.2189 0.6534 181 0.2409 0.6570 182 0.228 0.6606 183 0.2452 0.6643 184 0.2453 0.6679 185 0.2518 0.6715 186 0.2426 0.6751 187 0.2412 0.6787 188 0.2275 0.6823 189 0.2538 0.6859 190 0.253 0.6895 191 0.2305 0.6931 192 0.2408 0.6968 193 0.2476 0.7004 194 0.2117 0.7040 195 0.2421 0.7076 196 0.2543 0.7112 197 0.2086 0.7148 198 0.2258 0.7184 199 0.2371 0.7220 200 0.2311 0.7256 201 0.2325 0.7292 202 0.2291 0.7329 203 0.2329 
0.7365 204 0.2586 0.7401 205 0.2341 0.7437 206 0.2296 0.7473 207 0.2544 0.7509 208 0.2301 0.7545 209 0.2382 0.7581 210 0.2406 0.7617 211 0.2621 0.7653 212 0.2284 0.7690 213 0.2209 0.7726 214 0.2247 0.7762 215 0.2389 0.7798 216 0.2492 0.7834 217 0.2265 0.7870 218 0.2324 0.7906 219 0.2436 0.7942 220 0.2447 0.7978 221 0.2318 0.8014 222 0.2283 0.8051 223 0.2314 0.8087 224 0.2071 0.8123 225 0.227 0.8159 226 0.2243 0.8195 227 0.2135 0.8231 228 0.2359 0.8267 229 0.229 0.8303 230 0.2328 0.8339 231 0.2209 0.8375 232 0.2264 0.8412 233 0.2305 0.8448 234 0.2267 0.8484 235 0.2444 0.8520 236 0.2315 0.8556 237 0.2251 0.8592 238 0.2209 0.8628 239 0.2504 0.8664 240 0.2386 0.8700 241 0.2401 0.8736 242 0.2227 0.8773 243 0.2146 0.8809 244 0.2349 0.8845 245 0.226 0.8881 246 0.2466 0.8917 247 0.2326 0.8953 248 0.2306 0.8989 249 0.2119 0.9025 250 0.2388 0.9061 251 0.2088 0.9097 252 0.2252 0.9134 253 0.2042 0.9170 254 0.2112 0.9206 255 0.1963 0.9242 256 0.2041 0.9278 257 0.208 0.9314 258 0.2105 0.9350 259 0.2075 0.9386 260 0.2154 0.9422 261 0.2195 0.9458 262 0.2124 0.9495 263 0.2144 0.9531 264 0.2019 0.9567 265 0.2365 0.9603 266 0.2514 0.9639 267 0.2073 0.9675 268 0.2042 0.9711 269 0.2271 0.9747 270 0.228 0.9783 271 0.2101 0.9819 272 0.1986 0.9856 273 0.2196 0.9892 274 0.2026 0.9928 275 0.2157 0.9964 276 0.2167 1.0 277 0.211 1.0036 278 0.1597 1.0072 279 0.1624 1.0108 280 0.1833 1.0144 281 0.1767 1.0181 282 0.1682 1.0217 283 0.1547 1.0253 284 0.1681 1.0289 285 0.1655 1.0325 286 0.1552 1.0361 287 0.1762 1.0397 288 0.1555 1.0433 289 0.1733 1.0469 290 0.1728 1.0505 291 0.1573 1.0542 292 0.1615 1.0578 293 0.1627 1.0614 294 0.1462 1.0650 295 0.1638 1.0686 296 0.1568 1.0722 297 0.1511 1.0758 298 0.1577 1.0794 299 0.1603 1.0830 300 0.1699 1.0866 301 0.1641 1.0903 302 0.1671 1.0939 303 0.1671 1.0975 304 0.1529 1.1011 305 0.1706 1.1047 306 0.1627 1.1083 307 0.151 1.1119 308 0.1682 1.1155 309 0.1753 1.1191 310 0.1464 1.1227 311 0.1707 1.1264 312 0.1595 1.1300 313 0.1767 1.1336 314 0.1581 1.1372 315 0.1557 1.1408 316 0.1728 1.1444 317 0.1429 1.1480 318 0.159 1.1516 319 0.1804 1.1552 320 0.1505 1.1588 321 0.1608 1.1625 322 0.1512 1.1661 323 0.1572 1.1697 324 0.173 1.1733 325 0.1645 1.1769 326 0.1692 1.1805 327 0.1604 1.1841 328 0.1422 1.1877 329 0.1613 1.1913 330 0.1525 1.1949 331 0.1541 1.1986 332 0.1601 1.2022 333 0.1505 1.2058 334 0.1463 1.2094 335 0.1594 1.2130 336 0.1533 1.2166 337 0.148 1.2202 338 0.1649 1.2238 339 0.1604 1.2274 340 0.1599 1.2310 341 0.155 1.2347 342 0.1457 1.2383 343 0.1502 1.2419 344 0.1534 1.2455 345 0.1728 1.2491 346 0.1531 1.2527 347 0.1562 1.2563 348 0.1579 1.2599 349 0.155 1.2635 350 0.1489 1.2671 351 0.1585 1.2708 352 0.1647 1.2744 353 0.1522 1.2780 354 0.1736 1.2816 355 0.1569 1.2852 356 0.1526 1.2888 357 0.1568 1.2924 358 0.1489 1.2960 359 0.1547 1.2996 360 0.1696 1.3032 361 0.142 1.3069 362 0.1455 1.3105 363 0.1461 1.3141 364 0.1542 1.3177 365 0.1511 1.3213 366 0.1503 1.3249 367 0.176 1.3285 368 0.156 1.3321 369 0.1483 1.3357 370 0.1553 1.3394 371 0.1581 1.3430 372 0.1354 1.3466 373 0.1554 1.3502 374 0.1508 1.3538 375 0.1491 1.3574 376 0.15 1.3610 377 0.1566 1.3646 378 0.1726 1.3682 379 0.1521 1.3718 380 0.1578 1.3755 381 0.155 1.3791 382 0.154 1.3827 383 0.1533 1.3863 384 0.1484 1.3899 385 0.1555 1.3935 386 0.1524 1.3971 387 0.147 1.4007 388 0.1424 1.4043 389 0.169 1.4079 390 0.1588 1.4116 391 0.1733 1.4152 392 0.1682 1.4188 393 0.1368 1.4224 394 0.1481 1.4260 395 0.1552 1.4296 396 0.1472 1.4332 397 0.1605 1.4368 398 0.1493 1.4404 399 0.151 1.4440 400 0.1483 1.4477 401 0.1388 1.4513 402 
0.1448 1.4549 403 0.1594 1.4585 404 0.1561 1.4621 405 0.1617 1.4657 406 0.152 1.4693 407 0.1563 1.4729 408 0.1592 1.4765 409 0.1501 1.4801 410 0.1407 1.4838 411 0.1567 1.4874 412 0.1403 1.4910 413 0.1497 1.4946 414 0.1534 1.4982 415 0.1441 1.5018 416 0.1671 1.5054 417 0.1493 1.5090 418 0.1494 1.5126 419 0.1519 1.5162 420 0.156 1.5199 421 0.1573 1.5235 422 0.1356 1.5271 423 0.1513 1.5307 424 0.1424 1.5343 425 0.1581 1.5379 426 0.1582 1.5415 427 0.1375 1.5451 428 0.158 1.5487 429 0.1425 1.5523 430 0.1472 1.5560 431 0.1562 1.5596 432 0.1537 1.5632 433 0.1478 1.5668 434 0.1379 1.5704 435 0.1513 1.5740 436 0.1532 1.5776 437 0.1353 1.5812 438 0.1478 1.5848 439 0.1612 1.5884 440 0.1436 1.5921 441 0.1438 1.5957 442 0.1517 1.5993 443 0.1481 1.6029 444 0.1649 1.6065 445 0.1417 1.6101 446 0.175 1.6137 447 0.1288 1.6173 448 0.1593 1.6209 449 0.1364 1.6245 450 0.1578 1.6282 451 0.1454 1.6318 452 0.1356 1.6354 453 0.1563 1.6390 454 0.1552 1.6426 455 0.1498 1.6462 456 0.1292 1.6498 457 0.1426 1.6534 458 0.1391 1.6570 459 0.1635 1.6606 460 0.1436 1.6643 461 0.1616 1.6679 462 0.1439 1.6715 463 0.1531 1.6751 464 0.1574 1.6787 465 0.1536 1.6823 466 0.1437 1.6859 467 0.159 1.6895 468 0.1547 1.6931 469 0.1334 1.6968 470 0.1539 1.7004 471 0.1383 1.7040 472 0.1661 1.7076 473 0.1421 1.7112 474 0.1524 1.7148 475 0.1553 1.7184 476 0.1561 1.7220 477 0.1512 1.7256 478 0.1454 1.7292 479 0.1476 1.7329 480 0.1481 1.7365 481 0.1474 1.7401 482 0.1436 1.7437 483 0.1432 1.7473 484 0.1439 1.7509 485 0.1453 1.7545 486 0.1407 1.7581 487 0.1503 1.7617 488 0.1556 1.7653 489 0.1435 1.7690 490 0.1449 1.7726 491 0.1466 1.7762 492 0.1501 1.7798 493 0.1216 1.7834 494 0.1609 1.7870 495 0.1602 1.7906 496 0.1444 1.7942 497 0.1515 1.7978 498 0.1447 1.8014 499 0.1552 1.8051 500 0.1434 1.8087 501 0.1398 1.8123 502 0.1428 1.8159 503 0.1505 1.8195 504 0.1502 1.8231 505 0.1432 1.8267 506 0.1612 1.8303 507 0.1538 1.8339 508 0.141 1.8375 509 0.1393 1.8412 510 0.1738 1.8448 511 0.1506 1.8484 512 0.1355 1.8520 513 0.1453 1.8556 514 0.1424 1.8592 515 0.138 1.8628 516 0.1458 1.8664 517 0.1583 1.8700 518 0.1427 1.8736 519 0.1435 1.8773 520 0.1359 1.8809 521 0.1517 1.8845 522 0.1625 1.8881 523 0.1404 1.8917 524 0.1459 1.8953 525 0.1495 1.8989 526 0.1537 1.9025 527 0.152 1.9061 528 0.1358 1.9097 529 0.1382 1.9134 530 0.1481 1.9170 531 0.1562 1.9206 532 0.1488 1.9242 533 0.1433 1.9278 534 0.1333 1.9314 535 0.1559 1.9350 536 0.1394 1.9386 537 0.1354 1.9422 538 0.1485 1.9458 539 0.1574 1.9495 540 0.139 1.9531 541 0.1544 1.9567 542 0.1572 1.9603 543 0.1386 1.9639 544 0.1538 1.9675 545 0.1521 1.9711 546 0.1587 1.9747 547 0.1551 1.9783 548 0.1471 1.9819 549 0.1483 1.9856 550 0.139 1.9892 551 0.1582 1.9928 552 0.1449 1.9964 553 0.1433 2.0 554 0.1516 2.0036 555 0.1271 2.0072 556 0.1231 2.0108 557 0.1145 2.0144 558 0.1045 2.0181 559 0.1186 2.0217 560 0.1116 2.0253 561 0.124 2.0289 562 0.1115 2.0325 563 0.1402 2.0361 564 0.1198 2.0397 565 0.1149 2.0433 566 0.1124 2.0469 567 0.112 2.0505 568 0.1109 2.0542 569 0.1218 2.0578 570 0.1288 2.0614 571 0.1097 2.0650 572 0.1202 2.0686 573 0.1058 2.0722 574 0.1082 2.0758 575 0.1101 2.0794 576 0.1195 2.0830 577 0.1154 2.0866 578 0.1105 2.0903 579 0.1142 2.0939 580 0.1225 2.0975 581 0.1177 2.1011 582 0.1043 2.1047 583 0.1093 2.1083 584 0.1135 2.1119 585 0.1033 2.1155 586 0.1059 2.1191 587 0.1093 2.1227 588 0.1244 2.1264 589 0.1078 2.1300 590 0.1054 2.1336 591 0.1104 2.1372 592 0.1088 2.1408 593 0.1119 2.1444 594 0.1147 2.1480 595 0.1097 2.1516 596 0.113 2.1552 597 0.1069 2.1588 598 0.1171 2.1625 599 0.1042 2.1661 600 0.1171 
2.1697 601 0.1068 2.1733 602 0.1044 2.1769 603 0.1128 2.1805 604 0.1155 2.1841 605 0.1164 2.1877 606 0.1121 2.1913 607 0.1061 2.1949 608 0.1104 2.1986 609 0.1136 2.2022 610 0.1106 2.2058 611 0.1088 2.2094 612 0.117 2.2130 613 0.1065 2.2166 614 0.1168 2.2202 615 0.1164 2.2238 616 0.1125 2.2274 617 0.1102 2.2310 618 0.1081 2.2347 619 0.1214 2.2383 620 0.1065 2.2419 621 0.1238 2.2455 622 0.1076 2.2491 623 0.11 2.2527 624 0.1208 2.2563 625 0.1168 2.2599 626 0.1118 2.2635 627 0.1106 2.2671 628 0.1101 2.2708 629 0.1153 2.2744 630 0.1169 2.2780 631 0.1086 2.2816 632 0.111 2.2852 633 0.1142 2.2888 634 0.1033 2.2924 635 0.1155 2.2960 636 0.1067 2.2996 637 0.1166 2.3032 638 0.1056 2.3069 639 0.1097 2.3105 640 0.1222 2.3141 641 0.1149 2.3177 642 0.1151 2.3213 643 0.1162 2.3249 644 0.1202 2.3285 645 0.1203 2.3321 646 0.1085 2.3357 647 0.1004 2.3394 648 0.1165 2.3430 649 0.1188 2.3466 650 0.1168 2.3502 651 0.1111 2.3538 652 0.1119 2.3574 653 0.1124 2.3610 654 0.1176 2.3646 655 0.1071 2.3682 656 0.1101 2.3718 657 0.1097 2.3755 658 0.1096 2.3791 659 0.1039 2.3827 660 0.1129 2.3863 661 0.1097 2.3899 662 0.1116 2.3935 663 0.1037 2.3971 664 0.1264 2.4007 665 0.0979 2.4043 666 0.1082 2.4079 667 0.1069 2.4116 668 0.1158 2.4152 669 0.1123 2.4188 670 0.126 2.4224 671 0.1133 2.4260 672 0.116
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.4.1
- Transformers: 4.49.0
- PyTorch: 2.5.1+cu124
- Accelerate: 1.4.0
- Datasets: 3.3.2
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
CachedGISTEmbedLoss
```bibtex
@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```