---
base_model: sentence-transformers/all-MiniLM-L6-v2
datasets: []
language: []
library_name: sentence-transformers
metrics:
- cosine_accuracy
- dot_accuracy
- manhattan_accuracy
- euclidean_accuracy
- max_accuracy
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:2320
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: DENNIE FOSTE Men's Poly Cotton Washed Light Blue Jeans(DF-JNS-015)
  sentences:
  - https://www.amazon.in/dp/B0BZDFGSCR
  - DENNIE FOSTE presents this streachable fabric Polycotton jeans. It's good quality fabric would certainly make you feel good and confident when you wear it. Comfortable front pockets, comfortable back pockets, highly durable and stretchable jeans for man. Perfect for casual, beach parties wear high on style and quality, these stretchable jeans are as versatile as they are comfortable. Wear it with a casual tee for a smart look. Wear it casually and be at ease throughout the day or it can also blend to perfection on your special ocassions.
  - urbano fashion mens slim fit jeans
- source_sentence: ZESICA Women's 2023 Summer Bohemian Solid Color Lace Trim Flowy A Line Beach Long Maxi Skirt with Pockets
  sentences:
  - aratlench acrylic pendant necklace earrings – long statement leaf charm necklace tortoise resin palm leaf earrings fashion necklaces earrings for women girls
  - https://www.amazon.com/dp/B09X19HV5D
  - zesica womens 2023 summer bohemian solid color lace trim flowy a line beach long maxi skirt with pockets
- source_sentence: DHRUVI TRENDZ Men's Shirts || Rayon Tropical Printed Shirts for Men || Summer Wear Shirt for Men || Perfect for Outing || Vacation || DateWear Shirt for Boys || Gift for Men
  sentences:
  - om sai latest creation shirt for men rayon shirts for men tropical leaf printed short sleeve spread collar shirts for boy casual beach wear festive shirt for men
  - https://www.amazon.in/dp/B0C18PR364
  - Men's Fashion Products Are Our partywear outfit collection for men includes a shirt neckline, Short-sleeves, and a button placket on the front. Perfect Regular Fit with Best Look. simple spread collar and soft felt in the fabric which makes the shirt very easy and comfortable to wear casually. From the newest designs and trendiest styles for men we are making fashionable clothing affordable. Shirts feel soft and light on the body. Pairing with the right colored denim we can imagine the outfit is best suited for dining parties and night outs. Our men's Tropical shirts are made of the Best fabric which is lightweight and breathable. Perfect for summer and hot weather keeps your body dry and comfortable all day. This casual summer shirts design with a Fancy Hawaii collar, short sleeve, botton down, Tropical print and classic regular fit. This beach shirts with multiple unique color and pattern, each of which is a unique experience, make you shine this summer. Perfect gift for yourself, families, or friends. Perfect for camp, sun beach, birthday party, vacation, bachelor party, cruise, camp, or any casual daily wear.
- source_sentence: Molie Bridal Austrian Crystal Necklace and Earrings Jewelry Set Gifts fit with Wedding Dress
  sentences:
  - You should have this jewelry set near you all the time since it is so fashion and eye-catching. You can wear it and have it with you to support you wherever you go. Make a statement with this wonderful jewelry set. Molie Molie has been found for many years, referred to "Molie", which denotes to treat all of the world's women like an Molie jewelry and meet their fantasies and satisfactions. We have our own factory to ensure our items' plating and the strict criteria of the plating thickness. The physical characteristics of human require us to adopt a higher standard of plating process. At the same time, it create a good condition to reduce production cost while maintain high quality of our item. Moreover, We are committed to provide customers with competitive products and best customer services, since its inception has been its high quality themselves, stylish design, superb manufacturing process. Besides, we concentrate on improving the service based on the creative, showing brand attributes. All in all, we take Customers' satisfactions as our first priority.
  - https://www.amazon.com/dp/B071VM3BKW
  - coofandy mens short sleeve hoodie relaxed fit fashion casual sweatshirts lightweight hip hop streetwear t shirts
- source_sentence: Steve Madden Clutch Crossbody
  sentences:
  - https://www.amazon.com/dp/B07VCDT9VR
  - See and BSCENE with this Clear bag. Carry it as a crossbody or clutch. The exterior is Clear and includes an internal pouch.
  - womens dezier mens regular shirt 6032sformal1110multicolor extra large
model-index:
- name: SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2
  results:
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: Unknown
      type: unknown
    metrics:
    - type: cosine_accuracy
      value: 1.0
      name: Cosine Accuracy
    - type: dot_accuracy
      value: 0.0
      name: Dot Accuracy
    - type: manhattan_accuracy
      value: 1.0
      name: Manhattan Accuracy
    - type: euclidean_accuracy
      value: 1.0
      name: Euclidean Accuracy
    - type: max_accuracy
      value: 1.0
      name: Max Accuracy
---

# SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co./sentence-transformers/all-MiniLM-L6-v2). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description

- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co./sentence-transformers/all-MiniLM-L6-v2)
- **Maximum Sequence Length:** 256 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co./models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'Steve Madden Clutch Crossbody',
    'See and BSCENE with this Clear bag. Carry it as a crossbody or clutch. The exterior is Clear and includes an internal pouch.',
    'https://www.amazon.com/dp/B07VCDT9VR',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

## Evaluation

### Metrics

#### Triplet

* Evaluated with [TripletEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)

| Metric             | Value   |
|:-------------------|:--------|
| cosine_accuracy    | 1.0     |
| dot_accuracy       | 0.0     |
| manhattan_accuracy | 1.0     |
| euclidean_accuracy | 1.0     |
| **max_accuracy**   | **1.0** |

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 2,320 training samples
* Columns: anchor, positive, and negative
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details |        |          |          |
* Samples:
  | anchor | positive | negative |
  |:-------|:---------|:---------|
  | Shiaili Classic Plus Size Skirts for Women Flowy Pleated Midi Length Skirt | shiaili classic plus size skirts for women flowy pleated midi length skirt | https://www.amazon.com/dp/B0BMTRJRG6 |
  | ANRABESS Women's Casual Long Sleeve Draped Open Front Knit Pockets Long Cardigan Jackets Sweater | anrabess womens casual long sleeve draped open front knit pockets long cardigan jackets sweater | https://www.amazon.com/dp/B0B2W6QGYB |
  | RipSkirt Hawaii \| Length 2 with Pockets \| Quick Wrap, Quick Dry, Travel Skirt with Side Pockets | RipSkirt Hawaii is the active woman’s perfect skirt. Wear your RipSkirt straight from the beach to the bistro, we’ve got you covered. Our custom fabric doesn’t cling, flatters almost every figure, repels water, and dries quickly if soaked. [no more wet bum marks when leaving the pool] Length 2 is our most popular length and is perfect for work, play, and around town and has side pockets deep enough for a large phone. Content: 93% polyester 7% spandex | https://www.amazon.com/dp/B09X714HBM |
* Loss: [MultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

### Evaluation Dataset

#### Unnamed Dataset

* Size: 580 evaluation samples
* Columns: anchor, positive, and negative
* Approximate statistics based on the first 1000 samples:
  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details |        |          |          |
* Samples:
  | anchor | positive | negative |
  |:-------|:---------|:---------|
  | Hotouch Lightweight Crochet Cardigan for Women Long Sleeve Open Front Knit Oversized Cardigans Sweaters | hotouch lightweight crochet cardigan for women long sleeve open front knit oversized cardigans sweaters | https://www.amazon.com/dp/B0C1FM1JDZ |
  | SEIKO Men's SNK809 5 Automatic Stainless Steel Watch with Black Canvas Strap | Black dial. Silver-tone stainless steel case with a black canvas band. Automatic movement. 30 meters / 100 feet water resistance. Fixed bezel. Tang clasp. Case size 37 mm x 11 mm. Seiko SNK809 Seiko 5 Watch.The Seiko 5 Men's Automatic Black Strap Black Dial Watch is a stylish timepiece with the convenience of automatic movement. A uniquely designed, black dial features white Arabic numbers marking the hours on an inner circle and the minutes on an outer circle, while small, bar indexes encircle the dial on an outside minute track. Silver-tone hands with luminous fill make it easy to tell time day or night, and the slim second hand is detailed with a red accent. For added convenience, a day and date display are set at three o'clock. The polished stainless steel case extends to meet the black nylon strap, which wraps comfortably around the wrist and fastens with a traditional buckle. Water resistant to 30 feet (100 meters), this high-performance watch is perfect for everyday wear.This is an automatic mechanical watch. Automatic watches do not operate on batteries, instead, they are powered automatically by the movement of the wearer’s arm. If the main spring in your automatic watch is not wound sufficiently, timekeeping may become less accurate. In order to maintain accuracy, wear the watch for 8 hours or more per day, or manually wind the main spring by turning the crown. When not in use, automatic watches may be kept charged with an automatic watch winder – a watch storage unit which may be purchased separately. From Humble beginnings, Kintaro Hattori’s Vision for Seiko has become reality. A consuming passion for excellence - imprinted in our Corporate DNA passed from generation to generation. Seiko, for 125 years committed to the art and science of time. A culture of innovation connects a 19th century Tokyo clock shop with 20th century advances in timekeeping to an extraordinary 21st century "quiet revolution." Continually driven by dedication and passion, established a multitude of world’s first technologies… transforming the principles of timekeeping. The first quartz wristwatch – changed the history of time. The first Kinetic – marked a new era in quartz watch technology. In 1969, Seiko Astron, the first quartz wristwatch - was introduced. In an instant, Seiko exponentially improved the accuracy of wristwatches –And Seiko technology firmly established today’s standard in Olympic and sports timing. 1984, another celebrated first – Kinetic Technology – powered by body movement. Kinetic – a quartz mechanism with unparalleled accuracy –the driving force behind more world’s firsts. Kinetic Chronograph – the next generation of high performance timekeeping. Kinetic Auto Relay – automatically resets to the correct time. Kinetic Perpetual - combining the date perfect technology of perpetual calendar with the genius of Kinetic Auto Relay. And now Kinetic Direct Drive – move, and the watch is powered automatically. Or hand wind it and see the power you are generating in real time. In the realm of fine watches, time is measured by Seiko innovation – A heritage of dedication to the art and science of time.See more | https://www.amazon.com/dp/B002SSUQFG |
  | Carhartt Men's Rain Defender Loose Fit Midweight Thermal-Lined Full-Zip Sweatshirt | This men's full-zip sweatshirt is equipped for light rain. Made from midweight fleece with a water-repellent finish and thermal lining. Features inner and outer pockets that include storage for your phone. 10.5-ounce, 50% cotton / 50% polyester fleece. Polyester fleece lining for warmth. Rain Defender® durable water repellent (DWR) keeps you dry and moving in light rain. Original fit. Full-zip front with brass zipper. Attached, thermal-lined three-piece hood with drawcord closure. Spandex-reinforced rib-knit cuffs and waist help keep out the cold. Two front handwarmer pockets with flaps for added security. Hidden media pocket. Inside pocket with zipper closure. Locker loop. | https://www.amazon.com/dp/B08BG5V4KR |
* Loss: [MultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: epoch
- `per_device_eval_batch_size`: 16
- `learning_rate`: 3e-05
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `load_best_model_at_end`: True
- `ddp_find_unused_parameters`: False
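For readers who want to set up a comparable run, the sketch below maps the non-default values above onto `SentenceTransformerTrainingArguments` and pairs them with `MultipleNegativesRankingLoss` in a `SentenceTransformerTrainer` (sentence-transformers ≥ 3.0). It is a minimal sketch, not the exact training script: the datasets are one-row placeholders (the 2,320/580-triplet data behind this card is not published here), and `output_dir` and `save_strategy` are assumptions.

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Placeholder (anchor, positive, negative) triplets in the same column layout
# as the tables above; the real datasets are not included with this card.
train_dataset = Dataset.from_dict({
    "anchor": ["Steve Madden Clutch Crossbody"],
    "positive": ["See and BSCENE with this Clear bag. Carry it as a crossbody or clutch."],
    "negative": ["https://www.amazon.com/dp/B07VCDT9VR"],
})
eval_dataset = train_dataset  # placeholder; a held-out set of 580 triplets was used for this card

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
loss = MultipleNegativesRankingLoss(model)  # scale=20.0 and cos_sim are the defaults

args = SentenceTransformerTrainingArguments(
    output_dir="outputs",          # assumed; not reported in this card
    num_train_epochs=3,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    learning_rate=3e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    eval_strategy="epoch",
    save_strategy="epoch",         # assumed so load_best_model_at_end can restore the best checkpoint
    load_best_model_at_end=True,
    ddp_find_unused_parameters=False,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
```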
#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 8
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `learning_rate`: 3e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: False
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>
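The `loss` and `max_accuracy` columns in the log below come from evaluating the held-out triplets at the end of each epoch. A minimal sketch of how such numbers can be reproduced with sentence-transformers' `TripletEvaluator` (the triplets shown are placeholders, not the actual 580-example evaluation set, and the model id should point at the fine-tuned checkpoint):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

# Load the fine-tuned model (base model used here only as a stand-in).
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Placeholder anchor/positive/negative lists; the card's metrics were computed
# over the full held-out evaluation split.
evaluator = TripletEvaluator(
    anchors=["Steve Madden Clutch Crossbody"],
    positives=["See and BSCENE with this Clear bag. Carry it as a crossbody or clutch."],
    negatives=["https://www.amazon.com/dp/B07VCDT9VR"],
    name="triplet-dev",
)
results = evaluator(model)
print(results)  # e.g. {'triplet-dev_cosine_accuracy': ..., 'triplet-dev_max_accuracy': ...}
```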
### Training Logs

| Epoch  | Step | Training Loss | loss   | max_accuracy |
|:------:|:----:|:-------------:|:------:|:------------:|
| 0.0862 | 25   | 0.3631        | -      | -            |
| 0.1724 | 50   | 0.1219        | -      | -            |
| 0.2586 | 75   | 0.1909        | -      | -            |
| 0.3448 | 100  | 0.24          | -      | -            |
| 0.4310 | 125  | 0.1607        | -      | -            |
| 0.5172 | 150  | 0.1103        | -      | -            |
| 0.6034 | 175  | 0.0952        | -      | -            |
| 0.6897 | 200  | 0.1139        | -      | -            |
| 0.7759 | 225  | 0.1335        | -      | -            |
| 0.8621 | 250  | 0.0758        | -      | -            |
| 0.9483 | 275  | 0.0902        | -      | -            |
| 1.0    | 290  | -             | 0.0700 | 1.0          |
| 1.0345 | 300  | 0.0951        | -      | -            |
| 1.1207 | 325  | 0.0373        | -      | -            |
| 1.2069 | 350  | 0.086         | -      | -            |
| 1.2931 | 375  | 0.0418        | -      | -            |
| 1.3793 | 400  | 0.0522        | -      | -            |
| 1.4655 | 425  | 0.0387        | -      | -            |
| 1.5517 | 450  | 0.0217        | -      | -            |
| 1.6379 | 475  | 0.0455        | -      | -            |
| 1.7241 | 500  | 0.0424        | -      | -            |
| 1.8103 | 525  | 0.0238        | -      | -            |
| 1.8966 | 550  | 0.0355        | -      | -            |
| 1.9828 | 575  | 0.0283        | -      | -            |
| 2.0    | 580  | -             | 0.0597 | 1.0          |
| 2.0690 | 600  | 0.0213        | -      | -            |
| 2.1552 | 625  | 0.0219        | -      | -            |
| 2.2414 | 650  | 0.0254        | -      | -            |
| 2.3276 | 675  | 0.0204        | -      | -            |
| 2.4138 | 700  | 0.0052        | -      | -            |
| 2.5    | 725  | 0.0248        | -      | -            |
| 2.5862 | 750  | 0.0507        | -      | -            |
| 2.6724 | 775  | 0.0191        | -      | -            |
| 2.7586 | 800  | 0.018         | -      | -            |
| 2.8448 | 825  | 0.0176        | -      | -            |
| 2.9310 | 850  | 0.0193        | -      | -            |
| 3.0    | 870  | -             | 0.0566 | 1.0          |

### Framework Versions

- Python: 3.10.14
- Sentence Transformers: 3.0.1
- Transformers: 4.42.2
- PyTorch: 2.3.0
- Accelerate: 0.31.0
- Datasets: 2.19.1
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss

```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```