---
base_model: sentence-transformers/all-mpnet-base-v2
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:2353
- loss:CosineSimilarityLoss
widget:
- source_sentence: >-
A year has passed since "The Black Rebellion" and the remaining Black
Knights have vanished into the shadows, their leader and figurehead, Zero,
executed by the Britannian Empire. Area 11 is once more squirming under
the Emperor's oppressive heel as the Britannian armies concentrate their
attacks on the European front. But for the Britannians living in Area 11,
life is back to normal. On one such normal day, a Britannian student,
skipping his classes at Ashford Academy, sneaks out to gamble on his
chess play. But unknown to this young man, several forces are eyeing him
from the shadows, for soon, he will experience a shocking encounter with
his own obscured past, and the masked rebel mastermind Zero will return.
sentences:
- Politics
- Mythology
- Disability
- source_sentence: >-
In a land where corruption rules and a ruthless Prime Minister has turned
the puppet Emperor's armies of soldiers, assassins and secret police
against the people, only one force dares to stand against them: Night
Raid, an elite team of relentless killers, each equipped with an Imperial
Arm - legendary weapons with unique and incredible powers created in the
distant past.
sentences:
- Kuudere
- Tragedy
- Seinen
- source_sentence: >-
There's a rumor about a mysterious phenomenon called "puberty syndrome."
For example, Sakuta Azusagawa is a high school student who suddenly sees a
bunny girl appear in front of him. The girl is actually Mai Sakurajima,
Sakuta's upperclassman and a famous actress who has gone on hiatus from
the entertainment industry. For some reason, the people around Mai cannot
see her bunny-girl figure. Sakuta sets out to solve this mystery, and as
he spends time with Mai, he learns her secret feelings. Other heroines
with "puberty syndrome" begin to appear in front of Sakuta.
sentences:
- Heterosexual
- Drama
- Episodic
- source_sentence: >-
Dororo, a young orphan thief, meets Hyakkimaru, a powerful ronin.
Hyakkimaru's father, a greedy feudal lord, had made a pact with 12 demons,
offering his yet-unborn son's body parts in exchange for great power. Thus,
Hyakkimaru - who was born without arms, legs, eyes, ears, a nose or a
mouth - was abandoned in a river as a baby. Rescued and raised by Dr.
Honma, who equips him with artificial limbs and teaches him sword-fighting
techniques, Hyakkimaru discovers that each time he slays a demon, a piece
of his body is restored. Now, he roams the war-torn countryside in search
of demons.
sentences:
- Urban
- Heterosexual
- Demons
- source_sentence: Everyone has a part of themselves they cannot show to anyone else.
sentences:
- Transgender
- Crime
- Comedy
model-index:
- name: SentenceTransformer based on sentence-transformers/all-mpnet-base-v2
results:
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: anime recommendation dev
type: anime-recommendation-dev
metrics:
- type: pearson_cosine
value: 0.6144532877889222
name: Pearson Cosine
- type: spearman_cosine
value: 0.6215240802205049
name: Spearman Cosine
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: anime recommendation test
type: anime-recommendation-test
metrics:
- type: pearson_cosine
value: 0.6535704432727567
name: Pearson Cosine
- type: spearman_cosine
value: 0.6393952594394526
name: Spearman Cosine
SentenceTransformer based on sentence-transformers/all-mpnet-base-v2
This is a sentence-transformers model finetuned from sentence-transformers/all-mpnet-base-v2. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: sentence-transformers/all-mpnet-base-v2
- Maximum Sequence Length: 384 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 384, 'do_lower_case': False}) with Transformer model: MPNetModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
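The Pooling module above averages token embeddings (mean pooling) and the final Normalize() module scales the result to unit length. As a minimal sketch of those two steps, assuming a `(seq_len, dim)` array of token embeddings and a 0/1 attention mask (illustrative, not the library's internals):

```python
import numpy as np

def mean_pool_and_normalize(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Mean-pool token embeddings over non-padding positions, then L2-normalize."""
    mask = attention_mask[:, None].astype(float)                  # (seq_len, 1)
    pooled = (token_embeddings * mask).sum(axis=0) / mask.sum()   # average real tokens only
    return pooled / np.linalg.norm(pooled)                        # unit length

# Toy example: two real tokens and one padding token that must be ignored
tokens = np.array([[1.0, 0.0], [3.0, 0.0], [99.0, 99.0]])
mask = np.array([1, 1, 0])
print(mean_pool_and_normalize(tokens, mask))  # [1. 0.]
```

Because the output is unit-length, a plain dot product between two sentence embeddings equals their cosine similarity.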
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Prashasst/anime-recommendation-model")
# Run inference
sentences = [
    'I want anime like onepiece.',
    'Pirates',
    'Action',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
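Since the embeddings are already normalized, the similarity matrix above is plain cosine similarity, and ranking candidate tags against a query is just an argsort over one row. A toy sketch of that ranking step with stand-in vectors (the real embeddings are 768-dimensional):

```python
import numpy as np

def rank_candidates(query_emb: np.ndarray, candidate_embs: np.ndarray) -> np.ndarray:
    """Return candidate indices sorted by cosine similarity to the query, best first."""
    def normalize(x):
        return x / np.linalg.norm(x, axis=-1, keepdims=True)
    sims = normalize(candidate_embs) @ normalize(query_emb)
    return np.argsort(-sims)

# Stand-in 4-d vectors: candidate 1 is most aligned with the query
query = np.array([1.0, 0.0, 0.0, 0.0])
candidates = np.array([
    [0.0, 1.0, 0.0, 0.0],   # orthogonal: worst match
    [0.9, 0.1, 0.0, 0.0],   # nearly parallel: best match
    [0.5, 0.5, 0.0, 0.0],   # in between
])
print(rank_candidates(query, candidates))  # [1 2 0]
```

With the real model, the rows of `model.similarity(...)` play the role of `sims` here.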
Evaluation
Metrics
Semantic Similarity
- Datasets: `anime-recommendation-dev` and `anime-recommendation-test`
- Evaluated with `EmbeddingSimilarityEvaluator`
| Metric | anime-recommendation-dev | anime-recommendation-test |
|---|---|---|
| pearson_cosine | 0.6145 | 0.6536 |
| spearman_cosine | 0.6215 | 0.6394 |
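Both metrics compare the model's cosine scores against the gold labels: pearson_cosine is the linear correlation of the raw scores, while spearman_cosine is the correlation of their ranks. A small numpy-only sketch of the two computations on made-up score pairs (the values here are illustrative, not from the actual evaluation):

```python
import numpy as np

def pearson(x, y):
    return float(np.corrcoef(x, y)[0, 1])

def spearman(x, y):
    # Spearman = Pearson correlation of the rank vectors (no ties in this toy data)
    rank = lambda a: np.argsort(np.argsort(a))
    return pearson(rank(x), rank(y))

gold = np.array([0.2, 0.72, 0.61, 0.9])    # gold similarity labels
pred = np.array([0.25, 0.7, 0.5, 0.95])    # hypothetical model cosine scores
print(round(pearson(gold, pred), 3))       # 0.967
print(round(spearman(gold, pred), 6))      # 1.0, identical orderings
```

Note that Spearman can be perfect while Pearson is not: the model only needs to order the pairs correctly, not reproduce the label values.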
Training Details
Training Dataset
Unnamed Dataset
- Size: 2,353 training samples
- Columns: `description`, `genre`, and `label`
- Approximate statistics based on the first 1000 samples:

| | description | genre | label |
|---|---|---|---|
| type | string | string | float |
| details | min: 15 tokens, mean: 97.39 tokens, max: 193 tokens | min: 3 tokens, mean: 3.82 tokens, max: 8 tokens | min: 0.1, mean: 0.71, max: 1.0 |
- Samples:

| description | genre | label |
|---|---|---|
| Mitsuha Miyamizu, a high school girl, yearns to live the life of a boy in the bustling city of Tokyo—a dream that stands in stark contrast to her present life in the countryside. Meanwhile in the city, Taki Tachibana lives a busy life as a high school student while juggling his part-time job and hopes for a future in architecture. | Environmental | 0.6 |
| Jinta Yadomi and his group of childhood friends have become estranged after a tragic accident split them apart. Now in their high school years, a sudden surprise forces each of them to confront their guilt over what happened that day and come to terms with the ghosts of their past. | Hikikomori | 0.79 |
| The second season of Ansatsu Kyoushitsu. | Episodic | 0.44 |
- Loss: `CosineSimilarityLoss` with these parameters: `{ "loss_fct": "torch.nn.modules.loss.MSELoss" }`
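CosineSimilarityLoss embeds both texts of a pair, takes the cosine similarity of the two embeddings, and regresses it against the gold label with the MSELoss configured above. A numpy sketch of the objective for a single (description, genre, label) pair, with made-up vectors (not the library's implementation):

```python
import numpy as np

def cosine_similarity_loss(u: np.ndarray, v: np.ndarray, label: float) -> float:
    """Squared error between cos(u, v) and the gold similarity label."""
    cos = float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))
    return (cos - label) ** 2

# Perfectly aligned pair with label 1.0: zero loss
print(cosine_similarity_loss(np.array([1.0, 0.0]), np.array([2.0, 0.0]), 1.0))  # 0.0
# Orthogonal pair with label 0.1: cos = 0, loss = (0 - 0.1)^2
print(round(cosine_similarity_loss(np.array([1.0, 0.0]), np.array([0.0, 1.0]), 0.1), 4))  # 0.01
```

During training the gradient of this objective pushes embedding pairs with high labels closer together (in angle) and pairs with low labels further apart.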
Evaluation Dataset
Unnamed Dataset
- Size: 294 evaluation samples
- Columns: `description`, `genre`, and `label`
- Approximate statistics based on the first 294 samples:

| | description | genre | label |
|---|---|---|---|
| type | string | string | float |
| details | min: 15 tokens, mean: 92.48 tokens, max: 193 tokens | min: 3 tokens, mean: 3.73 tokens, max: 8 tokens | min: 0.1, mean: 0.69, max: 1.0 |
- Samples:

| description | genre | label |
|---|---|---|
| Summer is here, and the heroes of Class 1-A and 1-B are in for the toughest training camp of their lives! A group of seasoned pros pushes everyone's Quirks to new heights as the students face one overwhelming challenge after another. Braving the elements in this secret location becomes the least of their worries when routine training turns into a critical struggle for survival. | Transgender | 0.2 |
| "In order for something to be obtained, something of equal value must be lost." | Cyborg | 0.72 |
| In the story, Subaru Natsuki is an ordinary high school student who is lost in an alternate world, where he is rescued by a beautiful, silver-haired girl. He stays near her to return the favor, but the destiny she is burdened with is more than Subaru can imagine. Enemies attack one by one, and both of them are killed. He then finds out he has the power to rewind death, back to the time he first came to this world. But only he remembers what has happened since. | Primarily Female Cast | 0.61 |
- Loss: `CosineSimilarityLoss` with these parameters: `{ "loss_fct": "torch.nn.modules.loss.MSELoss" }`
Training Hyperparameters
Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `fp16`: True
All Hyperparameters
Click to expand
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `eval_use_gather_object`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
Training Logs
Click to expand
Epoch | Step | Training Loss | Validation Loss | anime-recommendation-dev_spearman_cosine | anime-recommendation-test_spearman_cosine |
---|---|---|---|---|---|
0.0068 | 1 | 0.3882 | - | - | - |
0.0135 | 2 | 0.2697 | - | - | - |
0.0203 | 3 | 0.2648 | - | - | - |
0.0270 | 4 | 0.3022 | - | - | - |
0.0338 | 5 | 0.2665 | - | - | - |
0.0405 | 6 | 0.2923 | - | - | - |
0.0473 | 7 | 0.3165 | - | - | - |
0.0541 | 8 | 0.2069 | - | - | - |
0.0608 | 9 | 0.271 | - | - | - |
0.0676 | 10 | 0.1974 | - | - | - |
0.0743 | 11 | 0.156 | - | - | - |
0.0811 | 12 | 0.1035 | - | - | - |
0.0878 | 13 | 0.1046 | - | - | - |
0.0946 | 14 | 0.0579 | - | - | - |
0.1014 | 15 | 0.0904 | - | - | - |
0.1081 | 16 | 0.0734 | - | - | - |
0.1149 | 17 | 0.0396 | - | - | - |
0.1216 | 18 | 0.0219 | - | - | - |
0.1284 | 19 | 0.0672 | - | - | - |
0.1351 | 20 | 0.0567 | - | - | - |
0.1419 | 21 | 0.0969 | - | - | - |
0.1486 | 22 | 0.0258 | - | - | - |
0.1554 | 23 | 0.1174 | - | - | - |
0.1622 | 24 | 0.0334 | - | - | - |
0.1689 | 25 | 0.0661 | - | - | - |
0.1757 | 26 | 0.0365 | - | - | - |
0.1824 | 27 | 0.049 | - | - | - |
0.1892 | 28 | 0.0889 | - | - | - |
0.1959 | 29 | 0.0179 | - | - | - |
0.2027 | 30 | 0.0255 | - | - | - |
0.2095 | 31 | 0.0312 | - | - | - |
0.2162 | 32 | 0.0312 | - | - | - |
0.2230 | 33 | 0.0619 | - | - | - |
0.2297 | 34 | 0.0358 | - | - | - |
0.2365 | 35 | 0.0468 | - | - | - |
0.2432 | 36 | 0.0601 | - | - | - |
0.25 | 37 | 0.0546 | - | - | - |
0.2568 | 38 | 0.0411 | - | - | - |
0.2635 | 39 | 0.0332 | - | - | - |
0.2703 | 40 | 0.0479 | - | - | - |
0.2770 | 41 | 0.0657 | - | - | - |
0.2838 | 42 | 0.0161 | - | - | - |
0.2905 | 43 | 0.0323 | - | - | - |
0.2973 | 44 | 0.0794 | - | - | - |
0.3041 | 45 | 0.0264 | - | - | - |
0.3108 | 46 | 0.0391 | - | - | - |
0.3176 | 47 | 0.0514 | - | - | - |
0.3243 | 48 | 0.0276 | - | - | - |
0.3311 | 49 | 0.0653 | - | - | - |
0.3378 | 50 | 0.0343 | - | - | - |
0.3446 | 51 | 0.0369 | - | - | - |
0.3514 | 52 | 0.0336 | - | - | - |
0.3581 | 53 | 0.0368 | - | - | - |
0.3649 | 54 | 0.0477 | - | - | - |
0.3716 | 55 | 0.0358 | - | - | - |
0.3784 | 56 | 0.0312 | - | - | - |
0.3851 | 57 | 0.0388 | - | - | - |
0.3919 | 58 | 0.0415 | - | - | - |
0.3986 | 59 | 0.02 | - | - | - |
0.4054 | 60 | 0.0459 | - | - | - |
0.4122 | 61 | 0.0302 | - | - | - |
0.4189 | 62 | 0.0519 | - | - | - |
0.4257 | 63 | 0.0283 | - | - | - |
0.4324 | 64 | 0.04 | - | - | - |
0.4392 | 65 | 0.0146 | - | - | - |
0.4459 | 66 | 0.033 | - | - | - |
0.4527 | 67 | 0.0365 | - | - | - |
0.4595 | 68 | 0.0579 | - | - | - |
0.4662 | 69 | 0.0253 | - | - | - |
0.4730 | 70 | 0.033 | - | - | - |
0.4797 | 71 | 0.0258 | - | - | - |
0.4865 | 72 | 0.0181 | - | - | - |
0.4932 | 73 | 0.0334 | - | - | - |
0.5 | 74 | 0.0415 | - | - | - |
0.5068 | 75 | 0.0258 | - | - | - |
0.5135 | 76 | 0.0304 | - | - | - |
0.5203 | 77 | 0.0211 | - | - | - |
0.5270 | 78 | 0.0334 | - | - | - |
0.5338 | 79 | 0.0278 | - | - | - |
0.5405 | 80 | 0.0209 | - | - | - |
0.5473 | 81 | 0.0391 | - | - | - |
0.5541 | 82 | 0.0274 | - | - | - |
0.5608 | 83 | 0.0213 | - | - | - |
0.5676 | 84 | 0.0293 | - | - | - |
0.5743 | 85 | 0.0205 | - | - | - |
0.5811 | 86 | 0.0258 | - | - | - |
0.5878 | 87 | 0.0262 | - | - | - |
0.5946 | 88 | 0.0109 | - | - | - |
0.6014 | 89 | 0.0268 | - | - | - |
0.6081 | 90 | 0.0304 | - | - | - |
0.6149 | 91 | 0.0328 | - | - | - |
0.6216 | 92 | 0.0173 | - | - | - |
0.6284 | 93 | 0.0253 | - | - | - |
0.6351 | 94 | 0.0245 | - | - | - |
0.6419 | 95 | 0.0232 | - | - | - |
0.6486 | 96 | 0.0309 | - | - | - |
0.6554 | 97 | 0.0209 | - | - | - |
0.6622 | 98 | 0.0169 | - | - | - |
0.6689 | 99 | 0.024 | - | - | - |
0.6757 | 100 | 0.0166 | 0.0284 | 0.6215 | - |
0.6824 | 101 | 0.0202 | - | - | - |
0.6892 | 102 | 0.0181 | - | - | - |
0.6959 | 103 | 0.0413 | - | - | - |
0.7027 | 104 | 0.0537 | - | - | - |
0.7095 | 105 | 0.0241 | - | - | - |
0.7162 | 106 | 0.0199 | - | - | - |
0.7230 | 107 | 0.0227 | - | - | - |
0.7297 | 108 | 0.0283 | - | - | - |
0.7365 | 109 | 0.0372 | - | - | - |
0.7432 | 110 | 0.0193 | - | - | - |
0.75 | 111 | 0.0147 | - | - | - |
0.7568 | 112 | 0.0594 | - | - | - |
0.7635 | 113 | 0.0185 | - | - | - |
0.7703 | 114 | 0.0674 | - | - | - |
0.7770 | 115 | 0.0212 | - | - | - |
0.7838 | 116 | 0.0268 | - | - | - |
0.7905 | 117 | 0.0233 | - | - | - |
0.7973 | 118 | 0.0276 | - | - | - |
0.8041 | 119 | 0.0242 | - | - | - |
0.8108 | 120 | 0.034 | - | - | - |
0.8176 | 121 | 0.0231 | - | - | - |
0.8243 | 122 | 0.0252 | - | - | - |
0.8311 | 123 | 0.0294 | - | - | - |
0.8378 | 124 | 0.0205 | - | - | - |
0.8446 | 125 | 0.0302 | - | - | - |
0.8514 | 126 | 0.0468 | - | - | - |
0.8581 | 127 | 0.0311 | - | - | - |
0.8649 | 128 | 0.0365 | - | - | - |
0.8716 | 129 | 0.0257 | - | - | - |
0.8784 | 130 | 0.0339 | - | - | - |
0.8851 | 131 | 0.0359 | - | - | - |
0.8919 | 132 | 0.0404 | - | - | - |
0.8986 | 133 | 0.0223 | - | - | - |
0.9054 | 134 | 0.0232 | - | - | - |
0.9122 | 135 | 0.0295 | - | - | - |
0.9189 | 136 | 0.0244 | - | - | - |
0.9257 | 137 | 0.0168 | - | - | - |
0.9324 | 138 | 0.0319 | - | - | - |
0.9392 | 139 | 0.0328 | - | - | - |
0.9459 | 140 | 0.0295 | - | - | - |
0.9527 | 141 | 0.0262 | - | - | - |
0.9595 | 142 | 0.0238 | - | - | - |
0.9662 | 143 | 0.0181 | - | - | - |
0.9730 | 144 | 0.017 | - | - | - |
0.9797 | 145 | 0.0244 | - | - | - |
0.9865 | 146 | 0.0264 | - | - | - |
0.9932 | 147 | 0.0194 | - | - | - |
1.0 | 148 | 0.0028 | - | - | 0.6394 |
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.3.1
- Transformers: 4.44.2
- PyTorch: 2.4.1+cu121
- Accelerate: 0.34.2
- Datasets: 3.2.0
- Tokenizers: 0.19.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}