---
base_model: BAAI/bge-base-en-v1.5
datasets: []
language:
- en
library_name: sentence-transformers
license: apache-2.0
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:1500
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: >-
    The depreciation and amortization expense for the year 2021 was recorded
    at $3,103.
  sentences:
  - >-
    In what sequence do the signature pages appear relative to the financial
    documents in this report?
  - What was the depreciation and amortization expense in 2021?
  - >-
    What was the net impact on other comprehensive income (loss), net of
    tax, for the fiscal year ended March 31, 2023?
- source_sentence: 'Actual Asset Returns: U.S. Plans: (21.20)%, Non-U.S. Plans: (25.40)%.'
  sentences:
  - >-
    What were the total other current liabilities for the fiscal year ending
    in 2023 compared to 2022?
  - >-
    What was the percentage of proprietary brand product sales as part of
    the front store revenues in 2023?
  - >-
    By how much did actual asset returns vary between U.S. and Non-U.S.
    pension plans in 2023?
- source_sentence: >-
    Intellectual property rights are important to Nike's brand, success, and
    competitive position. The company strategically pursues protections of
    these rights and vigorously protects them against third-party theft and
    infringement.
  sentences:
  - >-
    What types of legal issues are generally categorized under Commitments
    and Contingencies in a Form 10-K?
  - >-
    What role does intellectual property play in Nike's competitive
    position?
  - How is the revenue from sales of Online-Hosted Service Games recognized?
- source_sentence: >-
    Item 3, titled 'Legal Proceedings' in a 10-K filing, directs to Note 16
    where specific information is further detailed in Item 8 of Part II.
  sentences:
  - How does Garmin manage the costs of manufacturing its products?
  - What is indicated by Item 3, 'Legal Proceedings', in a 10-K filing?
  - >-
    How much did UnitedHealthcare's cash provided by operating activities
    amount to in 2023?
- source_sentence: >-
    During 2023, FedEx ranked 18th in FORTUNE magazine's 'World's Most Admired
    Companies' list and maintained its position as the highest-ranked delivery
    company on the list.
  sentences:
  - >-
    What was the total depreciation and amortization expense for the company
    in 2023?
  - >-
    What was the valuation allowance against deferred tax assets at the end
    of 2023, and what changes may affect its realization?
  - What recognition did FedEx receive from FORTUNE magazine in 2023?
model-index:
- name: BGE base Financial Matryoshka
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 768
      type: dim_768
    metrics:
    - type: cosine_accuracy@1
      value: 0.7766666666666666
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.86
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.89
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.9333333333333333
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.7766666666666666
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.2866666666666667
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.17799999999999996
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09333333333333332
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.7766666666666666
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.86
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.89
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.9333333333333333
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8519532537710081
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.8263650793650793
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.8285686593594938
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 512
      type: dim_512
    metrics:
    - type: cosine_accuracy@1
      value: 0.7566666666666667
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.87
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.8933333333333333
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.9333333333333333
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.7566666666666667
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.29
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.17866666666666664
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09333333333333332
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.7566666666666667
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.87
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.8933333333333333
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.9333333333333333
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8462349355848354
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.8183306878306877
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.8207466430359656
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 256
      type: dim_256
    metrics:
    - type: cosine_accuracy@1
      value: 0.76
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.86
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.89
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.9266666666666666
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.76
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.2866666666666666
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.17799999999999996
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09266666666666666
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.76
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.86
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.89
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.9266666666666666
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8433224215661056
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.8166931216931217
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.8190592083326618
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 128
      type: dim_128
    metrics:
    - type: cosine_accuracy@1
      value: 0.7066666666666667
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.84
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.8633333333333333
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.91
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.7066666666666667
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.27999999999999997
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.17266666666666666
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09099999999999998
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.7066666666666667
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.84
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.8633333333333333
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.91
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8099084142081584
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.7776230158730157
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.7810311049771785
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 64
      type: dim_64
    metrics:
    - type: cosine_accuracy@1
      value: 0.6833333333333333
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.7933333333333333
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.8366666666666667
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.88
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.6833333333333333
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.26444444444444437
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.1673333333333333
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.088
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.6833333333333333
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.7933333333333333
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.8366666666666667
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.88
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.7796467165928374
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.7475780423280424
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.751941519893099
      name: Cosine Map@100
---

# BGE base Financial Matryoshka
This is a sentence-transformers model finetuned from BAAI/bge-base-en-v1.5. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details

### Model Description
- Model Type: Sentence Transformer
- Base model: BAAI/bge-base-en-v1.5
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
- Language: en
- License: apache-2.0
### Model Sources

- Documentation: [Sentence Transformers Documentation](https://sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
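
In plain terms, the stack encodes the text with BERT, keeps the `[CLS]` token vector, and L2-normalizes it. Below is a minimal sketch of the equivalent computation using the `transformers` library directly (shown on the base checkpoint for illustration; it is not the recommended inference path):

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("BAAI/bge-base-en-v1.5")
encoder = AutoModel.from_pretrained("BAAI/bge-base-en-v1.5")

batch = tokenizer(["example sentence"], padding=True, truncation=True,
                  max_length=512, return_tensors="pt")
with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state  # (batch, seq_len, 768)

cls_embedding = token_embeddings[:, 0]              # Pooling: CLS token only
embedding = F.normalize(cls_embedding, p=2, dim=1)  # Normalize: unit length
```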
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference:

```python
from sentence_transformers import SentenceTransformer

# Download the model from the Hugging Face Hub
model = SentenceTransformer("adarshheg/bge-base-financial-matryoshka")

# Run inference on a (passage, matching question, unrelated question) triple
sentences = [
    "During 2023, FedEx ranked 18th in FORTUNE magazine's 'World's Most Admired Companies' list and maintained its position as the highest-ranked delivery company on the list.",
    'What recognition did FedEx receive from FORTUNE magazine in 2023?',
    'What was the valuation allowance against deferred tax assets at the end of 2023, and what changes may affect its realization?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)  # [3, 768]

# Get the pairwise similarity scores between the three embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [3, 3]
```
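
Because the model was trained with MatryoshkaLoss at dimensions 768/512/256/128/64, embeddings can also be truncated to a smaller size with only a modest drop in retrieval quality (see the metrics below). A short sketch, assuming a sentence-transformers release that supports the `truncate_dim` argument (v2.7 and later):

```python
from sentence_transformers import SentenceTransformer

# Keep only the first 256 dimensions of every embedding
model = SentenceTransformer("adarshheg/bge-base-financial-matryoshka", truncate_dim=256)

embeddings = model.encode([
    "What recognition did FedEx receive from FORTUNE magazine in 2023?",
])
print(embeddings.shape)  # (1, 256)
```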
## Evaluation

### Metrics

#### Information Retrieval

- Dataset: `dim_768` (evaluation at the full 768 dimensions)
| Metric | Value |
|:--------------------|:-------|
| cosine_accuracy@1   | 0.7767 |
| cosine_accuracy@3   | 0.86   |
| cosine_accuracy@5   | 0.89   |
| cosine_accuracy@10  | 0.9333 |
| cosine_precision@1  | 0.7767 |
| cosine_precision@3  | 0.2867 |
| cosine_precision@5  | 0.178  |
| cosine_precision@10 | 0.0933 |
| cosine_recall@1     | 0.7767 |
| cosine_recall@3     | 0.86   |
| cosine_recall@5     | 0.89   |
| cosine_recall@10    | 0.9333 |
| cosine_ndcg@10      | 0.852  |
| cosine_mrr@10       | 0.8264 |
| cosine_map@100      | 0.8286 |
#### Information Retrieval

- Dataset: `dim_512` (embeddings truncated to 512 dimensions)

| Metric | Value |
|:--------------------|:-------|
| cosine_accuracy@1   | 0.7567 |
| cosine_accuracy@3   | 0.87   |
| cosine_accuracy@5   | 0.8933 |
| cosine_accuracy@10  | 0.9333 |
| cosine_precision@1  | 0.7567 |
| cosine_precision@3  | 0.29   |
| cosine_precision@5  | 0.1787 |
| cosine_precision@10 | 0.0933 |
| cosine_recall@1     | 0.7567 |
| cosine_recall@3     | 0.87   |
| cosine_recall@5     | 0.8933 |
| cosine_recall@10    | 0.9333 |
| cosine_ndcg@10      | 0.8462 |
| cosine_mrr@10       | 0.8183 |
| cosine_map@100      | 0.8207 |
#### Information Retrieval

- Dataset: `dim_256` (embeddings truncated to 256 dimensions)

| Metric | Value |
|:--------------------|:-------|
| cosine_accuracy@1   | 0.76   |
| cosine_accuracy@3   | 0.86   |
| cosine_accuracy@5   | 0.89   |
| cosine_accuracy@10  | 0.9267 |
| cosine_precision@1  | 0.76   |
| cosine_precision@3  | 0.2867 |
| cosine_precision@5  | 0.178  |
| cosine_precision@10 | 0.0927 |
| cosine_recall@1     | 0.76   |
| cosine_recall@3     | 0.86   |
| cosine_recall@5     | 0.89   |
| cosine_recall@10    | 0.9267 |
| cosine_ndcg@10      | 0.8433 |
| cosine_mrr@10       | 0.8167 |
| cosine_map@100      | 0.8191 |
#### Information Retrieval

- Dataset: `dim_128` (embeddings truncated to 128 dimensions)

| Metric | Value |
|:--------------------|:-------|
| cosine_accuracy@1   | 0.7067 |
| cosine_accuracy@3   | 0.84   |
| cosine_accuracy@5   | 0.8633 |
| cosine_accuracy@10  | 0.91   |
| cosine_precision@1  | 0.7067 |
| cosine_precision@3  | 0.28   |
| cosine_precision@5  | 0.1727 |
| cosine_precision@10 | 0.091  |
| cosine_recall@1     | 0.7067 |
| cosine_recall@3     | 0.84   |
| cosine_recall@5     | 0.8633 |
| cosine_recall@10    | 0.91   |
| cosine_ndcg@10      | 0.8099 |
| cosine_mrr@10       | 0.7776 |
| cosine_map@100      | 0.781  |
#### Information Retrieval

- Dataset: `dim_64` (embeddings truncated to 64 dimensions)

| Metric | Value |
|:--------------------|:-------|
| cosine_accuracy@1   | 0.6833 |
| cosine_accuracy@3   | 0.7933 |
| cosine_accuracy@5   | 0.8367 |
| cosine_accuracy@10  | 0.88   |
| cosine_precision@1  | 0.6833 |
| cosine_precision@3  | 0.2644 |
| cosine_precision@5  | 0.1673 |
| cosine_precision@10 | 0.088  |
| cosine_recall@1     | 0.6833 |
| cosine_recall@3     | 0.7933 |
| cosine_recall@5     | 0.8367 |
| cosine_recall@10    | 0.88   |
| cosine_ndcg@10      | 0.7796 |
| cosine_mrr@10       | 0.7476 |
| cosine_map@100      | 0.7519 |
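
Metrics like the above can be reproduced with the library's `InformationRetrievalEvaluator`. The snippet below is a sketch with a toy corpus, since the actual held-out query/passage pairs are not included in this card:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("adarshheg/bge-base-financial-matryoshka")

# Toy data for illustration only; keys are arbitrary string ids
queries = {"q1": "What recognition did FedEx receive from FORTUNE magazine in 2023?"}
corpus = {
    "d1": "During 2023, FedEx ranked 18th in FORTUNE magazine's "
          "'World's Most Admired Companies' list.",
    "d2": "The depreciation and amortization expense for 2021 was $3,103.",
}
relevant_docs = {"q1": {"d1"}}

# truncate_dim selects which Matryoshka size to evaluate (768, 512, 256, ...)
evaluator = InformationRetrievalEvaluator(
    queries, corpus, relevant_docs, name="dim_768", truncate_dim=768
)
results = evaluator(model)
print(results)  # {'dim_768_cosine_accuracy@1': ..., 'dim_768_cosine_ndcg@10': ..., ...}
```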
## Training Details

### Training Dataset

#### Unnamed Dataset

- Size: 1,500 training samples
- Columns: `positive` and `anchor`
- Approximate statistics based on the first 1000 samples:

  |         | positive | anchor |
  |:--------|:---------|:-------|
  | type    | string   | string |
  | details | min: 6 tokens, mean: 46.0 tokens, max: 239 tokens | min: 9 tokens, mean: 20.82 tokens, max: 42 tokens |

- Samples:

  | positive | anchor |
  |:---------|:-------|
  | In the U.S., Visa Inc.'s total nominal payments volume increased by 17% from $4,725 billion in 2021 to $5,548 billion in 2022. | What is the total percentage increase in Visa Inc.'s nominal payments volume in the U.S. from 2021 to 2022? |
  | The section titled 'Financial Wtatement and Supplementary Data' is labeled with the number 39 in the document. | What is the numerical label associated with the section on Financial Statements and Supplementary Data in the document? |
  | The consolidated financial statements and accompanying notes are incorporated by reference herein. | Are the consolidated financial statements and accompanying notes incorporated by reference in the Annual Report on Form 10-K? |
- Loss: `MatryoshkaLoss` with these parameters:

  ```json
  {
      "loss": "MultipleNegativesRankingLoss",
      "matryoshka_dims": [768, 512, 256, 128, 64],
      "matryoshka_weights": [1, 1, 1, 1, 1],
      "n_dims_per_step": -1
  }
  ```
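
These parameters correspond roughly to the following loss construction (a sketch, not the original training script): `MultipleNegativesRankingLoss` treats all other in-batch pairs as negatives, and `MatryoshkaLoss` re-applies it at each truncated embedding size with equal weight.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# In-batch negatives ranking loss, applied at every Matryoshka dimension
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
)
```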
### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 2
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `tf32`: False
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
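
As a sketch, these non-default values map onto `SentenceTransformerTrainingArguments` as follows; `output_dir` and `save_strategy` are assumptions, since the card does not state them (the Trainer requires the save strategy to match `eval_strategy` when `load_best_model_at_end=True`):

```python
from sentence_transformers.training_args import (
    BatchSamplers,
    SentenceTransformerTrainingArguments,
)

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-financial-matryoshka",  # assumption: not stated in the card
    eval_strategy="epoch",
    save_strategy="epoch",  # assumption: must match eval_strategy
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    num_train_epochs=2,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    tf32=False,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no duplicate samples within a batch
)
```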
#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 16
- `eval_accumulation_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 2
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: False
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
### Training Logs

| Epoch | Step | dim_128_cosine_map@100 | dim_256_cosine_map@100 | dim_512_cosine_map@100 | dim_64_cosine_map@100 | dim_768_cosine_map@100 |
|:----------:|:-----:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|:----------------------:|
| 0.6809     | 2     | 0.7796                 | 0.8153                 | 0.8165                 | 0.7375                | 0.8186                 |
| **1.3617** | **4** | **0.781**              | **0.8191**             | **0.8207**             | **0.7519**            | **0.8286**             |

- The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.41.2
- PyTorch: 2.1.2+cu121
- Accelerate: 0.33.0
- Datasets: 2.19.1
- Tokenizers: 0.19.1
## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
    title = {Matryoshka Representation Learning},
    author = {Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year = {2024},
    eprint = {2205.13147},
    archivePrefix = {arXiv},
    primaryClass = {cs.LG}
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title = {Efficient Natural Language Response Suggestion for Smart Reply},
    author = {Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year = {2017},
    eprint = {1705.00652},
    archivePrefix = {arXiv},
    primaryClass = {cs.CL}
}
```