metadata
base_model: BAAI/bge-large-en
datasets: []
language: []
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:574
  - loss:CosineSimilarityLoss
widget:
  - source_sentence: What is mentioned regarding the patent errors?
    sentences:
      - the Schedule to the Indian Medical Council Act
      - >-
        shall take upon himself and provide for the risk of any error which may
        subsequently be discovered and shall make no subsequent claim on account
        thereof.
      - Omissions and Descrepancies
  - source_sentence: Is there a way to claim consequential losses?
    sentences:
      - >-
        The Railway reserves the right of not to invite tenders for any of
        Railway work or works or to invite open or limited tenders
      - >-
        entitle the Contractor to damages or compensation therefor, but in any
        such case, the Railway may grant such extension or extensions of the
        completion date as may be considered reasonable.
      - "The Railway shall have the right to let other contracts in connection with the works. The Contractor shall afford other Contractors reasonable opportunity for the storage of their materials and the execution of their works and shall properly connect and coordinate his work with theirs. If any part of the Contractor\x92s work depends upon proper execution or result upon the work of another Contractor(s), the Contractor shall inspect and promptly report to the Engineer any defects in such works that render it unsuitable for such proper execution and results. The Contractor's failure so-to inspect and report shall constitute an acceptance of the other Contractor's work as fit and proper for the reception of his work, except as to defects which may develop in the other Contractor's work after the execution of his work."
  - source_sentence: Does the contract document contain a indemnification clause provision?
    sentences:
      - >-
        The partners of the firm to which the Letter of Acceptance (LOA) is
        issued, shall be jointly and severally liable to the Railway for
        execution of the contract
      - >-
        Contractor awarded the work shall submit a detailed program of work
        indicating the time schedule
      - >-
        All notices, communications, reference and complaints made by the
        Railway or the Engineer or the Engineer's Representative or the
        Contractor inter-se concerning the works shall be in writing or e-mail
        on registered e mail IDs and no notice, communication, reference or
        complaint not in writing or through e-mail, shall be recognized.
  - source_sentence: Force Majeure
    sentences:
      - >-
        These Regulations for Tenders and Contracts shall be read in conjunction
        with the Standard General Conditions of Contract which are referred to
        herein and shall be subject to modifications additions or suppression by
        Special Conditions of Contract and/or Special Specifications, if any,
        annexed to the Tender Forms.
      - Act of God
      - >-
        Instructions: The Engineer shall direct the order in which the several
        parts of the works shall be executed
  - source_sentence: Interpretation of Standard General Conditions of contract
    sentences:
      - >-
        The Contractor shall at his own expense provide himself with sheds,
        storehouses and yards in such situations and in such numbers as in the
        opinion of the Engineer is requisite for carrying on the works and the
        Contractor
      - >-
        What are the additional documents that have to be read along with the
        Standard General Conditions of Contract?
      - >-
        the necessity arises for the execution of such items of works that the
        accepted Schedule of Rates does not include rate or rates for the extra
        work involved.

SentenceTransformer based on BAAI/bge-large-en

This is a sentence-transformers model finetuned from BAAI/bge-large-en. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-large-en
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: https://sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
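
The same stack can be assembled by hand from sentence-transformers building blocks, which makes the module settings above explicit. A minimal sketch, assuming you start from the base model rather than this fine-tuned checkpoint:

from sentence_transformers import SentenceTransformer, models

# Transformer module: BERT backbone, 512-token limit, lowercased input
word_embedding_model = models.Transformer(
    "BAAI/bge-large-en",
    max_seq_length=512,
    do_lower_case=True,
)

# Pooling module: take the [CLS] token embedding (1024 dimensions) as the sentence vector
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),
    pooling_mode_cls_token=True,
    pooling_mode_mean_tokens=False,
)

# Normalize module: L2-normalize embeddings so dot product equals cosine similarity
model = SentenceTransformer(modules=[word_embedding_model, pooling_model, models.Normalize()])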

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Ananthu357/Ananthus-BAAI-for-contracts7.0")
# Run inference
sentences = [
    'Interpretation of Standard General Conditions of contract',
    'What are the additional documents that have to be read along with the Standard General Conditions of Contract?',
    'The Contractor shall at his own expense provide himself with sheds, storehouses and yards in such situations and in such numbers as in the opinion of the Engineer is requisite for carrying on the works and the Contractor',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
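
Because the embeddings are L2-normalized and compared with cosine similarity, the model can also be used for semantic search over a set of contract clauses. A minimal sketch; the query, clauses and top_k below are illustrative placeholders, not drawn from the training data:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Ananthu357/Ananthus-BAAI-for-contracts7.0")

# Illustrative corpus of clauses to search over
clauses = [
    "The Contractor shall submit a detailed program of work indicating the time schedule.",
    "The Railway may grant such extension of the completion date as may be considered reasonable.",
    "All notices and communications concerning the works shall be in writing or by e-mail.",
]
query = "Is there a way to claim consequential losses?"

clause_embeddings = model.encode(clauses, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Retrieve the two clauses most similar to the query
hits = util.semantic_search(query_embedding, clause_embeddings, top_k=2)[0]
for hit in hits:
    print(f"{hit['score']:.4f}  {clauses[hit['corpus_id']]}")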

Training Details

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • num_train_epochs: 15
  • warmup_ratio: 0.1
  • fp16: True
  • batch_sampler: no_duplicates
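
As a rough guide to how these values map onto the Sentence Transformers v3 training API, here is a hedged sketch; the tiny inline dataset, the score values and the output directory are placeholders (the actual 574-pair training set and script are not published):

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CosineSimilarityLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("BAAI/bge-large-en")
loss = CosineSimilarityLoss(model)

# Placeholder data in the (sentence1, sentence2, score) format expected by CosineSimilarityLoss
train_dataset = Dataset.from_dict({
    "sentence1": ["Force Majeure", "Is there a way to claim consequential losses?"],
    "sentence2": ["Act of God", "entitle the Contractor to damages or compensation therefor"],
    "score": [1.0, 0.5],
})

args = SentenceTransformerTrainingArguments(
    output_dir="bge-large-en-contracts",  # placeholder output path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=15,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=train_dataset,  # placeholder; a held-out split was used in practice
    loss=loss,
)
trainer.train()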

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 15
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch     Step   Training Loss   Validation Loss
2.7778    100    0.0571          0.0611
5.5556    200    0.0073          0.0604
8.3333    300    0.0031          0.0578
11.1111   400    0.0019          0.0589
13.8889   500    0.0013          0.0587

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.42.4
  • PyTorch: 2.3.1+cu121
  • Accelerate: 0.32.1
  • Datasets: 2.21.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}