SentenceTransformer based on intfloat/multilingual-e5-large-instruct

This is a sentence-transformers model finetuned from intfloat/multilingual-e5-large-instruct on the mnlp_m3_rag_dataset dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: intfloat/multilingual-e5-large-instruct
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity
  • Number of Parameters: ~560M (F32)
  • Training Dataset: mnlp_m3_rag_dataset

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/DoDucAnh/MNLP_M3_document_encoder

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
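
Reading the stack top to bottom: the XLM-R encoder produces token embeddings, the pooling layer averages them over the attention mask (mean pooling), and Normalize() L2-normalizes the result. The snippet below is a rough equivalent written against plain transformers, assuming the repository's encoder weights load via AutoModel; it is a sketch of the pooling logic, not the recommended inference path.

from transformers import AutoTokenizer, AutoModel
import torch
import torch.nn.functional as F

tokenizer = AutoTokenizer.from_pretrained("DoDucAnh/MNLP_M3_document_encoder")
encoder = AutoModel.from_pretrained("DoDucAnh/MNLP_M3_document_encoder")

batch = tokenizer(["example sentence"], padding=True, truncation=True,
                  max_length=512, return_tensors="pt")
with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state          # (batch, seq_len, 1024)

mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(1) / mask.sum(1)        # mean pooling over real tokens
embeddings = F.normalize(embeddings, p=2, dim=1)                   # Normalize() module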

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("DoDucAnh/MNLP_M3_document_encoder")
# Run inference
sentences = [
    "A builder had a contract to build a swimming pool for a residential customer. That customer's next door neighbor went to the builder and paid him extra to break the contract with the customer and instead to build a swimming pool on the neighbor's premises. The builder commenced building a swimming pool for the neighbor and breached his contract with the original customer. The original customer sued his neighbor in a tort claim for damages. Does the original customer have a valid claim against his neighbor?\nA. Yes, the neighbor committed the tort of interference with contract relations by intentionally interfering with an existing contract.\nB. No, people cannot be held in slavery\nC. they have the right to contract with whomever they please.\nD. No, the only remedy for the original customer is to sue the builder for breach of contract.\nE. Yes, the neighbor committed the tort of interference with prospective advantage.",
    'A tort is a civil wrong that causes harm or loss to another person, resulting in legal liability for the person who commits the tort. Tort law allows individuals to seek compensation for injuries or damages caused by the wrongful acts of others, distinct from breaches of contract.',
    'Substance use, such as alcohol and tobacco, during pregnancy can lead to various complications including low birth weight, developmental issues, and increased risk of infections, highlighting the importance of cessation and support for affected mothers.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
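
Since the card lists semantic search among the intended uses, here is a small, self-contained retrieval sketch; the query and document strings and the top_k value are purely illustrative.

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("DoDucAnh/MNLP_M3_document_encoder")

query = "What is a tort in civil law?"   # illustrative query
documents = [
    "A tort is a civil wrong that causes harm or loss to another person, resulting in legal liability.",
    "Substance use during pregnancy can lead to complications such as low birth weight.",
]

query_emb = model.encode([query])
doc_embs = model.encode(documents)

# Rank the documents against the query (cosine similarity by default)
hits = util.semantic_search(query_emb, doc_embs, top_k=2)[0]
for hit in hits:
    print(documents[hit["corpus_id"]], hit["score"])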

Training Details

Training Dataset

mnlp_m3_rag_dataset

  • Dataset: mnlp_m3_rag_dataset at e16d937
  • Size: 594,028 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor: string, min 21 tokens, mean 359.4 tokens, max 512 tokens
    • positive: string, min 4 tokens, mean 56.63 tokens, max 433 tokens
  • Samples:

    Sample 1
    • anchor: Little Lopsy fluttered into our home and our hearts one Saturday morning this summer. My husband went out to do something, and when he opened the door there was a great flutter on the ground and something came into the living room. It was clear that whatever it was was hurt. I was in a bit of a shock and didn't know what to do next. Fortunately it calmed down and tried to hide itself in a corner. I realized it was a sparrow chick. There are a few sparrow nests under the roof of our apartment, and this little fellow must have fallen out and hurt itself. It was also very young, and obviously far from ready to leave the safety of the nest. I ran to the place and found a box. Having read somewhere that one shouldn't touch a baby bird with one's hands, I picked the chick up with a hand towel and put it in the box. I placed the box outside the front door in the hope that the parents would try to feed it. They never came near it and I brought it inside. I placed the box on a table and it sl...
    • positive: Having read somewhere that one shouldn't touch a baby bird with one's hands, I picked the chick up with a hand towel and put it in the box.

    Sample 2
    • anchor: A thermal conductor is made of
      A. types of rubber
      B. types of wire
      C. electrodes
      D. that which conducts
    • positive: A thermal conductor is a material that allows heat to flow through it easily. Common examples of thermal conductors include metals such as copper and aluminum, known for their high thermal conductivity due to their free-flowing electrons. Heat transfer occurs via conduction when heat energy moves from the hotter part of a conductor to the cooler part, often described by Fourier's Law of heat conduction.

    Sample 3
    • anchor: A good example of increased demand may equal increased production is
      A. soldiers eat beans, so beans are planted when there is war
      B. dogs eat kibble, so stores sell it
      C. cats eat mice, so mice are afraid of cats
      D. people have babies, so baby clothes are made
    • positive: Supply is the total amount of a specific good or service that is available to consumers. Supply can relate to the amount available at a specific price or the amount available across a range of prices if displayed on a graph.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
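
MultipleNegativesRankingLoss treats every other positive in a batch as an in-batch negative for a given anchor, which is why only (anchor, positive) pairs are needed. A minimal sketch of instantiating the loss with the parameters above (the base model id comes from this card; everything else is illustrative):

from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("intfloat/multilingual-e5-large-instruct")
# scale=20.0 and cosine similarity match the parameters listed above
loss = MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)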
    

Evaluation Dataset

mnlp_m3_rag_dataset

  • Dataset: mnlp_m3_rag_dataset at e16d937
  • Size: 5,920 evaluation samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor: string, min 22 tokens, mean 98.74 tokens, max 512 tokens
    • positive: string, min 9 tokens, mean 59.88 tokens, max 501 tokens
  • Samples:

    Sample 1 (Malayalam)
    • anchor: ക്രൂരകോഷ്ഠം ഉള്ള ഒരാളിൽ കോപിച്ചിരിക്കുന്ന ദോഷം താഴെപ്പറയുന്നവയിൽ ഏതാണ്?
      A. കഫം
      B. പിത്തം
      C. വാതം
      D. രക്തം
      (English: Which of the following doshas is aggravated in a person with krura koshtha? A. Kapha B. Pitta C. Vata D. Rakta)
    • positive: ഓരോ ദോഷത്തിനും അതിന്റേതായ സ്വഭാവങ്ങളും ശരീരത്തിൽ അത് ഉണ്ടാക്കുന്ന ഫലങ്ങളും ഉണ്ട്.
      (English: Each dosha has its own characteristics and its own effects on the body.)

    Sample 2 (Hungarian)
    • anchor: Melyik tényező nem befolyásolja a fagylalt keresleti függvényét?
      A. A fagylalt árának változása.
      B. Mindegyik tényező befolyásolja.
      C. A jégkrém árának változása.
      D. A fagylalttölcsér árának változása.
      (English: Which factor does not affect the demand function for ice cream? A. A change in the price of ice cream. B. All of these factors affect it. C. A change in the price of packaged ice cream. D. A change in the price of ice cream cones.)
    • positive: A keresleti függvény negatív meredekségű, ami azt jelenti, hogy az ár növekedésével a keresett mennyiség csökken (csökkenő kereslet törvénye).
      (English: The demand function has a negative slope, meaning that as the price rises the quantity demanded falls (the law of demand).)

    Sample 3
    • anchor: In contrast to _______, _______ aim to reward favourable behaviour by companies. The success of such campaigns have been heightened through the use of ___________, which allow campaigns to facilitate the company in achieving _________ .
      A. Boycotts, Buyalls, Blockchain technology, Increased Sales
      B. Buycotts, Boycotts, Digital technology, Decreased Sales
      C. Boycotts, Buycotts, Digital technology, Decreased Sales
      D. Buycotts, Boycotts, Blockchain technology, Charitable donations
      E. Boycotts, Buyalls, Blockchain technology, Charitable donations
      F. Boycotts, Buycotts, Digital technology, Increased Sales
      G. Buycotts, Boycotts, Digital technology, Increased Sales
      H. Boycotts, Buycotts, Physical technology, Increased Sales
      I. Buycotts, Buyalls, Blockchain technology, Charitable donations
      J. Boycotts, Buycotts, Blockchain technology, Decreased Sales
    • positive: Consumer Activism: This term refers to the actions taken by consumers to promote social, political, or environmental causes. These actions can include boycotting certain companies or buycotting others, influencing market dynamics based on ethical considerations. The effectiveness of consumer activism can vary but has gained prominence in recent years with increased visibility through social media.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • gradient_accumulation_steps: 2
  • learning_rate: 2e-05
  • warmup_steps: 5569
  • fp16: True
  • load_best_model_at_end: True
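
These settings map onto SentenceTransformerTrainingArguments fields. The sketch below shows how such a run could be reconstructed; the dataset hub id, split names, and output directory are placeholders, and only the non-default values listed above are set explicitly.

from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Placeholder hub id and split names; the card only names the data as
# "mnlp_m3_rag_dataset" (revision e16d937) with anchor/positive columns
dataset = load_dataset("mnlp_m3_rag_dataset")
train_dataset = dataset["train"]
eval_dataset = dataset["validation"]

model = SentenceTransformer("intfloat/multilingual-e5-large-instruct")
loss = MultipleNegativesRankingLoss(model)

args = SentenceTransformerTrainingArguments(
    output_dir="output",                 # placeholder
    num_train_epochs=3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,
    learning_rate=2e-5,
    warmup_steps=5569,
    fp16=True,
    eval_strategy="steps",
    load_best_model_at_end=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()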

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 2
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 5569
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss
0.15 2785 0.2684 0.0411
0.3001 5570 0.1112 0.0541
0.4501 8355 0.1153 0.0633
0.6001 11140 0.1045 0.0582
0.7501 13925 0.0943 0.0606
0.9002 16710 0.0883 0.0563
1.0502 19495 0.0744 0.0505
1.2002 22280 0.0592 0.0523
1.3502 25065 0.059 0.0516
1.5002 27850 0.0544 0.0617
1.6503 30635 0.0521 0.0549
1.8003 33420 0.0502 0.0589
1.9503 36205 0.0449 0.0550
2.1003 38990 0.0369 0.0619
2.2503 41775 0.0331 0.0604
2.4004 44560 0.0308 0.0566
2.5504 47345 0.0294 0.0533
2.7004 50130 0.0286 0.0531
2.8504 52915 0.0266 0.0537
  • With load_best_model_at_end enabled, the saved checkpoint is the row with the lowest validation loss. Training took 6 h 52 min on a single RTX 5090.

Framework Versions

  • Python: 3.12.3
  • Sentence Transformers: 4.1.0
  • Transformers: 4.52.4
  • PyTorch: 2.7.0+cu128
  • Accelerate: 1.7.0
  • Datasets: 3.6.0
  • Tokenizers: 0.21.1
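
To approximate this environment, the listed versions can be pinned at install time (note that the PyTorch build above is the CUDA 12.8 wheel, distributed through the PyTorch package index rather than PyPI):

pip install sentence-transformers==4.1.0 transformers==4.52.4 accelerate==1.7.0 datasets==3.6.0 tokenizers==0.21.1 torch==2.7.0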

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}