---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:182343
- loss:CategoricalContrastiveLoss
widget:
- source_sentence: 科目:コンクリート。名称:浮き床コンクリート。
  sentences:
  - 科目:コンクリート。名称:普通コンクリート。摘要:JIS A5308 FC21 S18粗骨材20。備考:刊-コン 2118EXP.J面コン。
  - 科目:コンクリート。名称:充填コンクリート(EXP_J内)。
  - 科目:コンクリート。名称:免震上部コンクリート打設手間。
- source_sentence: 科目:コンクリート。名称:基礎部コンクリート打設手間。
  sentences:
  - 科目:コンクリート。名称:基礎部高流動コンクリート。摘要:FC36N/mm2 スランプフロー55~65高性能AE減水剤。備考:代価表 0059。
  - 科目:コンクリート。名称:基礎部コンクリート。摘要:FC24N/mm2 スランプ15。備考:代価表 0040。
  - 科目:コンクリート。名称:コンクリート(個別)。摘要:F0=18N/mm2 S=18 徳島1。備考:B1-111111 H2906BD 個別嵩上げコンクリート。
- source_sentence: 科目:コンクリート。名称:浮き床コンクリート。
  sentences:
  - 科目:コンクリート。名称:コンクリートポンプ圧送。摘要:100m3/回以上基本料金別途加算。備考:B0-434226 No.1 市場地上部コン(5F)。
  - 科目:コンクリート。名称:普通コンクリート。摘要:JIS A5308 FC48 フロー60粗骨材20高性能AE減水剤。備考:刊-コン 4860K免震装置下部コン。
  - 科目:コンクリート。名称:防水押えコンクリート。
- source_sentence: 科目:コンクリート。名称:基礎コンクリート。
  sentences:
  - 科目:コンクリート。名称:均しコンクリート。
  - 科目:コンクリート。名称:普通コンクリート。摘要:FC=24 S18粗骨材地上部。備考:代価表 0060。
  - 科目:コンクリート。名称:基礎部高流動コンクリート。摘要:FC36N/mm2 スランプフロー55~65高性能AE減水剤。備考:代価表 0059。
- source_sentence: 科目:コンクリート。名称:コンクリート打設手間。
  sentences:
  - 科目:コンクリート。名称:免震上部コンクリート打設手間。
  - 科目:コンクリート。名称:普通コンクリート。摘要:JIS A5308 FC=18 S15粗骨材20。備考:B0-114112 H22.11 協議防水保護コンクリート。
  - 科目:コンクリート。名称:コンクリート打設手間。
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer

This is a [sentence-transformers](https://www.SBERT.net) model trained on a dataset of 182,343 labeled sentence pairs. It maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details
### Model Description

- **Model Type:** Sentence Transformer
<!-- - **Base model:** [Unknown](https://huggingface.co/unknown) -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
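The Pooling module above takes the hidden state of the first (`[CLS]`) token of the BERT encoder as the sentence embedding. As a rough sketch of that step, assuming the repository exposes a standard BERT checkpoint and tokenizer loadable with 🤗 Transformers, the embedding can be approximated without the Sentence Transformers wrapper; the supported path remains the Sentence Transformers snippet in the Usage section below.

```python
# Hedged sketch: manual CLS-token pooling with plain 🤗 Transformers.
# Assumes the repo ships a standard BertModel checkpoint and tokenizer.
import torch
from transformers import AutoModel, AutoTokenizer

repo = "Detomo/cl-nagoya-sup-simcse-ja-nss-v1_0_7_8"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)

batch = tokenizer(
    ["科目:コンクリート。名称:コンクリート打設手間。"],
    padding=True, truncation=True, max_length=512, return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**batch)

# pooling_mode_cls_token=True -> take the hidden state of the first ([CLS]) token
cls_embedding = outputs.last_hidden_state[:, 0]  # shape: (1, 768)
print(cls_embedding.shape)
```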
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Detomo/cl-nagoya-sup-simcse-ja-nss-v1_0_7_8")

# Run inference
sentences = [
    '科目:コンクリート。名称:コンクリート打設手間。',
    '科目:コンクリート。名称:免震上部コンクリート打設手間。',
    '科目:コンクリート。名称:普通コンクリート。摘要:JIS A5308 FC=18 S15粗骨材20。備考:B0-114112 H22.11 協議防水保護コンクリート。',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
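The same two calls cover a simple semantic-search pattern: embed one query item and a list of candidate descriptions, then rank the candidates by cosine similarity. The sketch below reuses strings from the widget examples above purely for illustration; the candidate list and the ranking loop are not part of the original card.

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Detomo/cl-nagoya-sup-simcse-ja-nss-v1_0_7_8")

query = "科目:コンクリート。名称:基礎コンクリート。"
candidates = [
    "科目:コンクリート。名称:均しコンクリート。",
    "科目:コンクリート。名称:基礎部高流動コンクリート。摘要:FC36N/mm2 スランプフロー55~65高性能AE減水剤。備考:代価表 0059。",
    "科目:コンクリート。名称:防水押えコンクリート。",
]

# Embed the query and candidates, then rank candidates by cosine similarity
query_emb = model.encode([query])
cand_embs = model.encode(candidates)
scores = model.similarity(query_emb, cand_embs)[0]  # shape: (len(candidates),)

ranked = sorted(zip(candidates, scores.tolist()), key=lambda x: x[1], reverse=True)
for text, score in ranked:
    print(f"{score:.3f}  {text}")
```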
<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 182,343 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                           | sentence2                                                                          | label                                                              |
  |:--------|:------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:--------------------------------------------------------------------|
  | type    | string                                                                              | string                                                                             | int                                                                |
  | details | <ul><li>min: 11 tokens</li><li>mean: 13.32 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 34.8 tokens</li><li>max: 72 tokens</li></ul> | <ul><li>0: ~68.50%</li><li>1: ~4.50%</li><li>2: ~27.00%</li></ul> |
* Samples:
  | sentence1 | sentence2 | label |
  |:-----------------------------------------|:-------------------------------------------------------------------------------------------------|:---------------|
  | <code>科目:コンクリート。名称:コンクリートポンプ圧送。</code> | <code>科目:コンクリート。名称:ポンプ圧送。</code> | <code>1</code> |
  | <code>科目:コンクリート。名称:コンクリートポンプ圧送。</code> | <code>科目:コンクリート。名称:コンクリートポンプ圧送。摘要:100m3/回以上基本料金別途加算。備考:B0-434226 No.1 市場免震層下部コン。</code> | <code>2</code> |
  | <code>科目:コンクリート。名称:コンクリートポンプ圧送。</code> | <code>科目:コンクリート。名称:コンクリートポンプ圧送。摘要:100m3/回以上基本料金別途加算。備考:B0-434226 No.1 市場基礎部マスコン。</code> | <code>2</code> |
* Loss: <code>sentence_transformer_lib.categorical_constrastive_loss.CategoricalContrastiveLoss</code>
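Each sample pairs a short item description (`sentence1`) with a candidate description (`sentence2`) and an integer label in {0, 1, 2}. As a minimal sketch, assuming you want to rebuild a dataset with the same schema from your own records (the two rows shown are copied from the samples above), it could look like this:

```python
from datasets import Dataset

# Columns must match the schema above: sentence1, sentence2, label (int in {0, 1, 2})
train_dataset = Dataset.from_dict({
    "sentence1": [
        "科目:コンクリート。名称:コンクリートポンプ圧送。",
        "科目:コンクリート。名称:コンクリートポンプ圧送。",
    ],
    "sentence2": [
        "科目:コンクリート。名称:ポンプ圧送。",
        "科目:コンクリート。名称:コンクリートポンプ圧送。摘要:100m3/回以上基本料金別途加算。備考:B0-434226 No.1 市場免震層下部コン。",
    ],
    "label": [1, 2],
})
print(train_dataset)
```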
### Training Hyperparameters

#### Non-Default Hyperparameters

- `per_device_train_batch_size`: 256
- `per_device_eval_batch_size`: 256
- `learning_rate`: 1e-05
- `weight_decay`: 0.01
- `num_train_epochs`: 10
- `warmup_ratio`: 0.2
- `fp16`: True
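These values map directly onto `SentenceTransformerTrainingArguments`. The sketch below shows how they could be passed to the Sentence Transformers v4 trainer; the `output_dir`, the base checkpoint, and the constructor signature of the custom `CategoricalContrastiveLoss` are assumptions, not taken from this card.

```python
# Hedged sketch: reproducing the non-default hyperparameters above with the
# Sentence Transformers trainer. The custom loss lives in the project-specific
# sentence_transformer_lib package; its constructor arguments are assumed here.
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.training_args import SentenceTransformerTrainingArguments
# from sentence_transformer_lib.categorical_constrastive_loss import CategoricalContrastiveLoss

args = SentenceTransformerTrainingArguments(
    output_dir="outputs",              # assumed output directory
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    learning_rate=1e-5,
    weight_decay=0.01,
    num_train_epochs=10,
    warmup_ratio=0.2,
    fp16=True,
)

# model = SentenceTransformer("...base checkpoint...")   # base model not stated in this card
# loss = CategoricalContrastiveLoss(model)               # assumed constructor
# trainer = SentenceTransformerTrainer(model=model, args=args,
#                                      train_dataset=train_dataset, loss=loss)
# trainer.train()
```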
#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 256
- `per_device_eval_batch_size`: 256
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 1e-05
- `weight_decay`: 0.01
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 10
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.2
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `tp_size`: 0
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>
### Training Logs

| Epoch  | Step | Training Loss |
|:------:|:----:|:-------------:|
| 0.0701 | 50   | 0.2825        |
| 0.1403 | 100  | 0.1467        |
| 0.2104 | 150  | 0.0947        |
| 0.2805 | 200  | 0.0839        |
| 0.3506 | 250  | 0.0769        |
| 0.4208 | 300  | 0.0684        |
| 0.4909 | 350  | 0.0625        |
| 0.5610 | 400  | 0.0582        |
| 0.6311 | 450  | 0.0579        |
| 0.7013 | 500  | 0.0514        |
| 0.7714 | 550  | 0.0514        |
| 0.8415 | 600  | 0.0448        |
| 0.9116 | 650  | 0.0436        |
| 0.9818 | 700  | 0.0422        |
| 1.0519 | 750  | 0.0371        |
| 1.1220 | 800  | 0.0377        |
| 1.1921 | 850  | 0.0353        |
| 1.2623 | 900  | 0.0354        |
| 1.3324 | 950  | 0.0325        |
| 1.4025 | 1000 | 0.0328        |
| 1.4727 | 1050 | 0.0302        |
| 1.5428 | 1100 | 0.0259        |
| 1.6129 | 1150 | 0.0267        |
| 1.6830 | 1200 | 0.0274        |
| 1.7532 | 1250 | 0.0262        |
| 1.8233 | 1300 | 0.0234        |
| 1.8934 | 1350 | 0.0244        |
| 1.9635 | 1400 | 0.0238        |
| 2.0337 | 1450 | 0.02          |
| 2.1038 | 1500 | 0.0187        |
| 2.1739 | 1550 | 0.0185        |
| 2.2440 | 1600 | 0.0178        |
| 2.3142 | 1650 | 0.016         |
| 2.3843 | 1700 | 0.0169        |
| 2.4544 | 1750 | 0.0171        |
| 2.5245 | 1800 | 0.0146        |
| 2.5947 | 1850 | 0.0145        |
| 2.6648 | 1900 | 0.0146        |
| 2.7349 | 1950 | 0.0139        |
| 2.8050 | 2000 | 0.0119        |
| 2.8752 | 2050 | 0.0131        |
| 2.9453 | 2100 | 0.0124        |
| 3.0154 | 2150 | 0.011         |
| 3.0856 | 2200 | 0.0109        |
| 3.1557 | 2250 | 0.0103        |
| 3.2258 | 2300 | 0.0102        |
| 3.2959 | 2350 | 0.0089        |
| 3.3661 | 2400 | 0.0083        |
| 3.4362 | 2450 | 0.0095        |
| 3.5063 | 2500 | 0.0085        |
| 3.5764 | 2550 | 0.009         |
| 3.6466 | 2600 | 0.0083        |
| 3.7167 | 2650 | 0.0093        |
| 3.7868 | 2700 | 0.0084        |
| 3.8569 | 2750 | 0.0084        |
| 3.9271 | 2800 | 0.0088        |
| 3.9972 | 2850 | 0.0086        |
| 4.0673 | 2900 | 0.0057        |
| 4.1374 | 2950 | 0.0078        |
| 4.2076 | 3000 | 0.0062        |
| 4.2777 | 3050 | 0.0066        |
| 4.3478 | 3100 | 0.006         |
| 4.4180 | 3150 | 0.0078        |
| 4.4881 | 3200 | 0.0056        |
| 4.5582 | 3250 | 0.0064        |
| 4.6283 | 3300 | 0.0063        |
| 4.6985 | 3350 | 0.0058        |
| 4.7686 | 3400 | 0.005         |
| 4.8387 | 3450 | 0.0057        |
| 4.9088 | 3500 | 0.0059        |
| 4.9790 | 3550 | 0.0063        |
| 5.0491 | 3600 | 0.0046        |
| 5.1192 | 3650 | 0.0041        |
| 5.1893 | 3700 | 0.005         |
| 5.2595 | 3750 | 0.0043        |
| 5.3296 | 3800 | 0.0046        |
| 5.3997 | 3850 | 0.0041        |
| 5.4698 | 3900 | 0.006         |
| 5.5400 | 3950 | 0.0052        |
| 5.6101 | 4000 | 0.0043        |
### Framework Versions

- Python: 3.11.12
- Sentence Transformers: 4.1.0
- Transformers: 4.51.3
- PyTorch: 2.6.0+cu124
- Accelerate: 1.6.0
- Datasets: 2.14.4
- Tokenizers: 0.21.1
## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->