---
base_model: BAAI/bge-base-en-v1.5
language:
- en
library_name: sentence-transformers
license: apache-2.0
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:6300
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: Total company-operated stores | 711 | | 655
  sentences:
  - What type of financial documents are included in Part IV, Item 15(a)(1) of the Annual Report on Form 10-K?
  - What is the total number of company-operated stores as of January 28, 2024?
  - When does the 364-day facility entered into in August 2023 expire, and what is its total amount?
- source_sentence: GM empowers employees to 'Speak Up for Safety' through the Employee Safety Concern Process which makes it easier for employees to report potential safety issues or suggest improvements without fear of retaliation and ensures their safety every day.
  sentences:
  - What item number is associated with financial statements and supplementary data in documents?
  - How does GM promote safety and well-being among its employees?
  - What are the main features included in the Skills for Jobs initiative launched by Microsoft?
- source_sentence: Under the 2020 Plan, the exercise price of options granted is generally at least equal to the fair market value of the Company’s Class A common stock on the date of grant.
  sentences:
  - How is the exercise price for incentive stock options determined under Palantir Technologies Inc.’s 2020 Equity Incentive Plan?
  - What were the dividend amounts declared by AT&T for its preferred and common shares in December 2022 and December 2023?
  - What does Item 8 in a document usually represent?
- source_sentence: On December 22, 2022, the parties entered into a settlement agreement to resolve the lawsuit, which provides for a payment of $725 million by us. The settlement was approved by the court on October 10, 2023, and the payment was made in November 2023.
  sentences:
  - What is the purpose of GM's collaboration efforts at their Global Technical Center in Warren, Michigan?
  - How does the acquisition method affect the financial statements after a business acquisition?
  - What was the outcome of the 2019 consumer class action regarding the company's user data practices?
- source_sentence: Item 8, titled 'Financial Statements and Supplementary Data,' is followed by an index to these sections.
  sentences:
  - What section follows Item 8 in the document?
  - What is the total assets and shareholders' equity of Chubb Limited as of December 31, 2023?
  - How does AT&T emphasize diversity in its hiring practices?
model-index:
- name: BGE base Financial Matryoshka
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 768
      type: dim_768
    metrics:
    - type: cosine_accuracy@1
      value: 0.7385714285714285
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.8642857142857143
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.8942857142857142
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.9342857142857143
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.7385714285714285
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.28809523809523807
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.17885714285714285
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09342857142857142
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.7385714285714285
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.8642857142857143
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.8942857142857142
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.9342857142857143
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8387370920568787
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.8078395691609976
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.8102903092098301
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 512
      type: dim_512
    metrics:
    - type: cosine_accuracy@1
      value: 0.7414285714285714
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.8557142857142858
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.8942857142857142
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.9328571428571428
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.7414285714285714
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.2852380952380953
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.17885714285714285
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09328571428571426
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.7414285714285714
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.8557142857142858
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.8942857142857142
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.9328571428571428
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8380676321786823
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.8075895691609978
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.8101143502932845
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 256
      type: dim_256
    metrics:
    - type: cosine_accuracy@1
      value: 0.7357142857142858
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.85
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.8814285714285715
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.92
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.7357142857142858
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.2833333333333333
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.17628571428571424
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09199999999999998
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.7357142857142858
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.85
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.8814285714285715
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.92
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8286016704428653
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.7992942176870748
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.8028214002001232
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 128
      type: dim_128
    metrics:
    - type: cosine_accuracy@1
      value: 0.7142857142857143
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.84
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.87
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.9128571428571428
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.7142857142857143
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.28
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.174
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09128571428571428
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.7142857142857143
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.84
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.87
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.9128571428571428
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8153680997284491
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.7840521541950115
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.7875962124214356
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 64
      type: dim_64
    metrics:
    - type: cosine_accuracy@1
      value: 0.6771428571428572
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.8085714285714286
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.8371428571428572
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.8857142857142857
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.6771428571428572
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.26952380952380955
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.1674285714285714
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.08857142857142855
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.6771428571428572
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.8085714285714286
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.8371428571428572
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.8857142857142857
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.7840147713456539
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.7513815192743762
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.755682487136274
      name: Cosine Map@100
---

# BGE base Financial Matryoshka

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
  - json
- **Language:** en
- **License:** apache-2.0

### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("tessimago/bge-base-financial-matryoshka")
# Run inference
sentences = [
    "Item 8, titled 'Financial Statements and Supplementary Data,' is followed by an index to these sections.",
    'What section follows Item 8 in the document?',
    "What is the total assets and shareholders' equity of Chubb Limited as of December 31, 2023?",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
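Because this model was trained with `MatryoshkaLoss`, its embeddings can also be truncated to any of the trained dimensions (768, 512, 256, 128 or 64) for faster search and smaller indexes, at a small cost in retrieval quality (see the evaluation tables below). A minimal sketch, assuming a sentence-transformers version that supports the `truncate_dim` argument (v2.7+); the example sentences are illustrative:

```python
from sentence_transformers import SentenceTransformer

# Load the model so that every embedding is truncated to 256 dimensions.
# Any of the trained Matryoshka dimensions (768, 512, 256, 128, 64) can be used.
model = SentenceTransformer("tessimago/bge-base-financial-matryoshka", truncate_dim=256)

queries = ["What section follows Item 8 in the document?"]
docs = [
    "Item 8, titled 'Financial Statements and Supplementary Data,' is followed by an index to these sections.",
    "Net cash used in financing activities amounted to $1,600 million in fiscal year 2023.",
]

query_embeddings = model.encode(queries)
doc_embeddings = model.encode(docs)
print(query_embeddings.shape)
# (1, 256)

# Cosine similarity between the truncated query and document embeddings
print(model.similarity(query_embeddings, doc_embeddings))
```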
## Evaluation

### Metrics

#### Information Retrieval
* Dataset: `dim_768`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.7386     |
| cosine_accuracy@3   | 0.8643     |
| cosine_accuracy@5   | 0.8943     |
| cosine_accuracy@10  | 0.9343     |
| cosine_precision@1  | 0.7386     |
| cosine_precision@3  | 0.2881     |
| cosine_precision@5  | 0.1789     |
| cosine_precision@10 | 0.0934     |
| cosine_recall@1     | 0.7386     |
| cosine_recall@3     | 0.8643     |
| cosine_recall@5     | 0.8943     |
| cosine_recall@10    | 0.9343     |
| cosine_ndcg@10      | 0.8387     |
| cosine_mrr@10       | 0.8078     |
| **cosine_map@100**  | **0.8103** |

#### Information Retrieval
* Dataset: `dim_512`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.7414     |
| cosine_accuracy@3   | 0.8557     |
| cosine_accuracy@5   | 0.8943     |
| cosine_accuracy@10  | 0.9329     |
| cosine_precision@1  | 0.7414     |
| cosine_precision@3  | 0.2852     |
| cosine_precision@5  | 0.1789     |
| cosine_precision@10 | 0.0933     |
| cosine_recall@1     | 0.7414     |
| cosine_recall@3     | 0.8557     |
| cosine_recall@5     | 0.8943     |
| cosine_recall@10    | 0.9329     |
| cosine_ndcg@10      | 0.8381     |
| cosine_mrr@10       | 0.8076     |
| **cosine_map@100**  | **0.8101** |

#### Information Retrieval
* Dataset: `dim_256`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.7357     |
| cosine_accuracy@3   | 0.85       |
| cosine_accuracy@5   | 0.8814     |
| cosine_accuracy@10  | 0.92       |
| cosine_precision@1  | 0.7357     |
| cosine_precision@3  | 0.2833     |
| cosine_precision@5  | 0.1763     |
| cosine_precision@10 | 0.092      |
| cosine_recall@1     | 0.7357     |
| cosine_recall@3     | 0.85       |
| cosine_recall@5     | 0.8814     |
| cosine_recall@10    | 0.92       |
| cosine_ndcg@10      | 0.8286     |
| cosine_mrr@10       | 0.7993     |
| **cosine_map@100**  | **0.8028** |

#### Information Retrieval
* Dataset: `dim_128`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.7143     |
| cosine_accuracy@3   | 0.84       |
| cosine_accuracy@5   | 0.87       |
| cosine_accuracy@10  | 0.9129     |
| cosine_precision@1  | 0.7143     |
| cosine_precision@3  | 0.28       |
| cosine_precision@5  | 0.174      |
| cosine_precision@10 | 0.0913     |
| cosine_recall@1     | 0.7143     |
| cosine_recall@3     | 0.84       |
| cosine_recall@5     | 0.87       |
| cosine_recall@10    | 0.9129     |
| cosine_ndcg@10      | 0.8154     |
| cosine_mrr@10       | 0.7841     |
| **cosine_map@100**  | **0.7876** |

#### Information Retrieval
* Dataset: `dim_64`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.6771     |
| cosine_accuracy@3   | 0.8086     |
| cosine_accuracy@5   | 0.8371     |
| cosine_accuracy@10  | 0.8857     |
| cosine_precision@1  | 0.6771     |
| cosine_precision@3  | 0.2695     |
| cosine_precision@5  | 0.1674     |
| cosine_precision@10 | 0.0886     |
| cosine_recall@1     | 0.6771     |
| cosine_recall@3     | 0.8086     |
| cosine_recall@5     | 0.8371     |
| cosine_recall@10    | 0.8857     |
| cosine_ndcg@10      | 0.784      |
| cosine_mrr@10       | 0.7514     |
| **cosine_map@100**  | **0.7557** |
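The tables above were produced with Sentence Transformers' `InformationRetrievalEvaluator`, once per Matryoshka dimension. As a rough sketch of how a comparable evaluation could be run on your own held-out anchor/positive pairs (the query and corpus IDs below are placeholders, not part of this repository):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

# Truncate to one of the Matryoshka dimensions to reproduce e.g. the `dim_256` table.
model = SentenceTransformer("tessimago/bge-base-financial-matryoshka", truncate_dim=256)

# Placeholder evaluation data: query id -> query text, doc id -> doc text,
# and the set of relevant doc ids for each query.
queries = {
    "q1": "What was the total cash and cash equivalents held by the company as of January 28, 2024?",
    "q2": "What section follows Item 8 in the document?",
}
corpus = {
    "d1": "As of January 28, 2024, we held cash and cash equivalents of $2.2 billion.",
    "d2": "Item 8, titled 'Financial Statements and Supplementary Data,' is followed by an index to these sections.",
}
relevant_docs = {"q1": {"d1"}, "q2": {"d2"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="dim_256",
)
results = evaluator(model)
print(results)  # accuracy/precision/recall@k, NDCG@10, MRR@10 and MAP@100, keyed by the evaluator name
```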
## Training Details

### Training Dataset

#### json

* Dataset: json
* Size: 6,300 training samples
* Columns: `positive` and `anchor`
* Approximate statistics based on the first 1000 samples:

  |         | positive | anchor |
  |:--------|:---------|:-------|
  | type    | string   | string |
  | details |          |        |

* Samples:

  | positive | anchor |
  |:---------|:-------|
  | As of January 28, 2024, we held cash and cash equivalents of $2.2 billion. | What was the total cash and cash equivalents held by the company as of January 28, 2024? |
  | Net cash used in financing activities amounted to $1,600 million in fiscal year 2023. | What was the total net cash used in financing activities in fiscal year 2023? |
  | Item 8, titled 'Financial Statements and Supplementary Data,' is followed by an index to these sections. | What section follows Item 8 in the document? |

* Loss: [MatryoshkaLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:

  ```json
  {
      "loss": "MultipleNegativesRankingLoss",
      "matryoshka_dims": [768, 512, 256, 128, 64],
      "matryoshka_weights": [1, 1, 1, 1, 1],
      "n_dims_per_step": -1
  }
  ```
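In code, this configuration corresponds roughly to wrapping `MultipleNegativesRankingLoss` in `MatryoshkaLoss`. The snippet below is a sketch, not the exact training script used for this model:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# In-batch negatives ranking loss over (anchor, positive) pairs...
base_loss = MultipleNegativesRankingLoss(model)

# ...applied at every Matryoshka dimension; with matryoshka_weights left at its default,
# each dimension contributes with weight 1, matching the parameters above.
loss = MatryoshkaLoss(
    model,
    loss=base_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
)
```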
### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `bf16`: True
- `tf32`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates

(A training-arguments sketch based on these values follows the full hyperparameter list below.)

#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 16
- `eval_accumulation_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 4
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: True
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
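As a sketch of how the non-default hyperparameters above fit together with the dataset and loss described earlier, assuming Sentence Transformers 3.x: the data file, output path, evaluation split, and save strategy are assumptions, since they are not spelled out in this card.

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# Placeholder: a JSON file with "anchor" and "positive" columns, as described above.
dataset = load_dataset("json", data_files="train.json", split="train").train_test_split(test_size=0.1)
train_dataset, eval_dataset = dataset["train"], dataset["test"]

loss = MatryoshkaLoss(model, MultipleNegativesRankingLoss(model), matryoshka_dims=[768, 512, 256, 128, 64])

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-financial-matryoshka",  # placeholder output path
    num_train_epochs=4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=True,
    optim="adamw_torch_fused",
    eval_strategy="epoch",
    save_strategy="epoch",  # assumed: required so load_best_model_at_end can pick a checkpoint per epoch
    load_best_model_at_end=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate anchors acting as false negatives in a batch
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
```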
### Training Logs

| Epoch      | Step   | Training Loss | dim_128_cosine_map@100 | dim_256_cosine_map@100 | dim_512_cosine_map@100 | dim_64_cosine_map@100 | dim_768_cosine_map@100 |
|:----------:|:------:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|:----------------------:|
| 0.8122     | 10     | 1.5849        | -                      | -                      | -                      | -                     | -                      |
| 0.9746     | 12     | -             | 0.7610                 | 0.7799                 | 0.7878                 | 0.7254                | 0.7922                 |
| 1.6244     | 20     | 0.6368        | -                      | -                      | -                      | -                     | -                      |
| 1.9492     | 24     | -             | 0.7823                 | 0.7974                 | 0.8047                 | 0.7515                | 0.8046                 |
| 2.4365     | 30     | 0.4976        | -                      | -                      | -                      | -                     | -                      |
| **2.9239** | **36** | **-**         | **0.7876**             | **0.803**              | **0.8096**             | **0.754**             | **0.8081**             |
| 3.2487     | 40     | 0.3845        | -                      | -                      | -                      | -                     | -                      |
| 3.8985     | 48     | -             | 0.7876                 | 0.8028                 | 0.8101                 | 0.7557                | 0.8103                 |

* The bold row denotes the saved checkpoint.

### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.1.0
- Transformers: 4.41.2
- PyTorch: 2.1.2+cu121
- Accelerate: 0.34.2
- Datasets: 2.19.1
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MatryoshkaLoss
```bibtex
@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```