SentenceTransformer based on BAAI/bge-m3

This is a sentence-transformers model finetuned from BAAI/bge-m3. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-m3
  • Maximum Sequence Length: 8192 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: XLMRobertaModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
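
Concretely: the XLM-RoBERTa backbone produces one 1024-dimensional embedding per token, the Pooling module keeps only the first (CLS) token (pooling_mode_cls_token: True), and Normalize() rescales each vector to unit length, so the dot product of two embeddings equals their cosine similarity. A minimal sketch of the same pipeline using plain transformers (illustrative only; SentenceTransformer performs these steps for you):

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

repo = "justmeomeo1/anime_query_embedding_model"
tokenizer = AutoTokenizer.from_pretrained(repo)
backbone = AutoModel.from_pretrained(repo)

batch = tokenizer(["an anime query"], padding=True, truncation=True,
                  max_length=8192, return_tensors="pt")
with torch.no_grad():
    token_embeddings = backbone(**batch).last_hidden_state  # (batch, seq_len, 1024)

# (1) Pooling with pooling_mode_cls_token=True: keep the first (CLS) token.
sentence_embedding = token_embeddings[:, 0]
# (2) Normalize(): unit-length vectors, so dot product == cosine similarity.
sentence_embedding = F.normalize(sentence_embedding, p=2, dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 1024])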

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("justmeomeo1/anime_query_embedding_model")
# Run inference
sentences = [
    "The Eto Rangers ride in Space-Time Transmitting Machine Kirinda to repair the Novel Worlds of Mugen. The Eto Rangers themselves are anthropomorphic animals, each representing one of the 12 Chinese zodiac animals (and The Twelve Branches in Buddhism). In Japan they are known as the Eto animals. The Novel Worlds are stories created by the human imagination, such as old folk tales as well as newer books. These living worlds play out repeatedly, and are necessary for the good of humanity. Princess Aura rules Mugen, which is an island continent hanging over an ocean from the skybound Novel Pole. The Great God Goal gives her power, but she may never leave the small palace island area. Their nemesis is the forgotten Spirit of Cats, Nyanma (real name, Chocolat), who was excluded from the choice of 12 protectors by being disqualified from the race (in actual legend there are many versions of this story). She seeks her revenge by distorting Novel Worlds with Jarei Monsters.When a Jarei Monster goes to a Novel World they alter the story, turning it into a different version, sometimes a parody of itself. Bakumaru, the Spirit of Mice, must use the Genmakyou mirror to reveal (Jarei Shouran!) the evil spirit once enough clues lead to its identity. Often the Eto Rangers must play out some of the story to find out who or what it is, sometimes even taking on the role of one of the characters in the tale. After defeating the creature, Kirinda is called when Bakumaru holds up his hand with one of Aura's 12 gems on it and calls out Daikourin Kirinda! Kirinda descends from a dimensional slit and purifies the evil spirit with a beam weapon, calling out Jouka! (Purification).(Source: Wikipedia)",
    'Eto Rangers, Novel Worlds, Mugen adventure',
    'Olympic Paralympic animated promotional styles',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
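
Since the training pairs map long synopses to short queries, a common pattern is to embed a corpus of synopses once and rank it against an incoming query. A small sketch (the corpus strings below are made-up placeholders, not taken from the training data):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("justmeomeo1/anime_query_embedding_model")

# Hypothetical corpus of synopses; replace with your own documents.
corpus = [
    "A young pilot joins a space crew to defend Earth from invaders.",
    "Anthropomorphic zodiac animals repair distorted story worlds.",
    "Two coworkers at a family restaurant slowly fall in love.",
]
corpus_embeddings = model.encode(corpus)

# Rank the corpus against a short query.
query_embedding = model.encode(["Eto Rangers, Novel Worlds, Mugen adventure"])
scores = model.similarity(query_embedding, corpus_embeddings)  # shape (1, 3)
best = scores.argmax().item()
print(corpus[best], scores[0, best].item())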

Training Details

Training Dataset

Unnamed Dataset

  • Size: 2,000 training samples
  • Columns: sentence_0 and sentence_1
  • Approximate statistics based on the first 1000 samples:
    • sentence_0: string; min 27, mean 136.75, max 440 tokens
    • sentence_1: string; min 6, mean 12.38, max 43 tokens
  • Samples (sentence_0 → sentence_1):
    • "Music video for the song Kingyo no John by Hotaru Light Hill's Band that was featured on NHK's Minna no Uta program." → "Hotaru Light Hill's Band Kingyo no John NHK Minna no Uta music video"
    • "Prologue to the second game in the Generation of Chaos RPG series. Ellile is a knight-in-training in the Kingdom of Fredbarn in the Neverland World. He and Princess Roji are secretly in love, and plan to marry. Lifile, the head of the knights, finds out, and tells Ellile that he's not strong enough to protect Roji, let alone the Kingdom. Ellile becomes frustrated, and contemplates what to do... During the OVA, there are a few animated shorts dealing with a fourth character, Poro, her robot friend, and the GOC Next cast. (Source: ANN)" → "Generation of Chaos RPG series, knight-in-training romance"
    • "Kouji wants to work part-time, so, when he saw a cute girl with the Pia Carrot uniform, he tried to get a job there. When he's on the way, he accidentally bumped Azusa, and with some misunderstanding, they literally hate each other. What is the surprise of Kouji when he finds that she also got a job in the Pia Carrot, facing her daily. But as time passes, he will discover that 'hate' is not the appropriate word to describe his feelings... (Source: ANN)" → "Pia Carrot uniform part-time job romantic comedy"
  • Loss: MultipleNegativesRankingLoss with these parameters (a construction sketch follows this list):
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
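
For reference, a minimal sketch of how such a two-column dataset and this loss are wired together in Sentence Transformers 3.x (the actual training script is not part of this card; column names follow the table above). With MultipleNegativesRankingLoss, every other sentence_1 in a batch serves as an in-batch negative for a given sentence_0; at a batch size of 1 (see the hyperparameters below) there are no in-batch negatives, which is consistent with the constant 0.0 training loss logged below.

from datasets import Dataset
from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-m3")

# Two unnamed columns: a long synopsis (sentence_0) paired with a short query (sentence_1).
train_dataset = Dataset.from_dict({
    "sentence_0": ["Music video for the song Kingyo no John by Hotaru Light Hill's Band that was featured on NHK's Minna no Uta program."],
    "sentence_1": ["Hotaru Light Hill's Band Kingyo no John NHK Minna no Uta music video"],
})

# scale=20.0 and cosine similarity, matching the parameters listed above.
loss = MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)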
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 1
  • per_device_eval_batch_size: 1
  • num_train_epochs: 2
  • multi_dataset_batch_sampler: round_robin
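
The non-default values above map onto SentenceTransformerTrainingArguments roughly as follows (a sketch continuing the dataset/loss snippet from the Training Dataset section; output_dir is a hypothetical path):

from sentence_transformers import SentenceTransformerTrainer, SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="anime_query_embedding_model",  # hypothetical output path
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    num_train_epochs=2,
    multi_dataset_batch_sampler="round_robin",  # only relevant when training on multiple datasets
)

trainer = SentenceTransformerTrainer(
    model=model,               # from the previous snippet
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()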

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 1
  • per_device_eval_batch_size: 1
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 2
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch Step Training Loss
0.25 500 0.0
0.5 1000 0.0
0.75 1500 0.0
1.0 2000 0.0
1.25 2500 0.0
1.5 3000 0.0
1.75 3500 0.0
2.0 4000 0.0

Framework Versions

  • Python: 3.10.19
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.9.1+cu126
  • Accelerate: 0.31.0
  • Datasets: 2.20.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply}, 
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}