tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:98112
  - loss:MultipleNegativesRankingLoss
base_model: thenlper/gte-small
widget:
  - source_sentence: How does a photocell control outdoor lighting?
    sentences:
      - >-
        To solve this problem, we can use the binomial probability formula:


        P(X = k) = C(n, k) * p^k * (1-p)^(n-k)


        where:

        - P(X = k) is the probability of exactly k successes (faulty keyboards)
        in n trials (laptops produced)

        - C(n, k) is the number of combinations of n items taken k at a time (n!
        / (k!(n-k)!))

        - p is the probability of success (5% or 0.05)

        - n is the number of trials (400 laptops)

        - k is the number of successes (20 faulty keyboards)


        However, we want to find the probability of at least 20 faulty
        keyboards, so we need to find the sum of probabilities for k = 20, 21,
        22, ..., 400.


        P(X >= 20) = 1 - P(X < 20) = 1 - Σ P(X = k) for k = 0 to 19


        Now, we can calculate the probabilities for each value of k and sum them
        up:


        P(X >= 20) = 1 - Σ C(400, k) * 0.05^k * 0.95^(400-k) for k = 0 to 19


        Using a calculator or software to compute the sum, we get:


        P(X >= 20) ≈ 1 - 0.0184 = 0.9816


        So, the probability that at least 20 laptops will have a faulty keyboard
        is approximately 98.16%.
      - >-
        A photocell controls outdoor lighting by detecting the level of ambient
        light. It automatically turns the lights on when it becomes dark and off
        when it becomes light, functioning as a light-dependent switch for
        energy efficiency and convenience.
      - >-
        Glycosylation with β-N-acetylglucosamine (O-GlcNAcylation) is one of the
        most complex post-translational modifications. The cycling of O-GlcNAc
        is controlled by two enzymes: UDP-NAc transferase (OGT) and O-GlcNAcase
        (OGA). We recently reported that endothelin-1 (ET-1) augments vascular
        levels of O-GlcNAcylated proteins. Here we tested the hypothesis that
        O-GlcNAcylation contributes to the vascular effects of ET-1 via
        activation of the RhoA/Rho-kinase pathway. Incubation of vascular smooth
        muscle cells (VSMCs) with ET-1 (0.1 μM) produces a time-dependent
        increase in O-GlcNAc levels. ET-1-induced O-GlcNAcylation is not
        observed when VSMCs are previously transfected with OGT siRNA, treated
        with ST045849 (OGT inhibitor) or atrasentan (ET(A) antagonist). ET-1 as
        well as PugNAc (OGA inhibitor) augmented contractions to phenylephrine
        in endothelium-denuded rat aortas, an effect that was abolished by the
        Rho kinase inhibitor Y-27632. Incubation of VSMCs with ET-1 increased
        expression of the phosphorylated forms of myosin phosphatase target
        subunit 1 (MYPT-1), protein kinase C-potentiated protein phosphatase 1
        inhibitor protein (protein kinase C-potentiated phosphatase
        inhibitor-17), and myosin light chain (MLC) and RhoA expression and
        activity, and this effect was abolished by both OGT siRNA transfection
        or OGT inhibition and atrasentan. ET-1 also augmented expression of
        PDZ-Rho GEF (guanine nucleotide exchange factor) and p115-Rho GEF in
        VSMCs and this was prevented by OGT siRNA, ST045849, and atrasentan.
  - source_sentence: >-
      A torus has a major radius of 5 cm and a minor radius of 3 cm. Find the
      volume of the torus.
    sentences:
      - >-
        To find the Hausdorff dimension of the Koch curve, we can use the
        formula:


        Hausdorff dimension (D) = log(N) / log(1/s)


        where N is the number of self-similar pieces and s is the scaling
        factor.


        For the Koch curve, each line segment is divided into four segments,
        each of which is 1/3 the length of the original segment. Therefore, N =
        4 and s = 1/3.


        Now, we can plug these values into the formula:


        D = log(4) / log(1/3)


        D ≈ 1.2619


        So, the Hausdorff dimension of the Koch curve is approximately 1.2619.
      - >-
        To find the volume of a torus, we can use the formula:


        Volume = (π * minor_radius^2) * (2 * π * major_radius)


        where minor_radius is the minor radius of the torus and major_radius is
        the major radius of the torus.


        Given that the major radius is 5 cm and the minor radius is 3 cm, we can
        plug these values into the formula:


        Volume = (π * 3^2) * (2 * π * 5)


        Volume = (π * 9) * (10 * π)


        Volume = 90 * π^2


        The volume of the torus is 90π² ≈ 888.26 cubic centimeters.
      - >-
        The purpose of the present study was to elucidate the mechanisms of
        action mediating enhancement of basal glucose uptake in skeletal muscle
        cells by seven medicinal plant products recently identified from the
        pharmacopeia of native Canadian populations (Spoor et al., 2006).
        Activity of the major signaling pathways that regulate glucose uptake
        was assessed by western immunoblot in C2C12 muscle cells treated with
        extracts from these plant species. Effects of extracts on mitochondrial
        function were assessed by respirometry in isolated rat liver
        mitochondria. Metabolic stress induced by extracts was assessed by
        measuring ATP concentration and rate of cell medium acidification in
        C2C12 myotubes and H4IIE hepatocytes. Extracts were applied at a dose of
        15-100 microg/ml. The effect of all seven products was achieved through
        a common mechanism mediated not by the insulin signaling pathway but
        rather by the AMP-activated protein kinase (AMPK) pathway in response to
        the disruption of mitochondrial function and ensuing metabolic stress.
        Disruption of mitochondrial function occurred in the form of uncoupling
        of oxidative phosphorylation and/or inhibition of ATPsynthase. Activity
        of the AMPK pathway, in some instances comparable to that stimulated by
        4mM of the AMP-mimetic AICAR, was in several cases sustained for at
        least 18h post-treatment. Duration of metabolic stress, however, was in
        most cases in the order of 1h.
  - source_sentence: >-
      Consider the elliptic curve given by the equation $y^2=x^3-2x+5$ over the
      field of rational numbers $\mathbb{Q}$. Let $P=(1,2)$ and $Q=(-1,2)$ be
      two points on the curve. Find the equation of the line passing through $P$
      and $Q$ and show that it intersects the curve at another point $R$. Then,
      find the coordinates of the point $R$.
    sentences:
      - >-
        Fifteen novel derivatives of D-DIBOA, including aromatic ring
        modifications and the addition of side chains in positions C-2 and N-4,
        had previously been synthesised and their phytotoxicity on standard
        target species (STS) evaluated. This strategy combined steric,
        electronic, solubility and lipophilicity requirements to achieve the
        maximum phytotoxic activity. An evaluation of the bioactivity of these
        compounds on the systems Oryza sativa-Echinochloa crus-galli and
        Triticum aestivum-Avena fatua is reported here. All compounds showed
        inhibition profiles on the two species Echinochloa crus-galli (L.)
        Beauv. and Avena fatua L. The most marked effects were caused by
        6F-4Pr-D-DIBOA, 6F-4Val-D-DIBOA, 6Cl-4Pr-D-DIBOA and 6Cl-4Val-D-DIBOA.
        The IC(50) values for the systems Echinochloa crus-galli-Oryza sativa
        and Avena fatua-Triticum aestivum for all compounds were compared. The
        compound that showed the greatest selectivity for the system Echinochloa
        crus-galli-Oryza sativa was 8Cl-4Pr-D-DIBOA, which was 15 times more
        selective than the commercial herbicide propanil (Cotanil-35). With
        regard to the system Avena fatua-Triticum aestivum, the compounds that
        showed the highest selectivities were 8Cl-4Val-D-DIBOA and
        6F-4Pr-D-DIBOA. The results obtained for 6F-4Pr-D-DIBOA are of great
        interest because of the high phytotoxicity to Avena fatua (IC(50) = 6
        µM, r(2) = 0.9616).
      - >-
        To find the equation of the line passing through points $P=(1,2)$ and
        $Q=(-1,2)$, we first find the slope of the line. Since the y-coordinates
        of both points are the same, the slope is 0. Therefore, the line is
        horizontal and its equation is given by:


        $y = 2$


        Now, we want to find the point $R$ where this line intersects the
        elliptic curve $y^2 = x^3 - 2x + 5$. Since we know that $y=2$, we can
        substitute this value into the equation of the curve:


        $(2)^2 = x^3 - 2x + 5$


        Simplifying, we get:


        $4 = x^3 - 2x + 5$


        Rearranging the terms, we have:


        $x^3 - 2x + 1 = 0$


        We know that $x=1$ and $x=-1$ are solutions to this equation since they
        correspond to the points $P$ and $Q$. To find the third solution, we can
        use synthetic division or factor the polynomial. Factoring, we get:


        $(x-1)(x+1)(x-1) = 0$


        So, the third solution is $x=1$. Substituting this value back into the
        equation of the line, we find the corresponding y-coordinate:


        $y = 2$


        Thus, the third point of intersection is $R=(1,2)$. However, in the
        context of elliptic curves, we should take the "sum" of the points $P$
        and $Q$ as the negative of the third intersection point. Since
        $R=(1,2)$, the negative of this point is given by $-R=(1,-2)$.
        Therefore, the "sum" of the points $P$ and $Q$ on the elliptic curve is:


        $P + Q = -R = (1,-2)$.
      - >-
        The use of geospatial analysis may be subject to regulatory compliance
        depending on the specific application and the jurisdiction in which it
        is used. For example, the use of geospatial data for marketing purposes
        may be subject to privacy regulations, and the use of geospatial data
        for land use planning may be subject to environmental regulations. It is
        important to consult with legal counsel to ensure compliance with all
        applicable laws and regulations.
  - source_sentence: >-
      Does sLEDAI-2K Conceal Worsening in a Particular System When There Is
      Overall Improvement?
    sentences:
      - >-
        To determine whether the Systemic Lupus Erythematosus Disease Activity
        Index 2000 (SLEDAI-2K) is valid in identifying patients who had a
        clinically important overall improvement with no worsening in other
        descriptors/systems. Consecutive patients with systemic lupus
        erythematosus with active disease who attended the Lupus Clinic between
        2000 and 2012 were studied. Based on the change in the total SLEDAI-2K
        scores on last visit, patients were grouped as improved,
        flared/worsened, and unchanged. Patients showing improvement were
        evaluated for the presence of new active descriptors at last visit
        compared with baseline visit. Of the 158 patients studied, 109 patients
        had improved, 38 remained unchanged, and 11 flared/worsened at last
        visit. In the improved group, 11 patients had a new laboratory
        descriptor that was not present at baseline visit. In those 11 patients,
        this new laboratory descriptor was not clinically significant and did
        not require a change in disease management.
      - >-
        To find the dot product of two vectors using their magnitudes, angle
        between them, and trigonometry, we can use the formula:


        Dot product = |A| * |B| * cos(θ)


        where |A| and |B| are the magnitudes of the vectors, and θ is the angle
        between them.


        In this case, |A| = 5 units, |B| = 8 units, and θ = 60 degrees.


        First, we need to convert the angle from degrees to radians:


        θ = 60 * (π / 180) = π / 3 radians


        Now, we can find the dot product:


        Dot product = 5 * 8 * cos(π / 3)

        Dot product = 40 * (1/2)

        Dot product = 20


        So, the dot product of the two vectors is 20.
      - >-
        To determine if hospitals that routinely discharge patients early after
        lobectomy have increased readmissions. Hospitals are increasingly
        motivated to reduce length of stay (LOS) after lung cancer surgery, yet
        it is unclear if a routine of early discharge is associated with
        increased readmissions. The relationship between hospital discharge
        practices and readmission rates is therefore of tremendous clinical and
        financial importance. The National Cancer Database was queried for
        patients undergoing lobectomy for lung cancer from 2004 to 2013 at
        Commission on Cancer-accredited hospitals, which performed at least 25
        lobectomies in a 2-year period. Facility discharge practices were
        characterized by a facility's median LOS relative to the median LOS for
        all patients in that same time period. In all, 59,734 patients met
        inclusion criteria; 2687 (4.5%) experienced an unplanned readmission. In
        a hierarchical logistic regression model, a routine of early discharge
        (defined as a facility's tendency to discharge patients faster than the
        population median in the same time period) was not associated with
        increased risk of readmission (odds ratio 1.12, 95% confidence interval
        0.97-1.28, P = 0.12). In a risk-adjusted hospital readmission rate
        analysis, hospitals that discharged patients early did not experience
        more readmissions (P = 0.39). The lack of effect of early discharge
        practices on readmission rates was observed for both minimally invasive
        and thoracotomy approaches.
  - source_sentence: >-
      Does systemic administration of urocortin after intracerebral hemorrhage
      reduce neurological deficits and neuroinflammation in rats?
    sentences:
      - >-
        Intracerebral hemorrhage (ICH) remains a serious clinical problem
        lacking effective treatment. Urocortin (UCN), a novel anti-inflammatory
        neuropeptide, protects injured cardiomyocytes and dopaminergic neurons.
        Our preliminary studies indicate UCN alleviates ICH-induced brain injury
        when administered intracerebroventricularly (ICV). The present study
        examines the therapeutic effect of UCN on ICH-induced neurological
        deficits and neuroinflammation when administered by the more convenient
        intraperitoneal (i.p.) route. ICH was induced in male Sprague-Dawley
        rats by intrastriatal infusion of bacterial collagenase VII-S or
        autologous blood. UCN (2.5 or 25 μg/kg) was administered i.p. at 60
        minutes post-ICH. Penetration of i.p. administered fluorescently labeled
        UCN into the striatum was examined by fluorescence microscopy.
        Neurological deficits were evaluated by modified neurological severity
        score (mNSS). Brain edema was assessed using the dry/wet method.
        Blood-brain barrier (BBB) disruption was assessed using the Evans blue
        assay. Hemorrhagic volume and lesion volume were assessed by Drabkin's
        method and morphometric assay, respectively. Pro-inflammatory cytokine
        (TNF-α, IL-1β, and IL-6) expression was evaluated by enzyme-linked
        immunosorbent assay (ELISA). Microglial activation and neuronal loss
        were evaluated by immunohistochemistry. Administration of UCN reduced
        neurological deficits from 1 to 7 days post-ICH. Surprisingly, although
        a higher dose (25 μg/kg, i.p.) also reduced the functional deficits
        associated with ICH, it is significantly less effective than the lower
        dose (2.5 μg/kg, i.p.). Beneficial results with the low dose of UCN
        included a reduction in neurological deficits from 1 to 7 days post-ICH,
        as well as a reduction in brain edema, BBB disruption, lesion volume,
        microglial activation and neuronal loss 3 days post-ICH, and suppression
        of TNF-α, IL-1β, and IL-6 production 1, 3 and 7 days post-ICH.
      - >-
        A perfect number is a positive integer that is equal to the sum of its
        proper divisors (excluding itself). The first perfect numbers are 6, 28,
        496, and 8128. Perfect numbers can be generated using the formula
        2^(p-1) * (2^p - 1), where p and 2^p - 1 are both prime numbers.


        The first five (p, 2^p - 1) pairs are:

        (2, 3) - 6

        (3, 7) - 28

        (5, 31) - 496

        (7, 127) - 8128

        (13, 8191) - 33,550,336


        To find the 6th perfect number, we need to find the next prime number p
        such that 2^p - 1 is also prime. The next such pair is (17, 131071).
        Using the formula:


        2^(17-1) * (2^17 - 1) = 2^16 * 131071 = 65,536 * 131071 = 8,589,869,056


        So, the 6th perfect number is 8,589,869,056.
      - >-
        In type theory, the successor function $S$ is used to represent the next
        number in the sequence. When you apply the successor function $S$ three
        times to the number 0, you get:


        1. $S(0)$, which represents 1.

        2. $S(S(0))$, which represents 2.

        3. $S(S(S(0)))$, which represents 3.


        So, the result of applying the successor function $S$ three times to the
        number 0 in type theory is 3.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
model-index:
  - name: SentenceTransformer based on thenlper/gte-small
    results:
      - task:
          type: logging
          name: Logging
        dataset:
          name: ir eval
          type: ir-eval
        metrics:
          - type: cosine_accuracy@1
            value: 0.9291020819957809
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.9819315784646427
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9933963129413923
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9984407961111621
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.9291020819957809
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.32731052615488093
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.19867926258827848
            name: Cosine Precision@5
          - type: cosine_recall@1
            value: 0.9291020819957809
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.9819315784646427
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9933963129413923
            name: Cosine Recall@5
          - type: cosine_ndcg@10
            value: 0.9670096227619588
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.9565327512887825
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.9565967419425125
            name: Cosine Map@100

SentenceTransformer based on thenlper/gte-small

This is a sentence-transformers model finetuned from thenlper/gte-small. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: thenlper/gte-small
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
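The Pooling and Normalize modules above mean-pool the token embeddings (ignoring padding via the attention mask) and L2-normalize the result, which is why cosine similarity between embeddings reduces to a dot product. A minimal NumPy sketch of those two steps, with toy token vectors (the values are illustrative, not actual model outputs):

```python
import numpy as np

def mean_pool_and_normalize(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Mean-pool token embeddings over non-padding positions, then L2-normalize.

    token_embeddings: (seq_len, dim) array of per-token vectors.
    attention_mask:   (seq_len,) array of 1s for real tokens, 0s for padding.
    """
    mask = attention_mask[:, None].astype(float)      # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)    # sum only the real tokens
    pooled = summed / mask.sum()                      # mean over real tokens
    return pooled / np.linalg.norm(pooled)            # unit length, so dot product == cosine

# Toy example: 3 tokens (the last one is padding), dim 4
tokens = np.array([[1.0, 0.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0, 0.0],
                   [9.0, 9.0, 9.0, 9.0]])             # padding row is masked out
mask = np.array([1, 1, 0])
emb = mean_pool_and_normalize(tokens, mask)
print(round(float(np.linalg.norm(emb)), 6))           # 1.0
```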

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sucharush/gte_MNR")
# Run inference
sentences = [
    'Does systemic administration of urocortin after intracerebral hemorrhage reduce neurological deficits and neuroinflammation in rats?',
    "Intracerebral hemorrhage (ICH) remains a serious clinical problem lacking effective treatment. Urocortin (UCN), a novel anti-inflammatory neuropeptide, protects injured cardiomyocytes and dopaminergic neurons. Our preliminary studies indicate UCN alleviates ICH-induced brain injury when administered intracerebroventricularly (ICV). The present study examines the therapeutic effect of UCN on ICH-induced neurological deficits and neuroinflammation when administered by the more convenient intraperitoneal (i.p.) route. ICH was induced in male Sprague-Dawley rats by intrastriatal infusion of bacterial collagenase VII-S or autologous blood. UCN (2.5 or 25 μg/kg) was administered i.p. at 60 minutes post-ICH. Penetration of i.p. administered fluorescently labeled UCN into the striatum was examined by fluorescence microscopy. Neurological deficits were evaluated by modified neurological severity score (mNSS). Brain edema was assessed using the dry/wet method. Blood-brain barrier (BBB) disruption was assessed using the Evans blue assay. Hemorrhagic volume and lesion volume were assessed by Drabkin's method and morphometric assay, respectively. Pro-inflammatory cytokine (TNF-α, IL-1β, and IL-6) expression was evaluated by enzyme-linked immunosorbent assay (ELISA). Microglial activation and neuronal loss were evaluated by immunohistochemistry. Administration of UCN reduced neurological deficits from 1 to 7 days post-ICH. Surprisingly, although a higher dose (25 μg/kg, i.p.) also reduced the functional deficits associated with ICH, it is significantly less effective than the lower dose (2.5 μg/kg, i.p.). Beneficial results with the low dose of UCN included a reduction in neurological deficits from 1 to 7 days post-ICH, as well as a reduction in brain edema, BBB disruption, lesion volume, microglial activation and neuronal loss 3 days post-ICH, and suppression of TNF-α, IL-1β, and IL-6 production 1, 3 and 7 days post-ICH.",
    'In type theory, the successor function $S$ is used to represent the next number in the sequence. When you apply the successor function $S$ three times to the number 0, you get:\n\n1. $S(0)$, which represents 1.\n2. $S(S(0))$, which represents 2.\n3. $S(S(S(0)))$, which represents 3.\n\nSo, the result of applying the successor function $S$ three times to the number 0 in type theory is 3.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]

Evaluation

Metrics

Logging

  • Dataset: ir-eval
  • Evaluated with main.LoggingEvaluator
Metric             | Value
------------------ | ------
cosine_accuracy@1  | 0.9291
cosine_accuracy@3  | 0.9819
cosine_accuracy@5  | 0.9934
cosine_accuracy@10 | 0.9984
cosine_precision@1 | 0.9291
cosine_precision@3 | 0.3273
cosine_precision@5 | 0.1987
cosine_recall@1    | 0.9291
cosine_recall@3    | 0.9819
cosine_recall@5    | 0.9934
cosine_ndcg@10     | 0.967
cosine_mrr@10      | 0.9565
cosine_map@100     | 0.9566
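With a single relevant document per query (the setup implied here, where recall@k equals accuracy@k), these metrics follow directly from the rank of the relevant document for each query. A small illustrative sketch in plain Python (the rank values are toy data, not the actual evaluation set):

```python
def accuracy_at_k(rankings, k):
    """Fraction of queries whose relevant document appears in the top-k.

    rankings: list of 0-based ranks of the relevant document, one per query.
    """
    return sum(r < k for r in rankings) / len(rankings)

def mrr_at_k(rankings, k=10):
    """Mean reciprocal rank, counting only hits within the top-k."""
    return sum(1.0 / (r + 1) for r in rankings if r < k) / len(rankings)

# Toy example: relevant doc ranked 1st, 1st, 3rd, and 12th for four queries
ranks = [0, 0, 2, 11]
print(accuracy_at_k(ranks, 1))   # 0.5
print(accuracy_at_k(ranks, 3))   # 0.75
print(mrr_at_k(ranks))           # (1 + 1 + 1/3 + 0) / 4 ≈ 0.5833
```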

Training Details

Training Dataset

Unnamed Dataset

  • Size: 98,112 training samples
  • Columns: sentence_0 and sentence_1
  • Approximate statistics based on the first 1000 samples:
    • sentence_0: string; min: 6 tokens, mean: 44.14 tokens, max: 512 tokens
    • sentence_1: string; min: 12 tokens, mean: 321.5 tokens, max: 512 tokens
  • Samples:
    • sentence_0: Are transcobalamin II receptor polymorphisms associated with increased risk for neural tube defects?
      sentence_1: Women who have low cobalamin (vitamin B(12)) levels are at increased risk for having children with neural tube defects (NTDs). The transcobalamin II receptor (TCblR) mediates uptake of cobalamin into cells. Inherited variants in the TCblR gene as NTD risk factors were evaluated. Case-control and family-based tests of association were used to screen common variation in TCblR as genetic risk factors for NTDs in a large Irish group. A confirmatory group of NTD triads was used to test positive findings. 2 tightly linked variants associated with NTDs in a recessive model were found: TCblR rs2336573 (G220R; p(corr)=0.0080, corrected for multiple hypothesis testing) and TCblR rs9426 (p(corr)=0.0279). These variants were also associated with NTDs in a family-based test before multiple test correction (log-linear analysis of a recessive model: rs2336573 (G220R; RR=6.59, p=0.0037) and rs9426 (RR=6.71, p=0.0035)). A copy number variant distal to TCblR and two previously unreported exonic insertio...
    • sentence_0: A company produces three products: Product A, B, and C. The monthly sales figures and marketing expenses (in thousands of dollars) for each product for the last six months are given below:
      Product
    • sentence_0: Consider a basketball player who has a free-throw shooting percentage of 80%. The player attempts 10 free throws in a game. If the player makes a free throw, there is an 80% chance that they will make their next free throw attempt. If they miss a free throw, there's a 60% chance that they will make their next free throw attempt. What is the probability that the player makes exactly 7 out of their 10 free throw attempts?
      sentence_1: To solve this problem, we can use the concept of conditional probability and the binomial theorem. Let's denote the probability of making a free throw after a successful attempt as P(S) = 0.8 and the probability of making a free throw after a missed attempt as P(M) = 0.6. We need to find the probability of making exactly 7 out of 10 free throw attempts. There are multiple ways this can happen, and we need to consider all possible sequences of 7 successes (S) and 3 misses (M). We can represent these sequences as a string of S and M, for example, SSSSSSSMMM. There are C(10, 7) = 10! / (7! * 3!) = 120 ways to arrange 7 successes and 3 misses in a sequence of 10 attempts. For each of these sequences, we can calculate the probability of that specific sequence occurring and then sum up the probabilities of all sequences. Let's calculate the probability of a specific sequence. For example, consider the sequence SSSSSSSMMM. The probability of this sequence occurring is: P(SSSSSSSMMM) = P(S...
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
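With these parameters, the loss scores each sentence_0 against every sentence_1 in the batch (in-batch negatives), scales the cosine similarities by 20, and applies softmax cross-entropy with the true pair on the diagonal as the target. A minimal NumPy sketch of that computation (toy, approximately unit-length vectors, not real embeddings):

```python
import numpy as np

def mnr_loss(anchors: np.ndarray, positives: np.ndarray, scale: float = 20.0) -> float:
    """In-batch-negatives ranking loss: softmax cross-entropy over scaled
    cosine similarities, with the matching pair (the diagonal) as the target.
    Inputs are assumed L2-normalized, so the dot product is cosine similarity."""
    sims = anchors @ positives.T * scale                 # (batch, batch) similarity logits
    sims -= sims.max(axis=1, keepdims=True)              # numerical stability
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))           # -log p(correct pair)

# Toy batch of 3 (approximately) normalized 2-D embeddings
a = np.array([[1.0, 0.0], [0.0, 1.0], [0.7071, 0.7071]])
p = a.copy()                                             # perfectly matched pairs
print(mnr_loss(a, p) < mnr_loss(a, p[::-1]))             # True: matched pairs score lower
```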
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • num_train_epochs: 1
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: round_robin

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • tp_size: 0
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch  | Step | Training Loss | ir-eval_cosine_ndcg@10
------ | ---- | ------------- | ----------------------
0.1631 | 500  | 0.0634        | 0.9563
0.3262 | 1000 | 0.005         | 0.9627
0.4892 | 1500 | 0.0037        | 0.9631
0.6523 | 2000 | 0.0029        | 0.9660
0.8154 | 2500 | 0.0033        | 0.9663
0.9785 | 3000 | 0.0027        | 0.9670
1.0    | 3066 | -             | 0.9670

Framework Versions

  • Python: 3.12.8
  • Sentence Transformers: 3.4.1
  • Transformers: 4.51.3
  • PyTorch: 2.5.1+cu124
  • Accelerate: 1.3.0
  • Datasets: 3.2.0
  • Tokenizers: 0.21.0

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}