tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:98112
- loss:MultipleNegativesRankingLoss
base_model: thenlper/gte-small
widget:
- source_sentence: How does a photocell control outdoor lighting?
sentences:
- >-
To solve this problem, we can use the binomial probability formula:
P(X = k) = C(n, k) * p^k * (1-p)^(n-k)
where:
- P(X = k) is the probability of exactly k successes (faulty keyboards)
in n trials (laptops produced)
- C(n, k) is the number of combinations of n items taken k at a time (n!
/ (k!(n-k)!))
- p is the probability of success (5% or 0.05)
- n is the number of trials (400 laptops)
- k is the number of successes (20 faulty keyboards)
However, we want to find the probability of at least 20 faulty
keyboards, so we need to find the sum of probabilities for k = 20, 21,
22, ..., 400.
P(X >= 20) = 1 - P(X < 20) = 1 - Σ P(X = k) for k = 0 to 19
Now, we can calculate the probabilities for each value of k and sum them
up:
P(X >= 20) = 1 - Σ C(400, k) * 0.05^k * 0.95^(400-k) for k = 0 to 19
Using a calculator or software to compute the sum, we get:
P(X >= 20) ≈ 1 - 0.0184 = 0.9816
So, the probability that at least 20 laptops will have a faulty keyboard
is approximately 98.16%.
- >-
A photocell controls outdoor lighting by detecting the level of ambient
light. It automatically turns the lights on when it becomes dark and off
when it becomes light, functioning as a light-dependent switch for
energy efficiency and convenience.
- >-
Glycosylation with β-N-acetylglucosamine (O-GlcNAcylation) is one of the
most complex post-translational modifications. The cycling of O-GlcNAc
is controlled by two enzymes: UDP-NAc transferase (OGT) and O-GlcNAcase
(OGA). We recently reported that endothelin-1 (ET-1) augments vascular
levels of O-GlcNAcylated proteins. Here we tested the hypothesis that
O-GlcNAcylation contributes to the vascular effects of ET-1 via
activation of the RhoA/Rho-kinase pathway. Incubation of vascular smooth
muscle cells (VSMCs) with ET-1 (0.1 μM) produces a time-dependent
increase in O-GlcNAc levels. ET-1-induced O-GlcNAcylation is not
observed when VSMCs are previously transfected with OGT siRNA, treated
with ST045849 (OGT inhibitor) or atrasentan (ET(A) antagonist). ET-1 as
well as PugNAc (OGA inhibitor) augmented contractions to phenylephrine
in endothelium-denuded rat aortas, an effect that was abolished by the
Rho kinase inhibitor Y-27632. Incubation of VSMCs with ET-1 increased
expression of the phosphorylated forms of myosin phosphatase target
subunit 1 (MYPT-1), protein kinase C-potentiated protein phosphatase 1
inhibitor protein (protein kinase C-potentiated phosphatase
inhibitor-17), and myosin light chain (MLC) and RhoA expression and
activity, and this effect was abolished by both OGT siRNA transfection
or OGT inhibition and atrasentan. ET-1 also augmented expression of
PDZ-Rho GEF (guanine nucleotide exchange factor) and p115-Rho GEF in
VSMCs and this was prevented by OGT siRNA, ST045849, and atrasentan.
- source_sentence: >-
A torus has a major radius of 5 cm and a minor radius of 3 cm. Find the
volume of the torus.
sentences:
- >-
To find the Hausdorff dimension of the Koch curve, we can use the
formula:
Hausdorff dimension (D) = log(N) / log(1/s)
where N is the number of self-similar pieces and s is the scaling
factor.
For the Koch curve, each line segment is divided into four segments,
each of which is 1/3 the length of the original segment. Therefore, N =
4 and s = 1/3.
Now, we can plug these values into the formula:
D = log(4) / log(1/3)
D ≈ 1.2619
So, the Hausdorff dimension of the Koch curve is approximately 1.2619.
- >-
To find the volume of a torus, we can use the formula:
Volume = (π * minor_radius^2) * (2 * π * major_radius)
where minor_radius is the minor radius of the torus and major_radius is
the major radius of the torus.
Given that the major radius is 5 cm and the minor radius is 3 cm, we can
plug these values into the formula:
Volume = (π * 3^2) * (2 * π * 5)
Volume = (π * 9) * (10 * π)
Volume = 90 * π^2
The volume of the torus is approximately 282.74 cubic centimeters.
- >-
The purpose of the present study was to elucidate the mechanisms of
action mediating enhancement of basal glucose uptake in skeletal muscle
cells by seven medicinal plant products recently identified from the
pharmacopeia of native Canadian populations (Spoor et al., 2006).
Activity of the major signaling pathways that regulate glucose uptake
was assessed by western immunoblot in C2C12 muscle cells treated with
extracts from these plant species. Effects of extracts on mitochondrial
function were assessed by respirometry in isolated rat liver
mitochondria. Metabolic stress induced by extracts was assessed by
measuring ATP concentration and rate of cell medium acidification in
C2C12 myotubes and H4IIE hepatocytes. Extracts were applied at a dose of
15-100 microg/ml. The effect of all seven products was achieved through
a common mechanism mediated not by the insulin signaling pathway but
rather by the AMP-activated protein kinase (AMPK) pathway in response to
the disruption of mitochondrial function and ensuing metabolic stress.
Disruption of mitochondrial function occurred in the form of uncoupling
of oxidative phosphorylation and/or inhibition of ATPsynthase. Activity
of the AMPK pathway, in some instances comparable to that stimulated by
4mM of the AMP-mimetic AICAR, was in several cases sustained for at
least 18h post-treatment. Duration of metabolic stress, however, was in
most cases in the order of 1h.
- source_sentence: >-
Consider the elliptic curve given by the equation $y^2=x^3-2x+5$ over the
field of rational numbers $\mathbb{Q}$. Let $P=(1,2)$ and $Q=(-1,2)$ be
two points on the curve. Find the equation of the line passing through $P$
and $Q$ and show that it intersects the curve at another point $R$. Then,
find the coordinates of the point $R$.
sentences:
- >-
Fifteen novel derivatives of D-DIBOA, including aromatic ring
modifications and the addition of side chains in positions C-2 and N-4,
had previously been synthesised and their phytotoxicity on standard
target species (STS) evaluated. This strategy combined steric,
electronic, solubility and lipophilicity requirements to achieve the
maximum phytotoxic activity. An evaluation of the bioactivity of these
compounds on the systems Oryza sativa-Echinochloa crus-galli and
Triticum aestivum-Avena fatua is reported here. All compounds showed
inhibition profiles on the two species Echinochloa crus-galli (L.)
Beauv. and Avena fatua L. The most marked effects were caused by
6F-4Pr-D-DIBOA, 6F-4Val-D-DIBOA, 6Cl-4Pr-D-DIBOA and 6Cl-4Val-D-DIBOA.
The IC(50) values for the systems Echinochloa crus-galli-Oryza sativa
and Avena fatua-Triticum aestivum for all compounds were compared. The
compound that showed the greatest selectivity for the system Echinochloa
crus-galli-Oryza sativa was 8Cl-4Pr-D-DIBOA, which was 15 times more
selective than the commercial herbicide propanil (Cotanil-35). With
regard to the system Avena fatua-Triticum aestivum, the compounds that
showed the highest selectivities were 8Cl-4Val-D-DIBOA and
6F-4Pr-D-DIBOA. The results obtained for 6F-4Pr-D-DIBOA are of great
interest because of the high phytotoxicity to Avena fatua (IC(50) = 6
µM, r(2) = 0.9616).
- >-
To find the equation of the line passing through points $P=(1,2)$ and
$Q=(-1,2)$, we first find the slope of the line. Since the y-coordinates
of both points are the same, the slope is 0. Therefore, the line is
horizontal and its equation is given by:
$y = 2$
Now, we want to find the point $R$ where this line intersects the
elliptic curve $y^2 = x^3 - 2x + 5$. Since we know that $y=2$, we can
substitute this value into the equation of the curve:
$(2)^2 = x^3 - 2x + 5$
Simplifying, we get:
$4 = x^3 - 2x + 5$
Rearranging the terms, we have:
$x^3 - 2x + 1 = 0$
We know that $x=1$ and $x=-1$ are solutions to this equation since they
correspond to the points $P$ and $Q$. To find the third solution, we can
use synthetic division or factor the polynomial. Factoring, we get:
$(x-1)(x+1)(x-1) = 0$
So, the third solution is $x=1$. Substituting this value back into the
equation of the line, we find the corresponding y-coordinate:
$y = 2$
Thus, the third point of intersection is $R=(1,2)$. However, in the
context of elliptic curves, we should take the "sum" of the points $P$
and $Q$ as the negative of the third intersection point. Since
$R=(1,2)$, the negative of this point is given by $-R=(1,-2)$.
Therefore, the "sum" of the points $P$ and $Q$ on the elliptic curve is:
$P + Q = -R = (1,-2)$.
- >-
The use of geospatial analysis may be subject to regulatory compliance
depending on the specific application and the jurisdiction in which it
is used. For example, the use of geospatial data for marketing purposes
may be subject to privacy regulations, and the use of geospatial data
for land use planning may be subject to environmental regulations. It is
important to consult with legal counsel to ensure compliance with all
applicable laws and regulations.
- source_sentence: >-
Does sLEDAI-2K Conceal Worsening in a Particular System When There Is
Overall Improvement?
sentences:
- >-
To determine whether the Systemic Lupus Erythematosus Disease Activity
Index 2000 (SLEDAI-2K) is valid in identifying patients who had a
clinically important overall improvement with no worsening in other
descriptors/systems. Consecutive patients with systemic lupus
erythematosus with active disease who attended the Lupus Clinic between
2000 and 2012 were studied. Based on the change in the total SLEDAI-2K
scores on last visit, patients were grouped as improved,
flared/worsened, and unchanged. Patients showing improvement were
evaluated for the presence of new active descriptors at last visit
compared with baseline visit. Of the 158 patients studied, 109 patients
had improved, 38 remained unchanged, and 11 flared/worsened at last
visit. In the improved group, 11 patients had a new laboratory
descriptor that was not present at baseline visit. In those 11 patients,
this new laboratory descriptor was not clinically significant and did
not require a change in disease management.
- >-
To find the dot product of two vectors using their magnitudes, angle
between them, and trigonometry, we can use the formula:
Dot product = |A| * |B| * cos(θ)
where |A| and |B| are the magnitudes of the vectors, and θ is the angle
between them.
In this case, |A| = 5 units, |B| = 8 units, and θ = 60 degrees.
First, we need to convert the angle from degrees to radians:
θ = 60 * (π / 180) = π / 3 radians
Now, we can find the dot product:
Dot product = 5 * 8 * cos(π / 3)
Dot product = 40 * (1/2)
Dot product = 20
So, the dot product of the two vectors is 20.
- >-
To determine if hospitals that routinely discharge patients early after
lobectomy have increased readmissions. Hospitals are increasingly
motivated to reduce length of stay (LOS) after lung cancer surgery, yet
it is unclear if a routine of early discharge is associated with
increased readmissions. The relationship between hospital discharge
practices and readmission rates is therefore of tremendous clinical and
financial importance. The National Cancer Database was queried for
patients undergoing lobectomy for lung cancer from 2004 to 2013 at
Commission on Cancer-accredited hospitals, which performed at least 25
lobectomies in a 2-year period. Facility discharge practices were
characterized by a facility's median LOS relative to the median LOS for
all patients in that same time period. In all, 59,734 patients met
inclusion criteria; 2687 (4.5%) experienced an unplanned readmission. In
a hierarchical logistic regression model, a routine of early discharge
(defined as a facility's tendency to discharge patients faster than the
population median in the same time period) was not associated with
increased risk of readmission (odds ratio 1.12, 95% confidence interval
0.97-1.28, P = 0.12). In a risk-adjusted hospital readmission rate
analysis, hospitals that discharged patients early did not experience
more readmissions (P = 0.39). The lack of effect of early discharge
practices on readmission rates was observed for both minimally invasive
and thoracotomy approaches.
- source_sentence: >-
Does systemic administration of urocortin after intracerebral hemorrhage
reduce neurological deficits and neuroinflammation in rats?
sentences:
- >-
Intracerebral hemorrhage (ICH) remains a serious clinical problem
lacking effective treatment. Urocortin (UCN), a novel anti-inflammatory
neuropeptide, protects injured cardiomyocytes and dopaminergic neurons.
Our preliminary studies indicate UCN alleviates ICH-induced brain injury
when administered intracerebroventricularly (ICV). The present study
examines the therapeutic effect of UCN on ICH-induced neurological
deficits and neuroinflammation when administered by the more convenient
intraperitoneal (i.p.) route. ICH was induced in male Sprague-Dawley
rats by intrastriatal infusion of bacterial collagenase VII-S or
autologous blood. UCN (2.5 or 25 μg/kg) was administered i.p. at 60
minutes post-ICH. Penetration of i.p. administered fluorescently labeled
UCN into the striatum was examined by fluorescence microscopy.
Neurological deficits were evaluated by modified neurological severity
score (mNSS). Brain edema was assessed using the dry/wet method.
Blood-brain barrier (BBB) disruption was assessed using the Evans blue
assay. Hemorrhagic volume and lesion volume were assessed by Drabkin's
method and morphometric assay, respectively. Pro-inflammatory cytokine
(TNF-α, IL-1β, and IL-6) expression was evaluated by enzyme-linked
immunosorbent assay (ELISA). Microglial activation and neuronal loss
were evaluated by immunohistochemistry. Administration of UCN reduced
neurological deficits from 1 to 7 days post-ICH. Surprisingly, although
a higher dose (25 μg/kg, i.p.) also reduced the functional deficits
associated with ICH, it is significantly less effective than the lower
dose (2.5 μg/kg, i.p.). Beneficial results with the low dose of UCN
included a reduction in neurological deficits from 1 to 7 days post-ICH,
as well as a reduction in brain edema, BBB disruption, lesion volume,
microglial activation and neuronal loss 3 days post-ICH, and suppression
of TNF-α, IL-1β, and IL-6 production 1, 3 and 7 days post-ICH.
- >-
A perfect number is a positive integer that is equal to the sum of its
proper divisors (excluding itself). The first perfect numbers are 6, 28,
496, and 8128. Perfect numbers can be generated using the formula
2^(p-1) * (2^p - 1), where p and 2^p - 1 are both prime numbers.
The first five (p, 2^p - 1) pairs are:
(2, 3) - 6
(3, 7) - 28
(5, 31) - 496
(7, 127) - 8128
(13, 8191) - 33,550,336
To find the 6th perfect number, we need to find the next prime number p
such that 2^p - 1 is also prime. The next such pair is (17, 131071).
Using the formula:
2^(17-1) * (2^17 - 1) = 2^16 * 131071 = 65,536 * 131071 = 8,589,869,056
So, the 6th perfect number is 8,589,869,056.
- >-
In type theory, the successor function $S$ is used to represent the next
number in the sequence. When you apply the successor function $S$ three
times to the number 0, you get:
1. $S(0)$, which represents 1.
2. $S(S(0))$, which represents 2.
3. $S(S(S(0)))$, which represents 3.
So, the result of applying the successor function $S$ three times to the
number 0 in type theory is 3.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on thenlper/gte-small
results:
- task:
type: logging
name: Logging
dataset:
name: ir eval
type: ir-eval
metrics:
- type: cosine_accuracy@1
value: 0.9291020819957809
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.9819315784646427
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.9933963129413923
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9984407961111621
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.9291020819957809
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.32731052615488093
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.19867926258827848
name: Cosine Precision@5
- type: cosine_recall@1
value: 0.9291020819957809
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.9819315784646427
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.9933963129413923
name: Cosine Recall@5
- type: cosine_ndcg@10
value: 0.9670096227619588
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.9565327512887825
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.9565967419425125
name: Cosine Map@100
SentenceTransformer based on thenlper/gte-small
This is a sentence-transformers model finetuned from thenlper/gte-small. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: thenlper/gte-small
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 384 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
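The flow through these three modules can be illustrated with a small numpy sketch. The token embeddings below are random placeholders, not real BertModel outputs; this mirrors the mean pooling and L2 normalization configured above, not the library's internal code:

```python
import numpy as np

# Stand-in for BertModel token outputs: (num_tokens, 384), random placeholders
rng = np.random.default_rng(0)
token_embeddings = rng.normal(size=(10, 384))
attention_mask = np.array([1] * 8 + [0] * 2)  # last two tokens are padding

# (1) Pooling with pooling_mode_mean_tokens=True: average the non-padding tokens
masked = token_embeddings * attention_mask[:, None]
sentence_embedding = masked.sum(axis=0) / attention_mask.sum()

# (2) Normalize(): divide by the L2 norm, so a plain dot product between two
# sentence embeddings equals their cosine similarity
sentence_embedding = sentence_embedding / np.linalg.norm(sentence_embedding)

print(sentence_embedding.shape)  # (384,)
```

Because of the final Normalize() step, downstream cosine similarity can be computed as a simple matrix product of the embeddings.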
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sucharush/gte_MNR")
# Run inference
sentences = [
    'Does systemic administration of urocortin after intracerebral hemorrhage reduce neurological deficits and neuroinflammation in rats?',
    "Intracerebral hemorrhage (ICH) remains a serious clinical problem lacking effective treatment. Urocortin (UCN), a novel anti-inflammatory neuropeptide, protects injured cardiomyocytes and dopaminergic neurons. Our preliminary studies indicate UCN alleviates ICH-induced brain injury when administered intracerebroventricularly (ICV). The present study examines the therapeutic effect of UCN on ICH-induced neurological deficits and neuroinflammation when administered by the more convenient intraperitoneal (i.p.) route. ICH was induced in male Sprague-Dawley rats by intrastriatal infusion of bacterial collagenase VII-S or autologous blood. UCN (2.5 or 25 μg/kg) was administered i.p. at 60 minutes post-ICH. Penetration of i.p. administered fluorescently labeled UCN into the striatum was examined by fluorescence microscopy. Neurological deficits were evaluated by modified neurological severity score (mNSS). Brain edema was assessed using the dry/wet method. Blood-brain barrier (BBB) disruption was assessed using the Evans blue assay. Hemorrhagic volume and lesion volume were assessed by Drabkin's method and morphometric assay, respectively. Pro-inflammatory cytokine (TNF-α, IL-1β, and IL-6) expression was evaluated by enzyme-linked immunosorbent assay (ELISA). Microglial activation and neuronal loss were evaluated by immunohistochemistry. Administration of UCN reduced neurological deficits from 1 to 7 days post-ICH. Surprisingly, although a higher dose (25 μg/kg, i.p.) also reduced the functional deficits associated with ICH, it is significantly less effective than the lower dose (2.5 μg/kg, i.p.). Beneficial results with the low dose of UCN included a reduction in neurological deficits from 1 to 7 days post-ICH, as well as a reduction in brain edema, BBB disruption, lesion volume, microglial activation and neuronal loss 3 days post-ICH, and suppression of TNF-α, IL-1β, and IL-6 production 1, 3 and 7 days post-ICH.",
    'In type theory, the successor function $S$ is used to represent the next number in the sequence. When you apply the successor function $S$ three times to the number 0, you get:\n\n1. $S(0)$, which represents 1.\n2. $S(S(0))$, which represents 2.\n3. $S(S(S(0)))$, which represents 3.\n\nSo, the result of applying the successor function $S$ three times to the number 0 in type theory is 3.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
Evaluation
Metrics
Logging
- Dataset: ir-eval
- Evaluated with main.LoggingEvaluator
| Metric | Value |
|---|---|
| cosine_accuracy@1 | 0.9291 |
| cosine_accuracy@3 | 0.9819 |
| cosine_accuracy@5 | 0.9934 |
| cosine_accuracy@10 | 0.9984 |
| cosine_precision@1 | 0.9291 |
| cosine_precision@3 | 0.3273 |
| cosine_precision@5 | 0.1987 |
| cosine_recall@1 | 0.9291 |
| cosine_recall@3 | 0.9819 |
| cosine_recall@5 | 0.9934 |
| cosine_ndcg@10 | 0.967 |
| cosine_mrr@10 | 0.9565 |
| cosine_map@100 | 0.9566 |
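As a reading aid: accuracy@k is the fraction of queries whose relevant document appears among the top-k ranked results; with a single relevant document per query it coincides with recall@k, which is why those rows match above. A minimal numpy sketch of the computation, using a toy similarity matrix rather than the actual evaluation data:

```python
import numpy as np

# similarities[i, j]: cosine similarity of query i and document j.
# By construction, document i is the single relevant document for query i.
similarities = np.array([
    [0.9, 0.2, 0.1],
    [0.3, 0.1, 0.8],  # query 1's relevant document is only ranked 2nd
    [0.2, 0.3, 0.7],
])

def accuracy_at_k(sims: np.ndarray, k: int) -> float:
    top_k = np.argsort(-sims, axis=1)[:, :k]      # top-k document ids per query
    relevant = np.arange(sims.shape[0])[:, None]  # relevant doc id per query
    return float((top_k == relevant).any(axis=1).mean())

print(accuracy_at_k(similarities, 1))  # 2 of 3 queries hit at rank 1
print(accuracy_at_k(similarities, 3))  # 1.0
```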
Training Details
Training Dataset
Unnamed Dataset
- Size: 98,112 training samples
- Columns: sentence_0 and sentence_1
- Approximate statistics based on the first 1000 samples:

| | sentence_0 | sentence_1 |
|---|---|---|
| type | string | string |
| details | min: 6 tokens, mean: 44.14 tokens, max: 512 tokens | min: 12 tokens, mean: 321.5 tokens, max: 512 tokens |
- Samples:

| sentence_0 | sentence_1 |
|---|---|
| Are transcobalamin II receptor polymorphisms associated with increased risk for neural tube defects? | Women who have low cobalamin (vitamin B(12)) levels are at increased risk for having children with neural tube defects (NTDs). The transcobalamin II receptor (TCblR) mediates uptake of cobalamin into cells. Inherited variants in the TCblR gene as NTD risk factors were evaluated. Case-control and family-based tests of association were used to screen common variation in TCblR as genetic risk factors for NTDs in a large Irish group. A confirmatory group of NTD triads was used to test positive findings. 2 tightly linked variants associated with NTDs in a recessive model were found: TCblR rs2336573 (G220R; p(corr)=0.0080, corrected for multiple hypothesis testing) and TCblR rs9426 (p(corr)=0.0279). These variants were also associated with NTDs in a family-based test before multiple test correction (log-linear analysis of a recessive model: rs2336573 (G220R; RR=6.59, p=0.0037) and rs9426 (RR=6.71, p=0.0035)). A copy number variant distal to TCblR and two previously unreported exonic insertio... |
| A company produces three products: Product A, B, and C. The monthly sales figures and marketing expenses (in thousands of dollars) for each product for the last six months are given below: Product | |
| Consider a basketball player who has a free-throw shooting percentage of 80%. The player attempts 10 free throws in a game. If the player makes a free throw, there is an 80% chance that they will make their next free throw attempt. If they miss a free throw, there's a 60% chance that they will make their next free throw attempt. What is the probability that the player makes exactly 7 out of their 10 free throw attempts? | To solve this problem, we can use the concept of conditional probability and the binomial theorem. Let's denote the probability of making a free throw after a successful attempt as P(S) = 0.8 and the probability of making a free throw after a missed attempt as P(M) = 0.6. We need to find the probability of making exactly 7 out of 10 free throw attempts. There are multiple ways this can happen, and we need to consider all possible sequences of 7 successes (S) and 3 misses (M). We can represent these sequences as a string of S and M, for example, SSSSSSSMMM. There are C(10, 7) = 10! / (7! * 3!) = 120 ways to arrange 7 successes and 3 misses in a sequence of 10 attempts. For each of these sequences, we can calculate the probability of that specific sequence occurring and then sum up the probabilities of all sequences. Let's calculate the probability of a specific sequence. For example, consider the sequence SSSSSSSMMM. The probability of this sequence occurring is: P(SSSSSSSMMM) = P(S... |

- Loss: MultipleNegativesRankingLoss with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim"
}
```
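Conceptually, MultipleNegativesRankingLoss treats passage i as the positive for query i and every other passage in the same batch as a negative, then applies softmax cross-entropy over scaled cosine similarities. A rough numpy sketch of that formula, using random vectors as stand-ins for embeddings (the actual implementation lives in sentence_transformers.losses):

```python
import numpy as np

def mnr_loss(queries: np.ndarray, passages: np.ndarray, scale: float = 20.0) -> float:
    """In-batch-negatives loss: passage i is the positive for query i,
    all other passages in the batch act as negatives."""
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    p = passages / np.linalg.norm(passages, axis=1, keepdims=True)
    logits = scale * (q @ p.T)                   # scaled cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.diag(log_probs).mean())     # cross-entropy with target j = i

rng = np.random.default_rng(0)
queries = rng.normal(size=(32, 384))
passages = rng.normal(size=(32, 384))

# Random pairs give a high loss; perfectly aligned pairs give a near-zero loss.
print(mnr_loss(queries, passages))
print(mnr_loss(queries, queries))
```

This is also why the batch_sampler is set to no_duplicates below: a duplicate query or passage in the batch would act as a false negative.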
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 32
- num_train_epochs: 1
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: round_robin
All Hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 32
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- tp_size: 0
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: round_robin
Training Logs
| Epoch | Step | Training Loss | ir-eval_cosine_ndcg@10 |
|---|---|---|---|
| 0.1631 | 500 | 0.0634 | 0.9563 |
| 0.3262 | 1000 | 0.005 | 0.9627 |
| 0.4892 | 1500 | 0.0037 | 0.9631 |
| 0.6523 | 2000 | 0.0029 | 0.9660 |
| 0.8154 | 2500 | 0.0033 | 0.9663 |
| 0.9785 | 3000 | 0.0027 | 0.9670 |
| 1.0 | 3066 | - | 0.9670 |
Framework Versions
- Python: 3.12.8
- Sentence Transformers: 3.4.1
- Transformers: 4.51.3
- PyTorch: 2.5.1+cu124
- Accelerate: 1.3.0
- Datasets: 3.2.0
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```