---
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:218496
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: "when dividing involving a multiple of 10, gives an answer 10 times\
\ bigger than it should be\n\ndivide decimals by 10(multiplying and dividing with\
\ decimals).\nquestion: 43.2 \\div 10= \ncorrect answer: 4.32 \nincorrect answer:\
\ 33.2"
sentences:
- Does not recognise that a shape translated would not change orientation
- Thinks you can find missing values in a given table by treating the row as linear
and adding on the difference between the first two values given.
- Subtracts instead of divides
- source_sentence: "incorrectly cancels what they believe is a factor in algebraic\
\ fractions\n\nsimplify an algebraic fraction by factorising the numerator(simplifying\
\ algebraic fractions).\nquestion: simplify the following, if possible: \\frac{m^{2}+2\
\ m-3}{m-3} \ncorrect answer: does not simplify\nincorrect answer: m+1"
sentences:
- Does not know units of area should be squared
- Thinks all lines on a net will form edges in 3D
- 'Does not know that to factorise a quadratic expression, to find two numbers that
add to give the coefficient of the x term, and multiply to give the non variable
term
'
- source_sentence: "believes that the order of operations does not affect the answer\
\ to a calculation\n\nuse the order of operations to carry out calculations involving\
\ powers(bidmas).\nquestion: \\[\n3 \\times 2+4-5\n\\]\nwhere do the brackets\
\ need to go to make the answer equal 13 ?\ncorrect answer: 3 \\times(2+4)-5 \n\
incorrect answer: does not need brackets"
sentences:
- Thinks that when you cancel identical terms from the numerator and denominator,
they just disappear
- Believes both the x and y co-ordinates of the x-intercept of a quadratic are derived
from the constants in the factorised form.
- 'Confuses the order of operations, believes addition comes before multiplication '
- source_sentence: "believes that the order of operations does not affect the answer\
\ to a calculation\n\nuse the order of operations to carry out calculations involving\
\ powers(bidmas).\nquestion: \\[\n3 \\times 2+4-5\n\\]\nwhere do the brackets\
\ need to go to make the answer equal 13 ?\ncorrect answer: 3 \\times(2+4)-5 \n\
incorrect answer: does not need brackets"
sentences:
- 'Confuses the order of operations, believes addition comes before multiplication '
- Does not recognise the properties of a kite
- 'Confuses the order of operations, believes addition comes before multiplication '
- source_sentence: "believes percentages cannot be converted into fractions\n\nconvert\
\ two digit integer percentages to fractions(converting between fractions and\
\ percentages).\nquestion: convert this percentage to a fraction\n 62 \\% \ncorrect\
\ answer: \\frac{31}{50} \nincorrect answer: none of these"
sentences:
- Believes the gradients of perpendicular lines are reciprocals of the same sign
- Does not know the properties of a rectangle
- Does not understand a percentage is out of 100
---
# SentenceTransformer
This is a [sentence-transformers](https://www.SBERT.net) model trained on 218,496 samples. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co./models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
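For reference, an equivalent module stack can be assembled by hand with the `models` submodule. This is a sketch only: the base checkpoint name below is a placeholder for a 1024-dimensional BERT-style encoder, not necessarily the checkpoint this model was trained from.

```python
from sentence_transformers import SentenceTransformer, models

# Placeholder base checkpoint (assumption); any 1024-dim BERT-style model fits the config above.
word_embedding_model = models.Transformer(
    "bert-large-uncased", max_seq_length=512, do_lower_case=True
)
# Pool the [CLS] token, matching pooling_mode_cls_token=True in the architecture above.
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(), pooling_mode="cls"
)
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
```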
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Gurveer05/gte-base-eedi-2024")
# Run inference
sentences = [
'believes percentages cannot be converted into fractions\n\nconvert two digit integer percentages to fractions(converting between fractions and percentages).\nquestion: convert this percentage to a fraction\n 62 \\% \ncorrect answer: \\frac{31}{50} \nincorrect answer: none of these',
'Does not understand a percentage is out of 100',
'Believes the gradients of perpendicular lines are reciprocals of the same sign',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 1024)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
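Since the training pairs match a question/incorrect-answer description against misconception statements, a typical use is ranking a bank of candidate misconceptions against a query. A minimal sketch continuing the snippet above (the three example sentences stand in for a full misconception bank):

```python
query = sentences[0]
candidates = sentences[1:]

query_embedding = model.encode([query])
candidate_embeddings = model.encode(candidates)

# Cosine-similarity scores, shape (1, len(candidates)); higher means a better match.
scores = model.similarity(query_embedding, candidate_embeddings)
best = scores.argmax().item()
# Expected to surface the ground-truth misconception for this query.
print(candidates[best])
```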
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 218,496 training samples
* Columns: <code>FullText</code>, <code>GroundTruthMisconception</code>, and <code>PredictMisconception</code>
* Approximate statistics based on the first 1000 samples:
  |      | FullText | GroundTruthMisconception | PredictMisconception |
  |:-----|:---------|:-------------------------|:---------------------|
  | type | string   | string                   | string               |
* Samples:
  | FullText | GroundTruthMisconception | PredictMisconception |
  |:---------|:-------------------------|:---------------------|
  | <code>believes that the order of operations does not affect the answer to a calculation<br><br>use the order of operations to carry out calculations involving powers(bidmas).<br>question: \[<br>3 \times 2+4-5<br>\]<br>where do the brackets need to go to make the answer equal 13 ?<br>correct answer: 3 \times(2+4)-5<br>incorrect answer: does not need brackets</code> | <code>Confuses the order of operations, believes addition comes before multiplication</code> | <code>Believes infinite gradient is not possible in real life.</code> |
  | <code>believes that the order of operations does not affect the answer to a calculation<br><br>use the order of operations to carry out calculations involving powers(bidmas).<br>question: \[<br>3 \times 2+4-5<br>\]<br>where do the brackets need to go to make the answer equal 13 ?<br>correct answer: 3 \times(2+4)-5<br>incorrect answer: does not need brackets</code> | <code>Confuses the order of operations, believes addition comes before multiplication</code> | <code>Struggles to draw 3D shapes on isometric paper</code> |
  | <code>believes that the order of operations does not affect the answer to a calculation<br><br>use the order of operations to carry out calculations involving powers(bidmas).<br>question: \[<br>3 \times 2+4-5<br>\]<br>where do the brackets need to go to make the answer equal 13 ?<br>correct answer: 3 \times(2+4)-5<br>incorrect answer: does not need brackets</code> | <code>Confuses the order of operations, believes addition comes before multiplication</code> | <code>Believes an upward slope on a distance-time graph means travelling back towards the starting point.</code> |
* Loss: [MultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
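For reproduction, the loss can be instantiated as below. This is a sketch assuming the current `sentence-transformers` losses API; `model` is a loaded `SentenceTransformer`, and cosine similarity is already the default `similarity_fct`.

```python
from sentence_transformers import losses

# scale=20.0 and cosine similarity match the parameters listed above.
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)
```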
### Training Hyperparameters
#### Non-Default Hyperparameters
- `gradient_accumulation_steps`: 16
- `eval_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `weight_decay`: 0.01
- `num_train_epochs`: 2
- `lr_scheduler_type`: cosine_with_restarts
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates
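These map onto `SentenceTransformerTrainingArguments` (the v3+ trainer API) roughly as in the sketch below; the output directory is a placeholder, and settings the card does not list (such as batch size) are left at their defaults.

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

# Only the non-default hyperparameters listed above are set; output_dir is a placeholder.
args = SentenceTransformerTrainingArguments(
    output_dir="outputs",
    gradient_accumulation_steps=16,
    eval_accumulation_steps=16,
    learning_rate=2e-5,
    weight_decay=0.01,
    num_train_epochs=2,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```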
#### All Hyperparameters