---
base_model: unsloth/gemma-2-9b
library_name: peft
license: gemma
tags:
- unsloth
- generated_from_trainer
model-index:
- name: gemma-2-9b_metamath_default
  results: []
---

# gemma-2-9b_metamath_default

This model is a PEFT adapter fine-tuned from [unsloth/gemma-2-9b](https://huggingface.co./unsloth/gemma-2-9b). The training dataset is not recorded in the card metadata, though the model name suggests a MetaMath-style math dataset.
It achieves the following result on the evaluation set:
- Loss: 10.8086

## Model description

This repository holds a PEFT adapter for `unsloth/gemma-2-9b` (the card metadata lists `library_name: peft`); no further description of the adapter configuration is provided.

## Intended uses & limitations

More information needed
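
Since no usage guidance is recorded, below is a minimal sketch of loading this adapter on top of the base model with `peft`. The adapter path is a placeholder (the published repo id is not stated in the card), and the prompt is illustrative:

```python
# Minimal sketch: attach this PEFT adapter to the gemma-2-9b base model.
# "path/to/gemma-2-9b_metamath_default" is a placeholder; substitute the
# actual local path or Hub repo id of the adapter.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "unsloth/gemma-2-9b", torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("unsloth/gemma-2-9b")

model = PeftModel.from_pretrained(base, "path/to/gemma-2-9b_metamath_default")
model.eval()

prompt = "What is 15% of 240?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Note that the final evaluation loss reported above is very high, so outputs from this adapter may be of limited quality.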

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 1
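
For reference, these values map onto a `transformers.TrainingArguments` configuration roughly as sketched below; `output_dir` is illustrative, and the exact optimizer variant (plain Adam vs. the Trainer's default AdamW) is not recorded in the card:

```python
# Sketch of a TrainingArguments configuration matching the values listed above.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="gemma-2-9b_metamath_default",  # illustrative, not from the card
    learning_rate=3e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=32,  # 2 per device x 32 steps = 64 effective
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=1,
    adam_beta1=0.9,    # betas and epsilon as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```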

### Training results

The validation loss falls to 1.3535 by step 13 but then climbs steadily, plateauing around 11 from step 65 onward; the run appears to have diverged early in training (a learning rate of 3e-4 may be high for this setup).

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.7172        | 0.0211 | 13   | 1.3535          |
| 1.2917        | 0.0421 | 26   | 1.8068          |
| 1.992         | 0.0632 | 39   | 3.1112          |
| 2.9907        | 0.0843 | 52   | 4.2249          |
| 4.8887        | 0.1053 | 65   | 11.0938         |
| 10.1689       | 0.1264 | 78   | 10.8044         |
| 10.1275       | 0.1474 | 91   | 10.8124         |
| 10.7459       | 0.1685 | 104  | 11.7846         |
| 11.8895       | 0.1896 | 117  | 12.0598         |
| 12.0118       | 0.2106 | 130  | 11.9669         |
| 11.9758       | 0.2317 | 143  | 11.9468         |
| 11.9457       | 0.2528 | 156  | 11.9325         |
| 11.8852       | 0.2738 | 169  | 11.7855         |
| 11.8173       | 0.2949 | 182  | 11.7749         |
| 11.7659       | 0.3159 | 195  | 11.7242         |
| 11.6916       | 0.3370 | 208  | 11.6713         |
| 11.7335       | 0.3581 | 221  | 11.6871         |
| 11.6317       | 0.3791 | 234  | 11.4314         |
| 11.4129       | 0.4002 | 247  | 11.2565         |
| 11.3691       | 0.4213 | 260  | 11.3368         |
| 11.5425       | 0.4423 | 273  | 11.5908         |
| 11.5115       | 0.4634 | 286  | 11.2716         |
| 11.398        | 0.4845 | 299  | 11.3439         |
| 11.3467       | 0.5055 | 312  | 11.3315         |
| 11.2106       | 0.5266 | 325  | 11.0557         |
| 11.3123       | 0.5476 | 338  | 11.2704         |
| 11.1559       | 0.5687 | 351  | 11.0238         |
| 10.996        | 0.5898 | 364  | 11.2103         |
| 11.1641       | 0.6108 | 377  | 10.9649         |
| 11.1403       | 0.6319 | 390  | 10.9743         |
| 10.9823       | 0.6530 | 403  | 11.0703         |
| 10.9891       | 0.6740 | 416  | 10.9547         |
| 10.932        | 0.6951 | 429  | 10.8953         |
| 11.0158       | 0.7162 | 442  | 10.9839         |
| 10.8677       | 0.7372 | 455  | 10.8881         |
| 10.9204       | 0.7583 | 468  | 10.9832         |
| 10.911        | 0.7793 | 481  | 10.9743         |
| 10.8973       | 0.8004 | 494  | 10.8108         |
| 10.7819       | 0.8215 | 507  | 10.9354         |
| 10.7843       | 0.8425 | 520  | 10.8324         |
| 10.8343       | 0.8636 | 533  | 10.9296         |
| 10.7756       | 0.8847 | 546  | 10.7629         |
| 10.781        | 0.9057 | 559  | 10.7852         |
| 10.806        | 0.9268 | 572  | 10.8285         |
| 10.7984       | 0.9478 | 585  | 10.7972         |
| 10.7722       | 0.9689 | 598  | 10.7924         |
| 10.7957       | 0.9900 | 611  | 10.8086         |


### Framework versions

- PEFT 0.12.0
- Transformers 4.44.0
- PyTorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
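
A quick way to check a local environment against these versions (a convenience sketch, not part of the original card):

```python
# Print installed versions; compare with the list above.
import datasets, peft, tokenizers, torch, transformers

print("peft:", peft.__version__)                  # expect 0.12.0
print("transformers:", transformers.__version__)  # expect 4.44.0
print("torch:", torch.__version__)                # expect 2.4.0+cu121
print("datasets:", datasets.__version__)          # expect 2.20.0
print("tokenizers:", tokenizers.__version__)      # expect 0.19.1
```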