---
license: mit
base_model: microsoft/MiniLM-L12-H384-uncased
tags:
- generated_from_trainer
metrics:
- f1
- accuracy
- precision
- recall
model-index:
- name: 016-microsoft-MiniLM-finetuned-yahoo-80_20
  results: []
---

# 016-microsoft-MiniLM-finetuned-yahoo-80_20

This model is a fine-tuned version of [microsoft/MiniLM-L12-H384-uncased](https://huggingface.co./microsoft/MiniLM-L12-H384-uncased) on a dataset that is not recorded in this card; the model name indicates a Yahoo topic-classification dataset with an 80/20 train/evaluation split.
It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):
- Loss: 1.6861
- F1: 0.4657
- Accuracy: 0.5
- Precision: 0.5267
- Recall: 0.5
- System RAM Used (GB): 3.8760
- System RAM Total (GB): 83.4807
- GPU RAM Allocated (GB): 0.3991
- GPU RAM Cached (GB): 1.9316
- GPU RAM Total (GB): 39.5640
- GPU Utilization (%): 35
- Disk Space Used (GB): 24.5397
- Disk Space Total (GB): 78.1898
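
The averaging scheme behind the F1, precision, and recall values above is not recorded in this card. A minimal `compute_metrics` sketch that produces metrics of this shape, assuming weighted averaging, could look like:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score


def compute_metrics(eval_pred):
    """Turn a transformers EvalPrediction (logits, labels) into the metrics above."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # "weighted" averaging is an assumption; the card does not record the scheme used.
    return {
        "f1": f1_score(labels, preds, average="weighted"),
        "accuracy": accuracy_score(labels, preds),
        "precision": precision_score(labels, preds, average="weighted", zero_division=0),
        "recall": recall_score(labels, preds, average="weighted", zero_division=0),
    }
```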

## Model description

The base model, [microsoft/MiniLM-L12-H384-uncased](https://huggingface.co./microsoft/MiniLM-L12-H384-uncased), is a 12-layer, 384-hidden-dimension distilled Transformer with roughly 33M parameters. This checkpoint adds a sequence-classification head fine-tuned on what the model name indicates is a Yahoo topic-classification task; further details were not recorded.

## Intended uses & limitations

Intended uses were not documented. Note that the evaluation set appears to be very small (accuracy in the training log below moves only in increments of 0.05), so the reported metrics should be treated as noisy estimates rather than reliable benchmarks.
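
As a minimal inference sketch (the Hub namespace below is a placeholder, and the label names depend on the `id2label` mapping saved with the model):

```python
from transformers import pipeline

# "your-namespace/" is a placeholder; substitute the actual Hugging Face Hub namespace,
# or point `model=` at a local directory containing this checkpoint.
classifier = pipeline(
    "text-classification",
    model="your-namespace/016-microsoft-MiniLM-finetuned-yahoo-80_20",
)

print(classifier("How do I keep my houseplants alive through the winter?"))
# -> [{'label': ..., 'score': ...}]
```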

## Training and evaluation data

The datasets were not documented. The model name suggests a Yahoo topic-classification corpus split 80/20 into training and evaluation sets, and the training log (3 optimizer steps per epoch at a batch size of 32, evaluation accuracy moving in steps of 0.05) implies that both splits are very small.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding `TrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
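
A minimal sketch of how these hyperparameters map onto `transformers.TrainingArguments`; the output directory, label count, dataset contents, and evaluation cadence are assumptions or placeholders, and the Trainer's default optimizer (AdamW) matches the betas and epsilon listed above:

```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "microsoft/MiniLM-L12-H384-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# num_labels=10 assumes the ten Yahoo Answers topics; adjust to the actual label set.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=10)

# Placeholder data purely to keep the sketch self-contained; the real splits used for
# this checkpoint are not recorded in the card.
raw = Dataset.from_dict({"text": ["example question"] * 8, "label": [0] * 8})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

train_dataset = raw.map(tokenize, batched=True)
eval_dataset = train_dataset

training_args = TrainingArguments(
    output_dir="016-microsoft-MiniLM-finetuned-yahoo-80_20",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    # The log below records an evaluation every 15 steps (5 epochs); treating that as the
    # configured cadence is an assumption.
    evaluation_strategy="steps",
    eval_steps=15,
    logging_steps=15,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
    # compute_metrics from the sketch earlier in this card can be passed here as well.
)

# trainer.train()
```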

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     | Accuracy | Precision | Recall | System RAM Used (GB) | System RAM Total (GB) | GPU RAM Allocated (GB) | GPU RAM Cached (GB) | GPU RAM Total (GB) | GPU Utilization (%) | Disk Space Used (GB) | Disk Space Total (GB) |
|:-------------:|:-----:|:----:|:---------------:|:------:|:--------:|:---------:|:------:|:---------------:|:----------------:|:-----------------:|:--------------:|:-------------:|:---------------:|:---------------:|:----------------:|
| 2.3016        | 5.0   | 15   | 2.3016          | 0.0182 | 0.1      | 0.01      | 0.1    | 3.8589          | 83.4807          | 0.3990            | 1.9219         | 39.5640       | 38              | 24.5396         | 78.1898          |
| 2.2944        | 10.0  | 30   | 2.2979          | 0.0182 | 0.1      | 0.01      | 0.1    | 3.8753          | 83.4807          | 0.3991            | 1.9219         | 39.5640       | 36              | 24.5396         | 78.1898          |
| 2.2693        | 15.0  | 45   | 2.2696          | 0.2030 | 0.25     | 0.2472    | 0.25   | 3.8814          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 35              | 24.5396         | 78.1898          |
| 2.1627        | 20.0  | 60   | 2.2004          | 0.1808 | 0.25     | 0.1932    | 0.25   | 3.8785          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 39              | 24.5396         | 78.1898          |
| 1.9951        | 25.0  | 75   | 2.0773          | 0.2649 | 0.35     | 0.2922    | 0.35   | 3.8796          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 38              | 24.5396         | 78.1898          |
| 1.8128        | 30.0  | 90   | 1.9729          | 0.3619 | 0.45     | 0.3533    | 0.45   | 3.8802          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 36              | 24.5396         | 78.1898          |
| 1.6805        | 35.0  | 105  | 1.9061          | 0.4405 | 0.5      | 0.465     | 0.5    | 3.8803          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 37              | 24.5396         | 78.1898          |
| 1.5773        | 40.0  | 120  | 1.8512          | 0.3824 | 0.45     | 0.3767    | 0.45   | 3.8846          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 38              | 24.5396         | 78.1898          |
| 1.4916        | 45.0  | 135  | 1.8222          | 0.5190 | 0.55     | 0.5600    | 0.55   | 3.8846          | 83.4807          | 0.3991            | 1.9316         | 39.5640       | 40              | 24.5397         | 78.1898          |
| 1.4142        | 50.0  | 150  | 1.8056          | 0.4657 | 0.5      | 0.5267    | 0.5    | 3.8850          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 38              | 24.5397         | 78.1898          |
| 1.3555        | 55.0  | 165  | 1.7700          | 0.4657 | 0.5      | 0.5267    | 0.5    | 3.8850          | 83.4807          | 0.3991            | 1.9316         | 39.5640       | 41              | 24.5397         | 78.1898          |
| 1.3029        | 60.0  | 180  | 1.7568          | 0.4657 | 0.5      | 0.5267    | 0.5    | 3.8795          | 83.4807          | 0.3991            | 1.9316         | 39.5640       | 35              | 24.5397         | 78.1898          |
| 1.2572        | 65.0  | 195  | 1.7462          | 0.4371 | 0.45     | 0.5067    | 0.45   | 3.8802          | 83.4807          | 0.3991            | 1.9316         | 39.5640       | 40              | 24.5397         | 78.1898          |
| 1.2207        | 70.0  | 210  | 1.7215          | 0.4371 | 0.45     | 0.5067    | 0.45   | 3.8880          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 37              | 24.5397         | 78.1898          |
| 1.1915        | 75.0  | 225  | 1.7103          | 0.4657 | 0.5      | 0.5267    | 0.5    | 3.8760          | 83.4807          | 0.3991            | 1.9316         | 39.5640       | 39              | 24.5397         | 78.1898          |
| 1.1649        | 80.0  | 240  | 1.7069          | 0.4371 | 0.45     | 0.5067    | 0.45   | 3.8761          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 40              | 24.5397         | 78.1898          |
| 1.1484        | 85.0  | 255  | 1.6911          | 0.4657 | 0.5      | 0.5267    | 0.5    | 3.8747          | 83.4807          | 0.3991            | 1.9316         | 39.5640       | 35              | 24.5397         | 78.1898          |
| 1.135         | 90.0  | 270  | 1.6888          | 0.4657 | 0.5      | 0.5267    | 0.5    | 3.8753          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 37              | 24.5397         | 78.1898          |
| 1.1226        | 95.0  | 285  | 1.6860          | 0.4657 | 0.5      | 0.5267    | 0.5    | 3.8755          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 39              | 24.5397         | 78.1898          |
| 1.1217        | 100.0 | 300  | 1.6861          | 0.4657 | 0.5      | 0.5267    | 0.5    | 3.8755          | 83.4807          | 0.3990            | 1.9316         | 39.5640       | 38              | 24.5397         | 78.1898          |


### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3