---
library_name: transformers
base_model: ellabettison/logo-matching-base
tags:
- generated_from_trainer
model-index:
- name: logo-matching-base
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# logo-matching-base

This model is a fine-tuned version of [ellabettison/logo-matching-base](https://huggingface.co./ellabettison/logo-matching-base) on an unknown dataset.
It achieves the following results on the evaluation set (a sketch of computing such clustering metrics is shown after the list):
- Adjusted Rand Score: 0.1048
- Adjusted Mutual Info Score: 0.1103
- Homogeneity Score: 0.1072
- Completeness Score: 0.6399
- Fowlkes Mallows Score: 0.4934
- Pair Confusion Matrix: [[13442, 34208], [1176, 12430]]
- Loss: 0.1048
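
The evaluation script for this checkpoint is not included in the card. As a point of reference only, the snippet below is a minimal sketch of how clustering metrics with these names can be computed with scikit-learn; `labels_true` and `labels_pred` are hypothetical label arrays, not the card's evaluation data.

```python
# Minimal sketch, not the evaluation code used for this card: scikit-learn metrics
# of the same names, computed from hypothetical ground-truth and predicted cluster labels.
from sklearn.metrics import (
    adjusted_mutual_info_score,
    adjusted_rand_score,
    completeness_score,
    fowlkes_mallows_score,
    homogeneity_score,
)
from sklearn.metrics.cluster import pair_confusion_matrix

labels_true = [0, 0, 1, 1, 2, 2]  # hypothetical ground-truth logo groups
labels_pred = [0, 0, 1, 2, 2, 2]  # hypothetical predicted clusters

print("Adjusted Rand Score:", adjusted_rand_score(labels_true, labels_pred))
print("Adjusted Mutual Info Score:", adjusted_mutual_info_score(labels_true, labels_pred))
print("Homogeneity Score:", homogeneity_score(labels_true, labels_pred))
print("Completeness Score:", completeness_score(labels_true, labels_pred))
print("Fowlkes Mallows Score:", fowlkes_mallows_score(labels_true, labels_pred))

# 2x2 matrix of sample-pair counts: [[true negatives, false positives],
#                                    [false negatives, true positives]]
print("Pair Confusion Matrix:", pair_confusion_matrix(labels_true, labels_pred))
```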

## Model description

More information needed

## Intended uses & limitations

More information needed
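
Pending a fuller description, the sketch below shows one generic way to load and inspect the base checkpoint named in this card with the `transformers` Auto classes. This is an assumption-laden placeholder, not documented usage: the fine-tuned checkpoint's repository id, its architecture, and the expected preprocessing pipeline are all unstated here.

```python
# Hedged sketch, not confirmed usage: load the base checkpoint linked in this card with
# the generic transformers Auto classes and inspect which architecture it actually uses.
# Nothing model-specific is assumed beyond the repository id below.
from transformers import AutoConfig, AutoModel

model_id = "ellabettison/logo-matching-base"  # base model linked in this card

config = AutoConfig.from_pretrained(model_id)
print(config.model_type)  # reveals the underlying architecture

model = AutoModel.from_pretrained(model_id)
model.eval()  # inference mode; the matching / feature-extraction logic is not documented
```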

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
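
The training script itself is not published with this card. As an illustration only, the listed hyperparameters map roughly onto a `transformers.TrainingArguments` configuration like the sketch below; the output directory is hypothetical, the batch sizes are assumed to be per device, and anything not listed above is left at its default.

```python
# Illustrative sketch only: the hyperparameters above expressed as TrainingArguments.
# "logo-matching-sketch" is a hypothetical output directory; batch sizes are assumed
# to be per-device values, and all unlisted arguments keep their defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="logo-matching-sketch",   # hypothetical
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",                 # AdamW (torch) with the betas/epsilon below
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
)
```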

### Training results

| Training Loss | Epoch | Step | Adjusted Rand Score | Adjusted Mutual Info Score | Homogeneity Score | Completeness Score | Fowlkes Mallows Score | Pair Confusion Matrix           | Validation Loss |
|:-------------:|:-----:|:----:|:-------------------:|:--------------------------:|:-----------------:|:------------------:|:---------------------:|:-------------------------------:|:---------------:|
| 0.0988        | 1.0   | 34   | 0.0667              | 0.1257                     | 0.1492            | 0.5530             | 0.4345                | [[17354, 30296], [3408, 10198]] | 0.0667          |
| 0.0918        | 2.0   | 68   | 0.0512              | 0.0801                     | 0.1155            | 0.4762             | 0.4303                | [[15840, 31810], [3306, 10300]] | 0.0512          |
| 0.0972        | 3.0   | 102  | 0.0494              | 0.1485                     | 0.2067            | 0.4977             | 0.3496                | [[27372, 20278], [6908, 6698]]  | 0.0494          |
| 0.1027        | 4.0   | 136  | 0.1550              | 0.1639                     | 0.2052            | 0.5160             | 0.4118                | [[30918, 16732], [6132, 7474]]  | 0.1550          |
| 0.0944        | 5.0   | 170  | 0.1798              | 0.1948                     | 0.2833            | 0.5017             | 0.3910                | [[35784, 11866], [7490, 6116]]  | 0.1798          |
| 0.09          | 6.0   | 204  | 0.1186              | 0.1355                     | 0.1836            | 0.5103             | 0.4414                | [[23042, 24608], [4096, 9510]]  | 0.1186          |
| 0.0858        | 7.0   | 238  | 0.0756              | 0.1533                     | 0.2163            | 0.4937             | 0.3570                | [[29292, 18358], [7030, 6576]]  | 0.0756          |
| 0.0735        | 8.0   | 272  | 0.1107              | 0.1555                     | 0.2280            | 0.4815             | 0.3651                | [[31962, 15688], [7288, 6318]]  | 0.1107          |
| 0.0868        | 9.0   | 306  | 0.2523              | 0.2273                     | 0.3022            | 0.5371             | 0.4428                | [[36956, 10694], [6766, 6840]]  | 0.2523          |
| 0.0932        | 10.0  | 340  | 0.2318              | 0.2251                     | 0.3703            | 0.4901             | 0.3838                | [[41708, 5942], [9010, 4596]]   | 0.2318          |
| 0.0642        | 11.0  | 374  | 0.1840              | 0.1915                     | 0.2284            | 0.5528             | 0.4331                | [[31348, 16302], [5754, 7852]]  | 0.1840          |
| 0.0918        | 12.0  | 408  | 0.1472              | 0.1783                     | 0.2400            | 0.5135             | 0.3946                | [[32168, 15482], [6722, 6884]]  | 0.1472          |
| 0.0818        | 13.0  | 442  | 0.2861              | 0.2445                     | 0.3090            | 0.5556             | 0.4658                | [[37650, 10000], [6500, 7106]]  | 0.2861          |
| 0.0877        | 14.0  | 476  | 0.1159              | 0.1723                     | 0.2474            | 0.4934             | 0.3621                | [[32834, 14816], [7496, 6110]]  | 0.1159          |
| 0.0849        | 15.0  | 510  | 0.1053              | 0.1092                     | 0.1086            | 0.6293             | 0.4935                | [[13498, 34152], [1182, 12424]] | 0.1053          |
| 0.0927        | 16.0  | 544  | 0.1119              | 0.1599                     | 0.2309            | 0.4880             | 0.3674                | [[31812, 15838], [7216, 6390]]  | 0.1119          |
| 0.0794        | 17.0  | 578  | 0.2050              | 0.2036                     | 0.2405            | 0.5662             | 0.4462                | [[31942, 15708], [5590, 8016]]  | 0.2050          |
| 0.0888        | 18.0  | 612  | 0.2215              | 0.2213                     | 0.3139            | 0.5143             | 0.4002                | [[38766, 8884], [7984, 5622]]   | 0.2215          |
| 0.0815        | 19.0  | 646  | 0.1725              | 0.1935                     | 0.2815            | 0.4997             | 0.3833                | [[35946, 11704], [7668, 5938]]  | 0.1725          |
| 0.0796        | 20.0  | 680  | 0.1048              | 0.1103                     | 0.1072            | 0.6399             | 0.4934                | [[13442, 34208], [1176, 12430]] | 0.1048          |


### Framework versions

- Transformers 4.47.1
- PyTorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0