Update README.md
As part of the ITANONG project's 10-billion-token Tagalog dataset, we have released a collection of pre-trained embedding models. These models were trained on the formal-text subset of the corpus, which is documented in detail in our paper. The available embedding models are summarized below:

| **Embedding Technique** | **Variant** | **Model File Format** | **Embedding Sizes** |
|:-----------------------:|:-----------:|:---------------------:|:-------------------:|
| Word2Vec | Skipgram | .bin | 20, 30, 50, 100, 200, 300 |
| Word2Vec | Skipgram | .txt | 20, 30, 50, 100, 200, 300 |
| Word2Vec | CBOW | .bin | 20, 30, 50, 100, 200, 300 |
| Word2Vec | CBOW | .txt | 20, 30, 50, 100, 200, 300 |
| FastText | Skipgram | .bin | 20, 30, 50, 100, 200, 300 |
| FastText | Skipgram | .txt | 20, 30, 50, 100, 200, 300 |
| FastText | CBOW | .bin | 20, 30, 50, 100, 200, 300 |
| FastText | CBOW | .txt | 20, 30, 50, 100, 200, 300 |
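The `.txt` files listed above follow the standard word2vec text format: a header line with the vocabulary size and dimensionality, then one word per line followed by its vector components. In practice you would load them with a library such as gensim, but as a dependency-free sketch, a minimal parser might look like this (the sample words and values are illustrative, not taken from the models):

```python
def load_word2vec_txt(lines):
    """Parse embeddings in word2vec text format.

    Expects an iterable of lines: a "vocab_size dim" header,
    then "word v1 v2 ... vdim" per line. Returns {word: [floats]}.
    """
    it = iter(lines)
    vocab_size, dim = map(int, next(it).split())
    vectors = {}
    for line in it:
        parts = line.rstrip("\n").split(" ")
        word, values = parts[0], [float(v) for v in parts[1:]]
        assert len(values) == dim, f"bad vector length for {word!r}"
        vectors[word] = values
    assert len(vectors) == vocab_size, "header/vocab mismatch"
    return vectors

# Tiny in-memory example (made-up words and values):
sample = ["2 3", "araw 0.1 0.2 0.3", "gabi 0.4 0.5 0.6"]
emb = load_word2vec_txt(sample)
```

With gensim installed, `KeyedVectors.load_word2vec_format(path, binary=False)` handles this text format directly; for FastText's native `.bin` files, `gensim.models.fasttext.load_facebook_vectors` is the usual entry point (whether the `.bin` files here use word2vec binary or FastText native format is not stated in this README, so check both).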

## Training Details

These models were trained on an NVIDIA V100 32GB GPU on the DOST-ASTI Computing and Archiving Research Environment (COARE) - https://asti.dost.gov.ph/projects/coare/
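Once loaded, the vectors can be compared like any dense word embeddings, e.g. with cosine similarity. A minimal pure-Python sketch (the input vectors below are illustrative, not actual model outputs):

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Illustrative 3-dimensional vectors:
sim = cosine_similarity([1.0, 0.0, 1.0], [1.0, 1.0, 0.0])  # 0.5
```

With gensim's `KeyedVectors`, the equivalent is `kv.similarity(word1, word2)`.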