---
task_categories:
- feature-extraction
---

## Model Description



As part of the iTANONG project's 10-billion-token Tagalog dataset, we release a collection of pre-trained embedding models. These models were trained on the formal-text portion of the corpus, which is described in detail in our paper. The available embedding models are listed below; a short loading sketch follows the table.

| **Embedding Technique** | **Variant** | **Model File Format** | **Embedding Size** |
|:-----------------------:|:-----------:|:---------------------:|:------------------:|
|         Word2Vec        |   Skipgram  |          .bin         |         20         |
|         Word2Vec        |   Skipgram  |          .bin         |         30         |
|         Word2Vec        |   Skipgram  |          .bin         |         50         |
|         Word2Vec        |   Skipgram  |          .bin         |         100        |
|         Word2Vec        |   Skipgram  |          .bin         |         200        |
|         Word2Vec        |   Skipgram  |          .bin         |         300        |
|         Word2Vec        |   Skipgram  |          .txt         |         20         |
|         Word2Vec        |   Skipgram  |          .txt         |         30         |
|         Word2Vec        |   Skipgram  |          .txt         |         50         |
|         Word2Vec        |   Skipgram  |          .txt         |         100        |
|         Word2Vec        |   Skipgram  |          .txt         |         200        |
|         Word2Vec        |   Skipgram  |          .txt         |         300        |
|         Word2Vec        |     CBOW    |          .bin         |         20         |
|         Word2Vec        |     CBOW    |          .bin         |         30         |
|         Word2Vec        |     CBOW    |          .bin         |         50         |
|         Word2Vec        |     CBOW    |          .bin         |         100        |
|         Word2Vec        |     CBOW    |          .bin         |         200        |
|         Word2Vec        |     CBOW    |          .bin         |         300        |
|         Word2Vec        |     CBOW    |          .txt         |         20         |
|         Word2Vec        |     CBOW    |          .txt         |         30         |
|         Word2Vec        |     CBOW    |          .txt         |         50         |
|         Word2Vec        |     CBOW    |          .txt         |         100        |
|         Word2Vec        |     CBOW    |          .txt         |         200        |
|         Word2Vec        |     CBOW    |          .txt         |         300        |
|         FastText        |   Skipgram  |          .bin         |         20         |
|         FastText        |   Skipgram  |          .bin         |         30         |
|         FastText        |   Skipgram  |          .bin         |         50         |
|         FastText        |   Skipgram  |          .bin         |         100        |
|         FastText        |   Skipgram  |          .bin         |         200        |
|         FastText        |   Skipgram  |          .bin         |         300        |
|         FastText        |   Skipgram  |          .txt         |         20         |
|         FastText        |   Skipgram  |          .txt         |         30         |
|         FastText        |   Skipgram  |          .txt         |         50         |
|         FastText        |   Skipgram  |          .txt         |         100        |
|         FastText        |   Skipgram  |          .txt         |         200        |
|         FastText        |   Skipgram  |          .txt         |         300        |
|         FastText        |     CBOW    |          .bin         |         20         |
|         FastText        |     CBOW    |          .bin         |         30         |
|         FastText        |     CBOW    |          .bin         |         50         |
|         FastText        |     CBOW    |          .bin         |         100        |
|         FastText        |     CBOW    |          .bin         |         200        |
|         FastText        |     CBOW    |          .bin         |         300        |
|         FastText        |     CBOW    |          .txt         |         20         |
|         FastText        |     CBOW    |          .txt         |         30         |
|         FastText        |     CBOW    |          .txt         |         50         |
|         FastText        |     CBOW    |          .txt         |         100        |
|         FastText        |     CBOW    |          .txt         |         200        |
|         FastText        |     CBOW    |          .txt         |         300        |
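
Below is a minimal loading sketch using Gensim. The file names are placeholders (substitute the actual files from this repository), and it assumes the `.txt` files follow the standard word2vec text format while the FastText `.bin` files are native fastText binaries:

```python
# Minimal sketch, assuming standard word2vec/fastText file formats.
# File names below are hypothetical -- replace with the actual files
# distributed with this model card.
from gensim.models import KeyedVectors
from gensim.models.fasttext import load_facebook_vectors

# Word2Vec vectors in text format (use binary=True for a .bin variant
# saved in the original word2vec C binary format).
w2v = KeyedVectors.load_word2vec_format("word2vec_skipgram_300.txt", binary=False)
print(w2v["wika"].shape)                  # e.g. (300,) for the 300-dim model
print(w2v.most_similar("bansa", topn=5))  # nearest neighbours by cosine similarity

# FastText binary: subword n-grams provide vectors for out-of-vocabulary words.
ft = load_facebook_vectors("fasttext_skipgram_300.bin")
print(ft["pinakamakabago"].shape)         # works even for unseen word forms
```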



## Training Details
These models were trained on an NVIDIA V100 (32 GB) GPU on the DOST-ASTI Computing and Archiving Research Environment (COARE) - https://asti.dost.gov.ph/projects/coare/

### Training Data
The corpus was compiled from both formal and informal sources; these embedding models were trained on the 194,001 instances collected from formal channels. More information on pre-processing and training parameters can be found in our paper; an illustrative training sketch is given below.
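
The following is an illustrative sketch of how such Word2Vec and FastText embeddings can be trained with Gensim. The hyper-parameters and the toy sentences are placeholders, not the settings reported in the paper:

```python
# Illustrative training sketch only -- the actual pre-processing and
# hyper-parameters used for these models are described in the paper.
from gensim.models import Word2Vec, FastText

# `sentences` stands in for the tokenised formal-text corpus.
sentences = [["ang", "wika", "ay", "buhay"], ["mahalaga", "ang", "edukasyon"]]

# sg=1 -> skipgram, sg=0 -> CBOW; vector_size matches the embedding sizes above.
w2v = Word2Vec(sentences, vector_size=300, sg=1, window=5, min_count=1)
ft = FastText(sentences, vector_size=300, sg=0, window=5, min_count=1)

# Export the word vectors in the text format listed in the table.
w2v.wv.save_word2vec_format("word2vec_skipgram_300.txt", binary=False)
```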

## Citation
Paper: iTANONG-DS : A Collection of Benchmark Datasets for Downstream Natural Language Processing Tasks on Select Philippine Languages


BibTeX:
```
@inproceedings{visperas-etal-2023-itanong,
    title = "i{TANONG}-{DS} : A Collection of Benchmark Datasets for Downstream Natural Language Processing Tasks on Select {P}hilippine Languages",
    author = "Visperas, Moses L.  and
      Borjal, Christalline Joie  and
      Adoptante, Aunhel John M  and
      Abacial, Danielle Shine R.  and
      Decano, Ma. Miciella  and
      Peramo, Elmer C",
    editor = "Abbas, Mourad  and
      Freihat, Abed Alhakim",
    booktitle = "Proceedings of the 6th International Conference on Natural Language and Speech Processing (ICNLSP 2023)",
    month = dec,
    year = "2023",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.icnlsp-1.34",
    pages = "316--323",
}
```