Add doc counts for Dolma 1.7
README.md CHANGED
@@ -39,10 +39,10 @@ At the moment, there are six versions of Dolma available:
 
 | **Version** | **Default?** | **Release Date** | **Size** (gzip) | **Description** |
 |--|:--:|--|--|--|
-| `v1_7` | ✅ | 2024-04-15 | 4.5 TB | Used to train [OLMo-7B-v1.7](https://huggingface.co/allenai/OLMo-7b-v1.7). |
-| `v1_6` | | 2024-01-31 | 5.4 TB | An update to v1.5 with some
+| `v1_7` | ✅ | 2024-04-15 | 4.5 TB | Used to train [OLMo-7B-v1.7](https://huggingface.co/allenai/OLMo-7b-v1.7). New sources, more quality filtering, fuzzy deduplication. |
+| `v1_6` | | 2024-01-31 | 5.4 TB | An update to v1.5 with some deduplication of documents with too few tokens or too many repeated n-grams. |
 | `v1_6-sample` | | 2024-01-31 | 16.4 GB | A smaller sample of Dolma, with roughly 10 billion tokens. Useful for data exploration. |
-| `v1_5` | | 2023-10-31 | 6.4 TB |
+| `v1_5` | | 2023-10-31 | 6.4 TB | Used to train [OLMo-1B](https://huggingface.co/allenai/OLMo-1B). Roughly 3 trillion tokens. |
 | `v1_5-sample` | | 2023-10-31 | 2.9 TB | A sample of roughly 1.9 trillion tokens used to train [OLMo-7B](https://huggingface.co/allenai/OLMo-7B) |
 | `v1` | | 2023-08-18 | 6.0 TB | The first version of Dolma. |
 
@@ -66,7 +66,7 @@ At the moment, there are six versions of Dolma available:
 | Project Gutenberg | [Project Gutenberg](https://www.gutenberg.org) via Dolma v1.6 | No | 0.0556 | 5.3 | 100% | Mar 2023 | Same as Dolma v1.6 |
 | MegaWika | [MegaWika](https://huggingface.co/datasets/hltcoe/megawika) | Yes | 3.2 | 4.6 | 100% | Jul 2023 | English web pages cited from Wikipedia; curated using the full Dolma pipeline. |
 | Wikipedia & Wikibooks | [Wikimedia](https://dumps.wikimedia.org) via Dolma v1.6 | No | 6.2 | 3.7 | 200% | Mar 2023 | Same as Dolma v1.6 |
-| **Total** | | |
+| **Total** | | | **2,532.0** | **2,308.5** | **1,715.1** | **Oct 2023** | |
 
 (A subset of the total data was used to train OLMo 7B-v1.7. The token counts above are for the full dataset; weighting them by the sampling proportions gives the actual number of tokens used for training: 1.715 trillion.)
 
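The Total row added above folds the sampling proportions into the token count: the effective training tokens are roughly the per-source token counts weighted by their sample proportions. A minimal Python sketch of that arithmetic, using only the three rows visible in this hunk (token counts in billions); the full table has many more sources, which is how the totals reach 2,308.5B raw and 1,715.1B sampled:

```python
# Illustrative only: effective training tokens = raw tokens * sample proportion.
# The three sources below are the ones visible in this hunk of the diff.
rows = {
    # source: (tokens_in_billions, sample_proportion)
    "Project Gutenberg": (5.3, 1.00),
    "MegaWika": (4.6, 1.00),
    "Wikipedia & Wikibooks": (3.7, 2.00),  # 200% = seen roughly twice during training
}

for name, (tokens, proportion) in rows.items():
    print(f"{name}: {tokens * proportion:.1f}B effective tokens")

raw_total = sum(t for t, _ in rows.values())
effective_total = sum(t * p for t, p in rows.values())
print(f"These rows alone: {raw_total:.1f}B raw -> {effective_total:.1f}B effective")
```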
@@ -85,9 +85,6 @@ At the moment, there are six versions of Dolma available:
 | **Total** | | **11,519** | **4,367** | **2,318** | **3,059** |
 
 
-(Size difference between `v1_6` and `v1_5` is due to different set of metadata included in files: we removed redundant metadata in `v1_6`.)
-
-
 ## Download
 
 The fastest way to download Dolma is to clone this repository and use the files in the `url` directory.
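For reference, a minimal download sketch assuming the files in the `url` directory are plain-text lists with one gzip shard URL per line; the `v1_7.txt` filename and the output directory below are assumptions for illustration, not paths confirmed by this diff:

```python
# Minimal sketch: fetch Dolma shards listed in one of the URL files from the
# cloned repository. URL_LIST is an assumed path; adjust to the actual file.
import os
import urllib.request

URL_LIST = "url/v1_7.txt"   # assumed: one shard URL per line
OUT_DIR = "dolma_data"

os.makedirs(OUT_DIR, exist_ok=True)
with open(URL_LIST) as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    dest = os.path.join(OUT_DIR, os.path.basename(url))
    if os.path.exists(dest):
        continue  # resume-friendly: skip shards already on disk
    print(f"downloading {url} -> {dest}")
    urllib.request.urlretrieve(url, dest)
```

`wget -i` on the same list achieves much the same thing; the sketch only adds a skip-if-present check so interrupted downloads can be resumed.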