safikhan committed
Commit 8d04741
1 Parent(s): cb8d8a9

Update README.md

Files changed (1): README.md (+8 −8)
README.md CHANGED
@@ -176,7 +176,7 @@ Sangraha is the largest high-quality, cleaned Indic language pretraining data co
 
 **More information**:
 
-- For detailed information on the curation and cleaning process of Sangraha, please checkout our paper [on Arxiv](https://arxiv.org/);
+- For detailed information on the curation and cleaning process of Sangraha, please check out our paper [on arXiv](https://arxiv.org/abs/2403.06350);
 - Check out the scraping and cleaning pipelines used to curate Sangraha [on GitHub](https://github.com/AI4Bharat/IndicLLMSuite);
 
 ## Getting Started
@@ -226,12 +226,12 @@ Sangraha contains three broad components:
 To cite Sangraha, please use:
 
 ```
-@misc{cerebras2023slimpajama,
-      author = {Soboleva, Daria and Al-Khateeb, Faisal and Myers, Robert and Steeves, Jacob R and Hestness, Joel and Dey, Nolan},
-      title = {{SlimPajama: A 627B token cleaned and deduplicated version of RedPajama}},
-      month = June,
-      year = 2023,
-      howpublished = {\url{https://www.cerebras.net/blog/slimpajama-a-627b-token-cleaned-and-deduplicated-version-of-redpajama}},
-      url = {https://huggingface.co/datasets/cerebras/SlimPajama-627B},
+@misc{khan2024indicllmsuite,
+      title={IndicLLMSuite: A Blueprint for Creating Pre-training and Fine-Tuning Datasets for Indian Languages},
+      author={Mohammed Safi Ur Rahman Khan and Priyam Mehta and Ananth Sankar and Umashankar Kumaravelan and Sumanth Doddapaneni and Suriyaprasaad G and Varun Balan G and Sparsh Jain and Anoop Kunchukuttan and Pratyush Kumar and Raj Dabre and Mitesh M. Khapra},
+      year={2024},
+      eprint={2403.06350},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL}
 }
 ```
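Since the card's "Getting Started" section points readers at loading the dataset, a minimal sketch of streaming a few rows with the Hugging Face `datasets` library may help; note that the repo id `ai4bharat/sangraha` and the `data_dir` value `verified/hin` are assumptions about the Hub layout, not taken from this page, so adjust them to the dataset's actual structure.

```python
def load_sangraha_sample(data_dir="verified/hin", n=1):
    """Stream a few rows from Sangraha without downloading the full corpus.

    NOTE: the repo id and data_dir below are assumptions about the Hub
    layout; check the dataset page and adjust as needed.
    """
    # Lazy import so this module still loads if `datasets` is absent.
    from datasets import load_dataset

    ds = load_dataset(
        "ai4bharat/sangraha",  # assumed Hub repo id
        data_dir=data_dir,     # assumed subset path (e.g. verified Hindi)
        split="train",
        streaming=True,        # iterate parquet shards instead of downloading
    )
    return list(ds.take(n))
```

Streaming mode is worth the extra line here: Sangraha is a pretraining-scale corpus, and `streaming=True` lets you inspect records without materializing the whole split on disk.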