Extremely slow download due to tiny files
Downloading this dataset takes a very long time (~12 hours), even with a very good internet connection (in the same amount of time the day before, I downloaded ~2 TiB). I suspect this is because the dataset is composed of many very small files. The last few file sizes I saw were 27.7k, 30.1k, 23.8k, 16.2k, 29.6k, 12.5k, 27.8k, 17.0k, 24.2k, 36.9k, 29.7k, 50.2k, 28.8k, 13.6k, 14.2k, and 34.9k.
I don't know why the dataset is set up like this, but I think it would be much more efficient to have fewer, larger files.
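As a possible workaround, the raw files can be fetched with several parallel workers via `huggingface_hub`, which tends to help with repos made of many tiny files. A minimal sketch; `snapshot_download` and its `max_workers` parameter are real, but the worker count and target directory below are just example values:

```python
# Sketch: download the raw dataset files with parallel workers.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="cerebras/SlimPajama-627B",
    repo_type="dataset",          # this is a dataset repo, not a model
    local_dir="SlimPajama-627B",  # example target directory
    max_workers=16,               # parallel download threads (example value)
)
```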
I'm running into the same problem! I don't even know how many tiny files there are!
@SinclairWang
I found you can download it much quicker if you don't use HuggingFace's `datasets` library, but instead use `git lfs`:

```
git lfs install
git clone https://huggingface.co./datasets/cerebras/SlimPajama-627B
```
Thanks!!! Your tip saved my day!
@kaden-uhlig @SinclairWang hey guys, could you give me an estimate of how many TB the total dataset amounts to?
Size of the compressed dataset: 895 GB
@SinclairWang @kaden-uhlig Can you please also share a script or some code to load the dataset after downloading it with git lfs?
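Not an official answer, but a minimal sketch of how such a loading script could look, assuming the clone contains zstd-compressed JSON-lines shards (`.jsonl.zst`) and the `zstandard` package is installed; the glob patterns are guesses to adjust to the on-disk layout:

```python
# Sketch: load the locally cloned shards with the datasets library.
from datasets import load_dataset

# Assumed layout -- adjust the globs to what git lfs actually checked out.
data_files = {
    "train": "SlimPajama-627B/train/*/*.jsonl.zst",
    "validation": "SlimPajama-627B/validation/*/*.jsonl.zst",
    "test": "SlimPajama-627B/test/*/*.jsonl.zst",
}
ds = load_dataset("json", data_files=data_files, streaming=True)
print(next(iter(ds["validation"])))  # peek at one record
```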
But when I check the size of the downloaded dataset, it is only 125MB for val/test and 234MB for train. That can't be right?
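A likely cause, worth checking: if `git lfs install` wasn't run before cloning, git downloads only the small LFS pointer files instead of the actual data, which would explain sizes in the low hundreds of MB. Running the pull inside the repo should fetch the real files:

```
cd SlimPajama-627B
git lfs install   # enable the LFS filters if they weren't active during the clone
git lfs pull      # replace pointer files with the actual data
```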