NonMatchingSplitsSizesError

#2
by yuyang-xue-ed - opened

Hello, there is an error when loading the data. I tried re-downloading the data, but the result is still the same:

>>> dataset = load_dataset("OPTML-Group/UnlearnCanvas")
Resolving data files: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 331/331 [00:00<00:00, 768.19it/s]
Downloading data: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 331/331 [00:00<00:00, 1601.78files/s]
Generating train split: 52745 examples [24:56, 35.25 examples/s]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "site-packages/datasets/load.py", line 2609, in load_dataset
    builder_instance.download_and_prepare(
  File "site-packages/datasets/builder.py", line 1027, in download_and_prepare
    self._download_and_prepare(
  File "/site-packages/datasets/builder.py", line 1140, in _download_and_prepare
    verify_splits(self.info.splits, split_dict)
  File "site-packages/datasets/utils/info_utils.py", line 101, in verify_splits
    raise NonMatchingSplitsSizesError(str(bad_splits))
datasets.utils.info_utils.NonMatchingSplitsSizesError: [{'expected': SplitInfo(name='train', num_bytes=76080381824.0, num_examples=24400, shard_lengths=None, dataset_name=None), 'recorded': SplitInfo(name='train', num_bytes=167271171931, num_examples=52745, shard_lengths=[160, 320, 320, 160, 160, 160, 320, 160, 160, 160, 320, 320, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 320, 320, 320, 320, 320, 320, 320, 320, 320, 160, 160, 160, 160, 160, 160, 320, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 320, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 159, 159, 159, 318, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 159, 159], dataset_name='unlearn_canvas')}]

Does this mean the data on Hugging Face is incomplete?
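
(For reference: the "expected" numbers in this error come from the split metadata recorded for the dataset, while "recorded" is what was actually generated from the downloaded files. If the files themselves are fine and only the recorded metadata is stale, one way to get past the check is to disable verification. A minimal sketch, assuming datasets >= 2.9.1, where the verification_mode argument is available; older releases used ignore_verifications=True instead:

from datasets import load_dataset

# Skip the split-size verification that raises NonMatchingSplitsSizesError.
# This only bypasses the metadata check; it does not repair the data.
dataset = load_dataset("OPTML-Group/UnlearnCanvas", verification_mode="no_checks")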

OPTML Group @ MSU org

Hi, thanks for your interest in our work. This is weird: I have tried to reproduce your error in different environments on my side, but I did not encounter this problem. Can you please provide more details?

Hello, I tried load_dataset again, but I still got the error:

(new) [ic084yyx@cirrus-login1 small_diffusion_finetuning]$ python data.py 
Downloading readme: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 4.16k/4.16k [00:00<00:00, 2.61MB/s]
Resolving data files: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 331/331 [00:00<00:00, 1913.22it/s]
Downloading data: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 331/331 [1:54:48<00:00, 20.81s/files]
Generating train split: 52745 examples [12:42, 69.15 examples/s]                                             
Traceback (most recent call last):
  File "/scratch/space1/ic084/unlearning/finetuning/small_diffusion_finetuning/data.py", line 3, in <module>
    ds = load_dataset("OPTML-Group/UnlearnCanvas")
  File "/mnt/lustre/e1000/home/ic084/ic084/shared/team2/new/lib/python3.10/site-packages/datasets/load.py", line 2616, in load_dataset
    builder_instance.download_and_prepare(
  File "/mnt/lustre/e1000/home/ic084/ic084/shared/team2/new/lib/python3.10/site-packages/datasets/builder.py", line 1029, in download_and_prepare
    self._download_and_prepare(
  File "/mnt/lustre/e1000/home/ic084/ic084/shared/team2/new/lib/python3.10/site-packages/datasets/builder.py", line 1142, in _download_and_prepare
    verify_splits(self.info.splits, split_dict)
  File "/mnt/lustre/e1000/home/ic084/ic084/shared/team2/new/lib/python3.10/site-packages/datasets/utils/info_utils.py", line 77, in verify_splits
    raise NonMatchingSplitsSizesError(str(bad_splits))
datasets.exceptions.NonMatchingSplitsSizesError: [{'expected': SplitInfo(name='train', num_bytes=76080381824.0, num_examples=24400, shard_lengths=None, dataset_name=None), 'recorded': SplitInfo(name='train', num_bytes=167271171931, num_examples=52745, shard_lengths=[160, 320, 320, 160, 160, 160, 320, 160, 160, 160, 320, 320, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 320, 320, 320, 320, 320, 320, 320, 320, 320, 160, 160, 160, 160, 160, 160, 320, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 160, 320, 320, 320, 160, 160, 160, 160, 160, 160, 160, 160, 160, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 159, 159, 159, 318, 159, 159, 159, 159, 159, 159, 159, 159, 318, 318, 318, 318, 318, 318, 318, 318, 318, 318, 159, 159, 159, 159], dataset_name='unlearn_canvas')}]
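
(A note that may help with debugging: the "expected" SplitInfo is read from the dataset's stored metadata rather than from the downloaded files, so the mismatch can be inspected without a full re-download. A minimal sketch using the standard load_dataset_builder API:

from datasets import load_dataset_builder

# The builder's info reflects the metadata the verification step checks against,
# so this shows the expected split sizes without downloading the data files.
builder = load_dataset_builder("OPTML-Group/UnlearnCanvas")
print(builder.info.splits)  # expected here: train with num_examples=24400

If this prints 24,400 examples while the repository actually contains 52,745, the split metadata is simply out of date relative to the uploaded files.)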
