Datasets:
Can't load `PolyAI/minds14` anymore
Hi! Since the latest change (March 30), the following code is no longer working as before. Do you know how to make it work again, please? Thank you!
from datasets import load_dataset
dataset = load_dataset("PolyAI/minds14", name="en-US", split="train") # doctest: +IGNORE_RESULT
The error I got is
FileNotFoundError: Local file /root/.cache/huggingface/datasets/downloads/0612aba7d917bf92e6597c0ee8627f665afb33e56fd7dff2d72b81f0877dd3f8/MInDS-14/audio.zip doesn't exist
well, same problem.
Same problem, too.
same problem.
Sorry, we had to change the download link - updated to another link
The new link returns a connection error: ConnectionError: Couldn't reach https://www.dropbox.com/s/e2us0hcs3ilr20e/MInDS-14.zip?dl=1 (error 405)
Sorry, it's now working again
Hm. I tried again and got the same connection error.
I also tried the dropbox link directly in browser, here's what I got:
Link temporarily disabled. This can happen when the link has been shared or downloaded too many times in a day. Check back later and we'll open access to more people.
Ok, upgraded the account, I think the link is enabled again.
It seems the issue is back again today. The CI gets
```py
>>> from datasets import load_dataset, Audio

>>> dataset = load_dataset("PolyAI/minds14", name="en-US", split="train") # doctest: +IGNORE_RESULT
UNEXPECTED EXCEPTION: ConnectionError("Couldn't reach https://www.dropbox.com/s/e2us0hcs3ilr20e/MInDS-14.zip?dl=1 (error 403)")
Traceback (most recent call last):
File "/usr/lib/python3.8/doctest.py", line 1336, in __run
exec(compile(example.source, filename, "single",
File "<doctest quicktour.mdx[9]>", line 1, in <module>
File "/usr/local/lib/python3.8/dist-packages/datasets/load.py", line 1791, in load_dataset
builder_instance.download_and_prepare(
File "/usr/local/lib/python3.8/dist-packages/datasets/builder.py", line 891, in download_and_prepare
self._download_and_prepare(
File "/usr/local/lib/python3.8/dist-packages/datasets/builder.py", line 1651, in _download_and_prepare
super()._download_and_prepare(
File "/usr/local/lib/python3.8/dist-packages/datasets/builder.py", line 964, in _download_and_prepare
split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
File "/root/.cache/huggingface/modules/datasets_modules/datasets/PolyAI--minds14/65c7e0f3be79e18a6ffaf879a083daf706312d421ac90d25718459cbf3c42696/minds14.py", line 132, in _split_generators
archive_path = dl_manager.download_and_extract(self.config.data_url)
File "/usr/local/lib/python3.8/dist-packages/datasets/download/download_manager.py", line 564, in download_and_extract
return self.extract(self.download(url_or_urls))
File "/usr/local/lib/python3.8/dist-packages/datasets/download/download_manager.py", line 427, in download
downloaded_path_or_paths = map_nested(
File "/usr/local/lib/python3.8/dist-packages/datasets/utils/py_utils.py", line 435, in map_nested
return function(data_struct)
File "/usr/local/lib/python3.8/dist-packages/datasets/download/download_manager.py", line 453, in _download
return cached_path(url_or_filename, download_config=download_config)
File "/usr/local/lib/python3.8/dist-packages/datasets/utils/file_utils.py", line 183, in cached_path
output_path = get_from_cache(
File "/usr/local/lib/python3.8/dist-packages/datasets/utils/file_utils.py", line 568, in get_from_cache
raise ConnectionError(f"Couldn't reach {url} (error {response.status_code})")
ConnectionError: Couldn't reach https://www.dropbox.com/s/e2us0hcs3ilr20e/MInDS-14.zip?dl=1 (error 403)
```
Could you try again? The link seems to be working
It's broken for me still (exact same error as @ydshieh ). It does work if I visit the link in a browser.
Update: looks to be fixed now! Thanks!
Fails for me now: ConnectionError: Couldn't reach https://www.dropbox.com/s/e2us0hcs3ilr20e/MInDS-14.zip?dl=1 (error 403). Confirmed that I can reach it with a browser.
I am having the same issue: ConnectionError: Couldn't reach https://www.dropbox.com/s/e2us0hcs3ilr20e/MInDS-14.zip?dl=1 (error 403)
Same problem, too.
ConnectionError: Couldn't reach https://www.dropbox.com/s/e2us0hcs3ilr20e/MInDS-14.zip?dl=1 (SSLError(MaxRetryError("HTTPSConnectionPool(host='www.dropbox.com', port=443): Max retries exceeded with url: /s/e2us0hcs3ilr20e/MInDS-14.zip?dl=1 (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1131)')))")))
I have the same issue:
ConnectionError: Couldn't reach https://www.dropbox.com/s/e2us0hcs3ilr20e/MInDS-14.zip?dl=1 (error 429)
and when opened in a browser it shows the following message:
Link temporarily disabled
This can happen when the link has been shared or downloaded too many times in a day. Check back later and we'll open access to more people.
Hi! Since 10 days ago, the following code is no longer working as before. Do you know how to make it work again, please? Thank you!
from datasets import load_dataset
dataset = load_dataset("PolyAI/minds14", name="en-US", split="train")
Hi! I'm hitting the same issue now.
ConnectionError: Couldn't reach https://www.dropbox.com/s/e2us0hcs3ilr20e/MInDS-14.zip?dl=1 (error 429)
When I visit the link, it shows the message:
Link temporarily disabled
This can happen when the link has been shared or downloaded too many times in a day. Check back later and we'll open access to more people. Learn more.
Hi there, still the same issue :'(
Is it possible to refresh the link and receive feedback,
or to host the dataset somewhere else?
Thx in advance <3
GA
@gantonacci95, you can try:
# download to disk
!huggingface-cli download PolyAI/minds14 --repo-type dataset --revision refs/convert/parquet --local-dir . --local-dir-use-symlinks False --include 'en-US/*'
from datasets import load_dataset
dataset = load_dataset('./en-US', split="train")
Hello, I'm having a similar issue with the dropbox url. The error I get is:
ConnectionError: Couldn't reach https://www.dropbox.com/s/e2us0hcs3ilr20e/MInDS-14.zip?dl=1 (error 429)
And going to the url directly it seems this is being blocked by dropbox itself.
I don't have the context, I'm curious if there are other options for hosting large files. Maybe an AWS bucket?
I just started HF's audio course and am having the same problem with connection errors when attempting to download the dataset.
Same problem too
It's still an issue. Please advise on how to proceed.
I just ran the example below on Google Colab and it works (datasets==2.16.0):
!pip show datasets
from datasets import load_dataset
dataset = load_dataset("PolyAI/minds14", name="en-US", split="train")
print(dataset)
I think it is probably because the host server has some rate limitation.
@dgerz Do you think it's possible to host the dataset files on the Hub directly? It will definitely ease everyone's life + this dataset gets more usage and impact :-)
I had the same problem, but I kept trying load_dataset("PolyAI/minds14", "en-US") every few hours and it randomly worked, even though the browser still says 'link temporarily disabled'. So if you hit this issue, keep retrying once every few hours.
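Since the throttling seems to clear on its own, the retry-every-few-hours approach can be automated. A minimal sketch: `load_with_retry` is my own helper (not part of `datasets`), and the one-hour default wait is an arbitrary guess at how long the Dropbox limit takes to reset.

```python
import time

def load_with_retry(loader, attempts=5, wait_seconds=3600):
    """Call a zero-argument loader until it succeeds or attempts run out.

    Example loader (assumption, matching the snippet above):
        lambda: load_dataset("PolyAI/minds14", name="en-US", split="train")
    """
    for attempt in range(1, attempts + 1):
        try:
            return loader()
        except ConnectionError as err:
            if attempt == attempts:
                raise  # give up after the last attempt
            print(f"Attempt {attempt} failed ({err}); retrying in {wait_seconds}s")
            time.sleep(wait_seconds)
```

The `datasets` download path raises the built-in `ConnectionError` (as in the tracebacks above), so catching that exception covers the 403/429 cases reported here.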