Does this dataset contain all the data from Wikipedia?
#57 by TiamoLee · opened
For example, the `en` data is only 11 GB in Parquet format. Is this the latest updated data or the full dataset?
Yes, it is the full dataset; the files are smaller because of preprocessing.
I have a similar question, since the README says there is a "train" subset for each language. Is it just a sampling of Wikipedia pages, or should all of them be there? I need all the pages on medical subjects, and I am planning to filter the rest out for an application.
The dataset contains all the Wikipedia pages at the date of the dump.
Each date-language subset contains a single "train" split.
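Since each subset ships as a single "train" split, topic filtering has to happen on your side after loading. A minimal sketch of such a filter; the config name `20231101.en` in the comment and the keyword list are illustrative assumptions, not something prescribed by this dataset:

```python
# Hypothetical keyword filter for medical pages; the keyword list is a
# rough placeholder, not a vetted medical vocabulary.
MEDICAL_KEYWORDS = ("disease", "medicine", "medical", "syndrome", "therapy")

def is_medical(example):
    """Return True if the page text mentions any medical keyword."""
    text = example["text"].lower()
    return any(kw in text for kw in MEDICAL_KEYWORDS)

# With the Hugging Face `datasets` library you would stream the "train"
# split and keep only matching pages, for example:
#
#   from datasets import load_dataset
#   ds = load_dataset("wikimedia/wikipedia", "20231101.en",
#                     split="train", streaming=True)
#   medical_pages = (ex for ex in ds if is_medical(ex))

# Small in-memory demonstration of the predicate:
pages = [
    {"title": "Aspirin", "text": "Aspirin is a medicine used to reduce pain."},
    {"title": "Paris", "text": "Paris is the capital of France."},
]
kept = [p["title"] for p in pages if is_medical(p)]
print(kept)  # ['Aspirin']
```

Streaming avoids downloading the whole Parquet dump before filtering, at the cost of sequential access.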
albertvillanova changed discussion status to closed