Latest commit: Upload 254 files (5e9cd1d, verified). Every entry below was added in this commit.

PY3/ (directory)
(unnamed entry, 8.57 kB)

czech.pickle (1.27 MB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters
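
The "Detected Pickle imports" list attached to each file is the set of module and class names that the pickle stream asks the unpickler to import when the file is loaded. The scanner that produced these lists is not shown here, but a minimal standard-library sketch along the same lines can be run against a local copy of any of these files (the path below is a placeholder):

```python
import pickletools

def pickle_imports(path):
    """Collect the module.name globals referenced by a pickle file.

    Protocol 0-2 pickles, such as these Python 2 era Punkt models, reference
    classes through the GLOBAL and INST opcodes; pickletools.genops() reports
    their argument as the string "module name".
    """
    found = set()
    with open(path, "rb") as f:
        data = f.read()
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name in ("GLOBAL", "INST"):
            module, name = arg.split(" ", 1)
            found.add(f"{module}.{name}")
    return sorted(found)

# Placeholder path; point it at a downloaded copy of one of the files above.
for item in pickle_imports("czech.pickle"):
    print(item)
```

Run against czech.pickle, this should print the same nine names listed above.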

danish.pickle (1.26 MB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

dutch.pickle (743 kB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

english.pickle (433 kB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

estonian.pickle (1.6 MB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

finnish.pickle (1.95 MB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

french.pickle (583 kB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

german.pickle (1.53 MB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

greek.pickle (1.95 MB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

italian.pickle (658 kB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

malayalam.pickle (221 kB)
Detected Pickle imports (7):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- nltk.tokenize.punkt.PunktParameters

norwegian.pickle (1.26 MB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

polish.pickle (2.04 MB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

portuguese.pickle (649 kB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

russian.pickle (33 kB)
Detected Pickle imports (7):
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.long
- nltk.tokenize.punkt.PunktParameters
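
The __builtin__ and copy_reg names above (and __builtin__.long in russian.pickle) are Python 2 module names, which is a sign that these pickles were written under Python 2; Python 3's unpickler remaps them to builtins and copyreg automatically because its fix_imports option defaults to True. Loading a pickle still imports and calls whatever globals it references, so one common precaution is to accept only the classes that actually appear in these scan results. The sketch below assumes exactly that allow-list and a locally downloaded file; depending on how a given file was written, the Unpickler may also need an explicit encoding argument.

```python
import pickle

# Globals observed in this directory's scan results, spelled exactly as they
# appear in the pickle streams (i.e. with the Python 2 module names).
ALLOWED_GLOBALS = {
    ("__builtin__", "int"),
    ("__builtin__", "long"),
    ("__builtin__", "set"),
    ("__builtin__", "object"),
    ("collections", "defaultdict"),
    ("copy_reg", "_reconstructor"),
    ("nltk.tokenize.punkt", "PunktToken"),
    ("nltk.tokenize.punkt", "PunktLanguageVars"),
    ("nltk.tokenize.punkt", "PunktSentenceTokenizer"),
    ("nltk.tokenize.punkt", "PunktParameters"),
}

class PunktOnlyUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # find_class sees the names as stored in the stream; the base class
        # applies the Python 2 -> 3 renaming after this check passes.
        if (module, name) not in ALLOWED_GLOBALS:
            raise pickle.UnpicklingError(f"blocked global: {module}.{name}")
        return super().find_class(module, name)

def load_punkt(path):
    """Load a Punkt tokenizer pickle while refusing unexpected globals."""
    # Requires nltk to be installed so nltk.tokenize.punkt can be imported.
    with open(path, "rb") as f:
        return PunktOnlyUnpickler(f).load()

tokenizer = load_punkt("russian.pickle")  # placeholder local path
```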

slovene.pickle (833 kB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

spanish.pickle (598 kB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

swedish.pickle (1.03 MB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters

turkish.pickle (1.23 MB)
Detected Pickle imports (9):
- __builtin__.int
- nltk.tokenize.punkt.PunktToken
- collections.defaultdict
- __builtin__.set
- nltk.tokenize.punkt.PunktLanguageVars
- nltk.tokenize.punkt.PunktSentenceTokenizer
- __builtin__.object
- copy_reg._reconstructor
- nltk.tokenize.punkt.PunktParameters
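
These files are not usually unpickled by hand: they are the pre-trained Punkt sentence-tokenizer models that NLTK looks up by language name (the PY3 directory presumably holds the Python 3 builds of the same models). A minimal usage sketch, assuming an NLTK version that still loads the punkt pickles and that the data has been installed via nltk.download:

```python
import nltk
from nltk.tokenize import sent_tokenize

# Fetch the punkt models into the default nltk_data location; this is the
# usual way the files listed above end up on disk.
nltk.download("punkt")

# sent_tokenize selects the model by language, matching the file names above.
print(sent_tokenize("Det var en gång. Nu är det slut.", language="swedish"))

# The underlying object is the pickled PunktSentenceTokenizer itself.
tokenizer = nltk.data.load("tokenizers/punkt/english.pickle")
print(tokenizer.tokenize("Dr. Smith arrived at 5 p.m. He left early."))
```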