Dependency Versions
I logged my experience following the old ReadMe.md in the help-each-other Discord channel (04/22/23, 4:33 am) and noticed how much the dependency versions mattered while troubleshooting, so I wanted to start a conversation about any non-recommended versions people have gotten working. For me, I managed to get the following working:
-Torch at v2.0.0. On 1.13.0 the script "ran" but gave zero output because it never actually got past the import torch statement (a quick sanity check is sketched after this list).
-I uninstalled and reinstalled NumPy, Torch, Transformers, Tokenizers, Jax, Accelerate, SentencePiece, and ArgParse. Then I ran xor_codec.py, which worked perfectly!
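If you hit the silent-failure behavior on Torch 1.13.0, a minimal sanity check like the one below (my own sketch, not from the repo) will at least tell you whether torch imports and can run a trivial op before you blame the script:

```python
# Minimal sketch: confirm torch imports and is actually functional.
import torch

print(torch.__version__)   # 2.0.0 is the version that worked for me
x = torch.ones(2, 2)
print(x @ x)               # tiny matmul to confirm torch runs past the import
```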
The reinstalled dependency versions I used were (a quick way to verify your own is sketched after the list):
-Accelerate: 0.18.0
-ArgParse: 1.4.0
-Jax: 0.4.8
-NumPy: 1.24.2
-Tokenizers: 0.13.3
-Torch: 2.0.0
-Transformers (with llama): 4.28.0
-SentencePiece: 0.1.98
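To confirm what you actually ended up with after reinstalling, you can query the installed package metadata. This is just a minimal sketch; the names below are the PyPI distribution names for the packages listed above:

```python
# Minimal sketch: print the installed versions of the packages listed above.
from importlib.metadata import version, PackageNotFoundError

PACKAGES = [
    "accelerate",
    "argparse",
    "jax",
    "numpy",
    "tokenizers",
    "torch",
    "transformers",
    "sentencepiece",
]

for pkg in PACKAGES:
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        # argparse in particular may only be present as the stdlib module
        print(f"{pkg}: not installed as a distribution")
```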
I tried this on Windows with all the recommended requirements except Torch, where I used v2.0.0. The .bin files turned out correctly, but the config files did not. I did the same on Linux (all recommended requirements, except Torch 2.0.0) and everything turned out fine. I know it's essentially an unwritten rule that Linux is required for most of this ML work since that's what everyone develops on, but I figured I'd share my experience in case anyone else runs into this.
The only thing wrong on Windows when using the recommended versions is that the line endings are CRLF instead of LF for the following five files. Just use Notepad++ (Edit => EOL Conversion => LF) to convert them to LF line endings and the hashes will match.
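If you'd rather script the conversion than click through Notepad++, here is a minimal sketch. It assumes the five files are in the current directory (filenames taken from the hash lists below) and rewrites them in place:

```python
# Minimal sketch: convert CRLF line endings to LF in place for the affected files.
from pathlib import Path

FILES = [
    "config.json",
    "generation_config.json",
    "pytorch_model.bin.index.json",
    "special_tokens_map.json",
    "tokenizer_config.json",
]

for name in FILES:
    path = Path(name)
    data = path.read_bytes().replace(b"\r\n", b"\n")
    path.write_bytes(data)
    print(f"converted {name} to LF")
```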
CRLF line-endings (Windows):
0535e243e10a8ecf23bc49ed3590ed10 *./config.json
d200625bb1bcee9e13e1ae30d248b146 *./generation_config.json
65d69efafc2ed43ec0a5e9b5c3f922c7 *./pytorch_model.bin.index.json
6e047cb153d02be1fe3d3e48ff2db212 *./special_tokens_map.json
a00577426803c15aaf3ae8b864d1e9e6 *./tokenizer_config.json
LF line-endings (Linux):
598538f18fed1877b41f77de034c0c8a *./config.json
aee09e21813368c49baaece120125ae3 *./generation_config.json
fecfda4fba7bfd911e187a85db5fa2ef *./pytorch_model.bin.index.json
6b2e0a735969660e720c27061ef3f3d3 *./special_tokens_map.json
edd1a5897748864768b1fab645b31491 *./tokenizer_config.json
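After converting, you can check the files against the LF (Linux) hashes above without leaving Python. A minimal sketch, again assuming the files are in the current directory:

```python
# Minimal sketch: verify the converted files against the LF (Linux) MD5 hashes above.
import hashlib
from pathlib import Path

EXPECTED = {
    "config.json": "598538f18fed1877b41f77de034c0c8a",
    "generation_config.json": "aee09e21813368c49baaece120125ae3",
    "pytorch_model.bin.index.json": "fecfda4fba7bfd911e187a85db5fa2ef",
    "special_tokens_map.json": "6b2e0a735969660e720c27061ef3f3d3",
    "tokenizer_config.json": "edd1a5897748864768b1fab645b31491",
}

for name, expected in EXPECTED.items():
    actual = hashlib.md5(Path(name).read_bytes()).hexdigest()
    status = "OK" if actual == expected else f"MISMATCH (got {actual})"
    print(f"{name}: {status}")
```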