Commit 5e689a2 • 1 Parent(s): ae4b074

End of training
README.md CHANGED

@@ -1,11 +1,13 @@
 ---
 base_model: sanchit-gandhi/Mistral-7B-v0.1-6-layer
 tags:
+- alignment-handbook
+- generated_from_trainer
 - trl
 - sft
 - generated_from_trainer
 datasets:
--
+- HuggingFaceH4/ultrachat_200k
 model-index:
 - name: sanchit-gandhi/Mistral-7B-v0.1-6-layer
   results: []

@@ -16,7 +18,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 # sanchit-gandhi/Mistral-7B-v0.1-6-layer
 
-This model is a fine-tuned version of [sanchit-gandhi/Mistral-7B-v0.1-6-layer](https://huggingface.co/sanchit-gandhi/Mistral-7B-v0.1-6-layer) on the
+This model is a fine-tuned version of [sanchit-gandhi/Mistral-7B-v0.1-6-layer](https://huggingface.co/sanchit-gandhi/Mistral-7B-v0.1-6-layer) on the HuggingFaceH4/ultrachat_200k dataset.
 It achieves the following results on the evaluation set:
 - Loss: 2.1183
 
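The updated card names HuggingFaceH4/ultrachat_200k as the SFT dataset and reports an eval loss of 2.1183. As a quick orientation aid (not part of the commit itself), here is a minimal Python sketch that loads that public dataset and converts the reported loss to perplexity; the `train_sft` split and `messages` column are assumptions about the public dataset layout, not facts stated in this diff.

```python
# Minimal sketch: inspect the dataset named in the updated model card and turn
# the reported eval loss into perplexity. Split/column names are assumptions
# about the public HuggingFaceH4/ultrachat_200k dataset, not taken from this commit.
import math

from datasets import load_dataset

ds = load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft")
print(ds)                     # row count and column names
print(ds[0]["messages"][:2])  # first two chat turns of the first example

eval_loss = 2.1183            # value reported in the README above
print(f"perplexity ~ {math.exp(eval_loss):.2f}")  # ~ 8.32
```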
config.json CHANGED

@@ -21,6 +21,6 @@
 "tie_word_embeddings": false,
 "torch_dtype": "bfloat16",
 "transformers_version": "4.36.2",
-"use_cache":
+"use_cache": true,
 "vocab_size": 32000
 }
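The only functional change here switches `use_cache` back on in the saved config, the usual end-of-training step when the key/value cache was disabled during SFT (the disabling itself is not shown in this diff, so treat that as an inference). A minimal sketch of what the flag means at inference time, using a placeholder repo id rather than one taken from this commit:

```python
# Sketch: with use_cache=True, generate() reuses cached key/value states instead of
# re-running attention over the whole prefix at every decoding step.
# "your-username/distil-zephyr-1.5b-ssft" is a placeholder repo id (an assumption).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/distil-zephyr-1.5b-ssft"  # placeholder, not from this commit
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)
print(model.config.use_cache)  # True, per the updated config.json

inputs = tokenizer("The capital of France is", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=16, use_cache=True)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```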
wandb/debug-internal.log CHANGED

@@ -4501,3 +4501,9 @@
 2024-02-01 19:26:12,253 DEBUG HandlerThread:239784 [handler.py:handle_request():146] handle_request: stop_status
 2024-02-01 19:26:12,253 DEBUG SenderThread:239784 [sender.py:send_request():409] send_request: stop_status
 2024-02-01 19:26:13,626 INFO Thread-12 :239784 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft/wandb/run-20240201_175850-i93q0p12/files/output.log
+2024-02-01 19:26:15,629 INFO Thread-12 :239784 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft/wandb/run-20240201_175850-i93q0p12/files/output.log
+2024-02-01 19:26:16,138 DEBUG HandlerThread:239784 [handler.py:handle_request():146] handle_request: status_report
+2024-02-01 19:26:17,631 INFO Thread-12 :239784 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft/wandb/run-20240201_175850-i93q0p12/files/output.log
+2024-02-01 19:26:21,019 DEBUG SenderThread:239784 [sender.py:send():382] send: stats
+2024-02-01 19:26:21,637 INFO Thread-12 :239784 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft/wandb/run-20240201_175850-i93q0p12/files/output.log
+2024-02-01 19:26:22,020 DEBUG HandlerThread:239784 [handler.py:handle_request():146] handle_request: status_report
wandb/run-20240201_175850-i93q0p12/files/output.log CHANGED

@@ -1731,3 +1731,19 @@ Training completed. Do not forget to share your model on huggingface.co/models =
 [INFO|tokenization_utils_base.py:2432] 2024-02-01 19:26:11,031 >> tokenizer config file saved in ./tokenizer_config.json
 [INFO|tokenization_utils_base.py:2441] 2024-02-01 19:26:11,033 >> Special tokens file saved in ./special_tokens_map.json
 [INFO|modelcard.py:452] 2024-02-01 19:26:11,224 >> Dropping the following result as it does not have all the necessary fields:
+{'task': {'name': 'Causal Language Modeling', 'type': 'text-generation'}, 'dataset': {'name': 'generator', 'type': 'generator', 'config': 'default', 'split': 'train', 'args': 'default'}}
+events.out.tfevents.1706815561.ip-26-0-165-24.239318.1: 100%|██████████████████████████| 359/359 [00:00<00:00, 4.01kB/s]
+run-i93q0p12.wandb: 100%|██████████████████████████████████████████████████████████| 1.57M/1.57M [00:00<00:00, 11.3MB/s]
+Upload 2 LFS files: 100%|██████████████████████████████████████████████████████████████████| 2/2 [00:00<00:00, 6.81it/s]
+[INFO|modelcard.py:452] 2024-02-01 19:26:15,132 >> Dropping the following result as it does not have all the necessary fields:
+{'task': {'name': 'Causal Language Modeling', 'type': 'text-generation'}, 'dataset': {'name': 'HuggingFaceH4/ultrachat_200k', 'type': 'HuggingFaceH4/ultrachat_200k', 'config': 'default', 'split': 'train', 'args': 'default'}}
+[INFO|configuration_utils.py:483] 2024-02-01 19:26:15,137 >> Configuration saved in ./config.json
+[INFO|trainer.py:2889] 2024-02-01 19:26:16,239 >> Saving model checkpoint to ./
+[INFO|configuration_utils.py:483] 2024-02-01 19:26:16,242 >> Configuration saved in ./config.json
+[INFO|configuration_utils.py:594] 2024-02-01 19:26:16,244 >> Configuration saved in ./generation_config.json
+2024-02-01 19:26:14 - INFO - __main__ - Model saved to ./
+2024-02-01 19:26:15 - INFO - __main__ - Pushing to hub...
+[INFO|modeling_utils.py:2382] 2024-02-01 19:26:19,933 >> Model weights saved in ./pytorch_model.bin
+[INFO|tokenization_utils_base.py:2432] 2024-02-01 19:26:19,936 >> tokenizer config file saved in ./tokenizer_config.json
+[INFO|tokenization_utils_base.py:2441] 2024-02-01 19:26:19,938 >> Special tokens file saved in ./special_tokens_map.json
+[INFO|modelcard.py:452] 2024-02-01 19:26:20,002 >> Dropping the following result as it does not have all the necessary fields:
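The repeated `modelcard.py` message means the auto-generated evaluation results were left out of the model card, most likely because the printed dicts carry a task and a dataset but no metrics list (that reading of "necessary fields" is an inference, not something the log states). A hypothetical complete entry might look roughly like this; the metric name/type are illustrative assumptions, only the loss value comes from the README above.

```python
# Hypothetical example of a result entry that would carry full task/dataset/metrics
# information, in contrast to the partial dicts logged above.
result = {
    "task": {"name": "Causal Language Modeling", "type": "text-generation"},
    "dataset": {
        "name": "HuggingFaceH4/ultrachat_200k",
        "type": "HuggingFaceH4/ultrachat_200k",
        "config": "default",
        "split": "train",
        "args": "default",
    },
    # Illustrative metrics list; name/type are assumptions, the value is the
    # eval loss reported in the updated README.
    "metrics": [{"name": "Loss", "type": "loss", "value": 2.1183}],
}
print(result["metrics"][0]["value"])  # 2.1183
```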
wandb/run-20240201_175850-i93q0p12/logs/debug-internal.log CHANGED

@@ -4501,3 +4501,9 @@
 2024-02-01 19:26:12,253 DEBUG HandlerThread:239784 [handler.py:handle_request():146] handle_request: stop_status
 2024-02-01 19:26:12,253 DEBUG SenderThread:239784 [sender.py:send_request():409] send_request: stop_status
 2024-02-01 19:26:13,626 INFO Thread-12 :239784 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft/wandb/run-20240201_175850-i93q0p12/files/output.log
+2024-02-01 19:26:15,629 INFO Thread-12 :239784 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft/wandb/run-20240201_175850-i93q0p12/files/output.log
+2024-02-01 19:26:16,138 DEBUG HandlerThread:239784 [handler.py:handle_request():146] handle_request: status_report
+2024-02-01 19:26:17,631 INFO Thread-12 :239784 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft/wandb/run-20240201_175850-i93q0p12/files/output.log
+2024-02-01 19:26:21,019 DEBUG SenderThread:239784 [sender.py:send():382] send: stats
+2024-02-01 19:26:21,637 INFO Thread-12 :239784 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft/wandb/run-20240201_175850-i93q0p12/files/output.log
+2024-02-01 19:26:22,020 DEBUG HandlerThread:239784 [handler.py:handle_request():146] handle_request: status_report
wandb/run-20240201_175850-i93q0p12/run-i93q0p12.wandb CHANGED

@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:861cad702742507ee3bc6bdf7695a0c667fb97b99178e244dc422c91a63feac2
+size 1608059
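The `.wandb` run file is stored through Git LFS, so the repo tracks only this small pointer (spec version, `oid sha256:...`, `size`). If you ever need to check that a downloaded copy matches the pointer, here is a minimal sketch; the local filename is an assumption about where the artifact was saved.

```python
# Sketch: verify a downloaded LFS object against the pointer recorded in this commit.
import hashlib
from pathlib import Path

path = Path("run-i93q0p12.wandb")  # assumed local download location
expected_oid = "861cad702742507ee3bc6bdf7695a0c667fb97b99178e244dc422c91a63feac2"
expected_size = 1608059            # bytes, from the pointer's "size" line

data = path.read_bytes()
assert len(data) == expected_size, "size mismatch with the LFS pointer"
assert hashlib.sha256(data).hexdigest() == expected_oid, "sha256 mismatch with the LFS pointer"
print("LFS object matches the pointer")
```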