|
# Train |
|
|
|
## Environment |
|
|
|
```bash
cd scripts
python -m venv venv
source venv/bin/activate
pip install -U -r requirements.in
```
|
|
|
## Train Tokenizer |
|
|
|
```bash
time python -B train_tokenizer.py
```
|
|
|
Tokenizer training log: |
|
```
Resolving data files: 100%|██████████| 132/132 [00:00<00:00, 400.12it/s]
Loading dataset shards: 100%|██████████| 18/18 [00:00<00:00, 430.44it/s]
Resolving data files: 100%|██████████| 133/133 [00:00<00:00, 306506.83it/s]
[00:21:54] Pre-processing sequences ██████████ 0 / 0
[00:00:48] Tokenize words           ██████████ 25635525 / 25635525
[00:01:13] Count pairs              ██████████ 25635525 / 25635525
[00:06:35] Compute merges           ██████████ 48450 / 48450

________________________________________________________
Executed in   31.06 mins    fish           external
   usr time  359.27 mins  883.00 micros  359.27 mins
   sys time    6.64 mins    0.00 micros    6.64 mins
``` |
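
The `Tokenize words`, `Count pairs`, and `Compute merges` phases in the log are the usual stages of BPE training. For reference, a minimal sketch of such a script using the Hugging Face `tokenizers` library is shown below; the corpus location, vocabulary size, and special tokens are placeholders, and the actual `train_tokenizer.py` may be set up differently.

```python
# Minimal sketch of BPE tokenizer training with Hugging Face `tokenizers`.
# The corpus path, vocab size, and special tokens below are assumptions,
# not the settings used by train_tokenizer.py.
from datasets import load_dataset
from tokenizers import Tokenizer, models, pre_tokenizers, trainers

# Hypothetical corpus: any iterable of raw text works here.
dataset = load_dataset("text", data_files={"train": "corpus/*.txt"}, split="train")

tokenizer = Tokenizer(models.BPE())
tokenizer.pre_tokenizer = pre_tokenizers.ByteLevel(add_prefix_space=False)

trainer = trainers.BpeTrainer(
    vocab_size=32768,                                      # assumed value
    special_tokens=["<|endoftext|>"],                      # assumed special tokens
    initial_alphabet=pre_tokenizers.ByteLevel.alphabet(),
)

def batch_iterator(batch_size=1000):
    # Feed the corpus in batches so the whole dataset never sits in memory at once.
    for i in range(0, len(dataset), batch_size):
        yield dataset[i : i + batch_size]["text"]

tokenizer.train_from_iterator(batch_iterator(), trainer=trainer, length=len(dataset))
tokenizer.save("tokenizer.json")
```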
|
|
|
## Pretrain |
|
|
|
```bash
python -B prepare_pretrain_dataset.py
```
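
One common way to prepare data for `litgpt pretrain` is to pre-tokenize the corpus into streaming chunks with `litdata`. The sketch below shows that pattern, reusing the tokenizer trained above; the input glob, output directory, chunk size, and worker count are assumptions, and the real `prepare_pretrain_dataset.py` may organize this differently.

```python
# Sketch: pre-tokenize a raw text corpus into litdata chunks for pretraining.
from functools import partial
from glob import glob

import numpy as np
from litdata import optimize, TokensLoader
from tokenizers import Tokenizer


def tokenize_fn(filepath, tokenizer=None):
    # Tokenize one raw text file and yield its token ids as a single array.
    with open(filepath, encoding="utf-8") as f:
        text = f.read()
    # uint16 is sufficient for a vocabulary below 65k tokens.
    yield np.asarray(tokenizer.encode(text).ids, dtype=np.uint16)


if __name__ == "__main__":
    tokenizer = Tokenizer.from_file("tokenizer.json")
    optimize(
        fn=partial(tokenize_fn, tokenizer=tokenizer),
        inputs=sorted(glob("data/raw/*.txt")),   # hypothetical corpus location
        output_dir="data/pretrain",              # hypothetical output location
        chunk_size=(2049 * 8012),                # tokens per chunk (assumed)
        item_loader=TokensLoader(),
        num_workers=8,
    )
```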
|
|
|
```bash
CUDA_VISIBLE_DEVICES=0 CUDA_LAUNCH_BLOCKING=0 PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True litgpt pretrain --config pretrain-model.yaml
```
|
|
|
## Chat with the Pretrained Model
|
|
|
```bash
PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True CUDA_VISIBLE_DEVICES="0" litgpt chat out/pretrain/final/
```
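
The same checkpoint can also be queried from Python through litgpt's `LLM` API; a quick smoke test might look like the sketch below (the prompt and generation length are arbitrary).

```python
# Quick generation smoke test against the pretrained checkpoint.
from litgpt import LLM

llm = LLM.load("out/pretrain/final/")
print(llm.generate("Once upon a time", max_new_tokens=64))
```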
|
|
|
## Model |
|
|
|
### Pretraining |
|
|
|
```bash
# Pretrain, then convert the LitGPT checkpoint to a Hugging Face-compatible state dict.
litgpt pretrain --config ./pretrain-model.yaml
litgpt convert_from_litgpt out/pretrain/final/ out/converted_pretrain
cp config.json out/pretrain/final/
cp config.json out/converted_pretrain/
```
|
|
|
```python
import torch
from safetensors.torch import save_file

# Repackage the converted checkpoint as a single safetensors file.
state_dict = torch.load('out/converted_pretrain/model.pth', map_location='cpu')
save_file(state_dict, 'out/converted_pretrain/model.safetensors')
```
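
To verify the export, the converted directory can be loaded with Hugging Face Transformers, assuming the `config.json` copied above describes a supported architecture and the tokenizer files are also present in that directory; this check is not part of the pipeline itself.

```python
# Sanity check: load the converted checkpoint with Transformers and generate a few tokens.
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("out/converted_pretrain/")
tokenizer = AutoTokenizer.from_pretrained("out/converted_pretrain/")  # requires tokenizer files in this directory

inputs = tokenizer("Once upon a time", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```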
|
|
|
### Continued Pretraining |
|
|
|
```bash
# Convert the final training checkpoint into a form usable as the starting point
# for continued pretraining.
litgpt convert_pretrained_checkpoint out/pretrain/final/ out/pretrain_checkpoint/final/
cp config.json out/pretrain_checkpoint/final/

# Continue pretraining, then convert the result to a Hugging Face-compatible state dict.
litgpt pretrain --config ./contrain-model.yaml
litgpt convert_from_litgpt out/contrain/final/ out/converted_contrain
cp config.json out/converted_contrain/
```
|
|
|
```python
import torch
from safetensors.torch import save_file

# Repackage the converted checkpoint as a single safetensors file.
state_dict = torch.load('out/converted_contrain/model.pth', map_location='cpu')
save_file(state_dict, 'out/converted_contrain/model.safetensors')
```
|
|
|
```bash
cp out/converted_contrain/model.pth ./
cp out/converted_contrain/model.safetensors ./
```
|
|