Loading pytorch-gpu/py3/2.1.1
Loading requirement: cuda/11.8.0 nccl/2.18.5-1-cuda cudnn/8.7.0.84-cuda
gcc/8.5.0 openmpi/4.1.5-cuda intel-mkl/2020.4 magma/2.7.1-cuda sox/14.4.2
sparsehash/2.0.3 libjpeg-turbo/2.1.3 ffmpeg/4.4.4
+ HF_DATASETS_OFFLINE=1
+ TRANSFORMERS_OFFLINE=1
+ python3 deberta_training_multi.py
train:
DatasetInfo(description='', citation='', homepage='', license='', features={'metadata': Value(dtype='string', id=None), 'text': Value(dtype='string', id=None), '1_legislation': Value(dtype='int64', id=None), '10_journaux': Value(dtype='int64', id=None), '12_presentations': Value(dtype='int64', id=None), '13_lettres': Value(dtype='int64', id=None), '2_rapport_evaluation': Value(dtype='int64', id=None), '3_rapport_comptes': Value(dtype='int64', id=None), '4_rapport_activite': Value(dtype='int64', id=None), '5_rapport_risque': Value(dtype='int64', id=None), '6_plan': Value(dtype='int64', id=None), '7_charte': Value(dtype='int64', id=None), '__index_level_0__': Value(dtype='int64', id=None)}, post_processed=None, supervised_keys=None, task_templates=None, builder_name=None, dataset_name=None, config_name=None, version=None, splits=None, download_checksums=None, download_size=None, post_processing_size=None, dataset_size=None, size_in_bytes=None)
test:
DatasetInfo(description='', citation='', homepage='', license='', features={'metadata': Value(dtype='string', id=None), 'text': Value(dtype='string', id=None), '1_legislation': Value(dtype='int64', id=None), '10_journaux': Value(dtype='int64', id=None), '12_presentations': Value(dtype='int64', id=None), '13_lettres': Value(dtype='int64', id=None), '2_rapport_evaluation': Value(dtype='int64', id=None), '3_rapport_comptes': Value(dtype='int64', id=None), '4_rapport_activite': Value(dtype='int64', id=None), '5_rapport_risque': Value(dtype='int64', id=None), '6_plan': Value(dtype='int64', id=None), '7_charte': Value(dtype='int64', id=None), '__index_level_0__': Value(dtype='int64', id=None)}, post_processed=None, supervised_keys=None, task_templates=None, builder_name=None, dataset_name=None, config_name=None, version=None, splits=None, download_checksums=None, download_size=None, post_processing_size=None, dataset_size=None, size_in_bytes=None)
validation:
DatasetInfo(description='', citation='', homepage='', license='', features={'metadata': Value(dtype='string', id=None), 'text': Value(dtype='string', id=None), '1_legislation': Value(dtype='int64', id=None), '10_journaux': Value(dtype='int64', id=None), '12_presentations': Value(dtype='int64', id=None), '13_lettres': Value(dtype='int64', id=None), '2_rapport_evaluation': Value(dtype='int64', id=None), '3_rapport_comptes': Value(dtype='int64', id=None), '4_rapport_activite': Value(dtype='int64', id=None), '5_rapport_risque': Value(dtype='int64', id=None), '6_plan': Value(dtype='int64', id=None), '7_charte': Value(dtype='int64', id=None), '__index_level_0__': Value(dtype='int64', id=None)}, post_processed=None, supervised_keys=None, task_templates=None, builder_name=None, dataset_name=None, config_name=None, version=None, splits=None, download_checksums=None, download_size=None, post_processing_size=None, dataset_size=None, size_in_bytes=None)
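Note that every split still carries an `__index_level_0__` column, the index artifact that `datasets.Dataset.from_pandas` leaves behind; as the parameter dump further down shows (`id2label` entry "10"), it ends up being treated as an eleventh label. A minimal cleanup sketch, assuming `dataset` is the `DatasetDict` printed above:

    # Hypothetical fix: drop the pandas index artifact before tokenization,
    # so it is not picked up as a label column.
    dataset = dataset.remove_columns("__index_level_0__")

Alternatively, building the splits with `Dataset.from_pandas(df, preserve_index=False)` avoids creating the column in the first place.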
/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/transformers/convert_slow_tokenizer.py:515: UserWarning: The sentencepiece tokenizer that you are converting to a fast tokenizer uses the byte fallback option which is not implemented in the fast tokenizers. In practice this means that the fast version of the tokenizer can produce unknown tokens whereas the sentencepiece version would have converted these unknown tokens into a sequence of byte tokens matching the original piece of text.
warnings.warn(
Map:   0%|          | 0/751 [00:00<?, ? examples/s]
Map: 100%|██████████| 751/751 [00:00<00:00, 2330.00 examples/s]
Map: 100%|██████████| 751/751 [00:00<00:00, 2298.12 examples/s]
Map:   0%|          | 0/67 [00:00<?, ? examples/s]
Map: 100%|██████████| 67/67 [00:00<00:00, 3193.39 examples/s]
Map:   0%|          | 0/66 [00:00<?, ? examples/s]
Map: 100%|██████████| 66/66 [00:00<00:00, 3290.16 examples/s]
/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/torch/_utils.py:831: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
return self.fget.__get__(instance, owner)()
Some weights of DebertaV2ForSequenceClassification were not initialized from the model checkpoint at deberta-large and are newly initialized: ['classifier.bias', 'classifier.weight', 'pooler.dense.bias', 'pooler.dense.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
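The two warnings above are expected when fine-tuning: the `deberta-large` checkpoint (loaded from a local directory in this offline run) ships no sequence-classification head, so `classifier.*` and `pooler.dense.*` are freshly initialized and the warning is benign as long as the model is trained before use. A sketch of the model setup implied by the parameter dump below (`problem_type`, eleven labels); the exact call in `deberta_training_multi.py` is an assumption:

    from transformers import AutoModelForSequenceClassification

    # problem_type="multi_label_classification" makes the Trainer use
    # BCEWithLogitsLoss, i.e. an independent sigmoid per label.
    model = AutoModelForSequenceClassification.from_pretrained(
        "deberta-large",   # local checkpoint directory (offline run)
        num_labels=11,     # 10 document classes + the stray index column
        problem_type="multi_label_classification",
    )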
comet_ml is installed but `COMET_API_KEY` is not set.
Using the `WANDB_DISABLED` environment variable is deprecated and will be removed in v5. Use the --report_to flag to control the integrations used for logging result (for instance --report_to none).
Detected kernel version 4.18.0, which is below the recommended minimum of 5.5.0; this can cause the process to hang. It is recommended to upgrade the kernel to the minimum version or higher.
2024/05/06 14:31:40 WARNING mlflow.utils.git_utils: Failed to import Git (the Git executable is probably not on your PATH), so Git SHA is not available. Error: Failed to initialize: Bad git executable.
The git executable must be specified in one of the following ways:
- be included in your $PATH
- be set via $GIT_PYTHON_GIT_EXECUTABLE
- explicitly set via git.refresh()
All git commands will error until this is rectified.
This initial warning can be silenced or aggravated in the future by setting the
$GIT_PYTHON_REFRESH environment variable. Use one of the following values:
- quiet|q|silence|s|none|n|0: for no warning or exception
- warn|w|warning|1: for a printed warning
- error|e|raise|r|2: for a raised exception
Example:
export GIT_PYTHON_REFRESH=quiet
COMET WARNING: Can not parse empty Comet API key
COMET INFO: No Comet API Key was found, creating an OfflineExperiment. Set up your API Key to get the full Comet experience https://www.comet.com/docs/python-sdk/advanced/#python-configuration
COMET WARNING: Can not parse empty Comet API key
COMET WARNING: To get all data logged automatically, import comet_ml before the following modules: sklearn, torch.
COMET INFO: Using '/gpfsdswork/projects/rech/fmr/uft12cr/classification/.cometml-runs' path as offline directory. Pass 'offline_directory' parameter into constructor or set the 'COMET_OFFLINE_DIRECTORY' environment variable to manually choose where to store offline experiment archives.
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
- Avoid using `tokenizers` before the fork if possible
- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
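This warning fires because the process forked (here most likely via the logging integrations, since `dataloader_num_workers` is 0) after the tokenizer's Rust-side parallelism had already been used. It is harmless and can be silenced by setting the environment variable before the tokenizer is first touched, e.g. at the top of the script:

    import os

    # Must run before the first tokenizer call; "false" disables
    # Rust-side parallelism and with it the fork warning.
    os.environ["TOKENIZERS_PARALLELISM"] = "false"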
COMET INFO: Couldn't find a Git repository in '/gpfsdswork/projects/rech/fmr/uft12cr/classification' nor in any parent directory. Set `COMET_GIT_DIRECTORY` if your Git Repository is elsewhere.
  0%|          | 0/376 [00:00<?, ?it/s]
  0%|          | 1/376 [00:02<16:53,  2.70s/it]
  1%|          | 2/376 [00:02<07:32,  1.21s/it]
  1%|          | 3/376 [00:03<04:36,  1.35it/s]
[... per-step tqdm frames elided; the rate ramps up and then holds steady at ~5.36 it/s ...]
 25%|███       | 94/376 [00:19<00:51,  5.51it/s]
  0%|          | 0/9 [00:00<?, ?it/s]
 33%|████      | 3/9 [00:00<00:00, 28.07it/s]
 67%|███████   | 6/9 [00:00<00:00, 21.69it/s]
100%|██████████| 9/9 [00:00<00:00, 22.59it/s]
Traceback (most recent call last):
File "/gpfsdswork/projects/rech/fmr/uft12cr/classification/deberta_training_multi.py", line 138, in <module>
trainer.train()
File "/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/transformers/trainer.py", line 1561, in train
return inner_training_loop(
^^^^^^^^^^^^^^^^^^^^
File "/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/transformers/trainer.py", line 1968, in _inner_training_loop
self._maybe_log_save_evaluate(tr_loss, model, trial, epoch, ignore_keys_for_eval)
File "/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/transformers/trainer.py", line 2329, in _maybe_log_save_evaluate
metrics = self.evaluate(ignore_keys=ignore_keys_for_eval)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/transformers/trainer.py", line 3136, in evaluate
output = eval_loop(
^^^^^^^^^^
File "/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/transformers/trainer.py", line 3427, in evaluation_loop
metrics = self.compute_metrics(EvalPrediction(predictions=all_preds, label_ids=all_labels))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/gpfsdswork/projects/rech/fmr/uft12cr/classification/deberta_training_multi.py", line 124, in compute_metrics
result = multi_label_metrics(
^^^^^^^^^^^^^^^^^^^^
File "/gpfsdswork/projects/rech/fmr/uft12cr/classification/deberta_training_multi.py", line 102, in multi_label_metrics
metrics_per_label = precision_recall_fscore_support(labels, y_pred, average=None)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
NameError: name 'precision_recall_fscore_support' is not defined
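The root cause is plain from the last frame: `deberta_training_multi.py` calls sklearn's `precision_recall_fscore_support` without importing it, so the run dies at the first end-of-epoch evaluation (step 94 of 376, i.e. one of four epochs with `evaluation_strategy: epoch`). A sketch of the fix; the function name and line numbers come from the traceback, while the body of `multi_label_metrics` is a reconstruction and therefore an assumption:

    import torch
    # The missing import behind the NameError at line 102:
    from sklearn.metrics import f1_score, precision_recall_fscore_support

    def multi_label_metrics(predictions, labels, threshold=0.5):
        # Logits -> independent per-label probabilities, then hard 0/1 decisions.
        probs = torch.sigmoid(torch.tensor(predictions)).numpy()
        y_pred = (probs >= threshold).astype(int)
        precision, recall, f1, _ = precision_recall_fscore_support(
            labels, y_pred, average=None, zero_division=0
        )
        return {
            "f1": f1_score(labels, y_pred, average="micro"),  # metric_for_best_model
            "precision_per_label": precision.tolist(),
            "recall_per_label": recall.tolist(),
            "f1_per_label": f1.tolist(),
        }

With the import in place, evaluation completes and training can proceed past the first epoch.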
100%|██████████| 9/9 [00:11<00:00, 22.59it/s]
COMET INFO: ---------------------------------------------------------------------------------------
COMET INFO: Comet.ml OfflineExperiment Summary
COMET INFO: ---------------------------------------------------------------------------------------
COMET INFO: Data:
COMET INFO: display_summary_level : 1
COMET INFO: name : deberta-classification-dila
COMET INFO: url : [OfflineExperiment will get URL after upload]
COMET INFO: Others:
COMET INFO: Created from : MLFlow auto-logger
COMET INFO: Name : deberta-classification-dila
COMET INFO: offline_experiment : True
COMET INFO: Parameters:
COMET INFO: _name_or_path : deberta-large
COMET INFO: adafactor : False
COMET INFO: adam_beta1 : 0.9
COMET INFO: adam_beta2 : 0.999
COMET INFO: adam_epsilon : 1e-08
COMET INFO: add_cross_attention : False
COMET INFO: architectures : None
COMET INFO: attention_probs_dropout_prob : 0.1
COMET INFO: auto_find_batch_size : False
COMET INFO: bad_words_ids : None
COMET INFO: begin_suppress_tokens : None
COMET INFO: bf16 : False
COMET INFO: bf16_full_eval : False
COMET INFO: bos_token_id : None
COMET INFO: chunk_size_feed_forward : 0
COMET INFO: cross_attention_hidden_size : None
COMET INFO: data_seed : None
COMET INFO: dataloader_drop_last : False
COMET INFO: dataloader_num_workers : 0
COMET INFO: dataloader_persistent_workers : False
COMET INFO: dataloader_pin_memory : True
COMET INFO: dataloader_prefetch_factor : None
COMET INFO: ddp_backend : None
COMET INFO: ddp_broadcast_buffers : None
COMET INFO: ddp_bucket_cap_mb : None
COMET INFO: ddp_find_unused_parameters : None
COMET INFO: ddp_timeout : 1800
COMET INFO: debug : []
COMET INFO: decoder_start_token_id : None
COMET INFO: deepspeed : None
COMET INFO: disable_tqdm : False
COMET INFO: dispatch_batches : None
COMET INFO: diversity_penalty : 0.0
COMET INFO: do_eval : True
COMET INFO: do_predict : False
COMET INFO: do_sample : False
COMET INFO: do_train : False
COMET INFO: early_stopping : False
COMET INFO: encoder_no_repeat_ngram_size : 0
COMET INFO: eos_token_id : None
COMET INFO: eval_accumulation_steps : None
COMET INFO: eval_delay : 0
COMET INFO: eval_steps : None
COMET INFO: evaluation_strategy : epoch
COMET INFO: exponential_decay_length_penalty : None
COMET INFO: finetuning_task : None
COMET INFO: forced_bos_token_id : None
COMET INFO: forced_eos_token_id : None
COMET INFO: fp16 : False
COMET INFO: fp16_backend : auto
COMET INFO: fp16_full_eval : False
COMET INFO: fp16_opt_level : O1
COMET INFO: fsdp : []
COMET INFO: fsdp_config : {"min_num_params": 0, "xla": false, "xla_fsdp_grad_ckpt": false}
COMET INFO: fsdp_min_num_params : 0
COMET INFO: fsdp_transformer_layer_cls_to_wrap : None
COMET INFO: full_determinism : False
COMET INFO: gradient_accumulation_steps : 1
COMET INFO: gradient_checkpointing : False
COMET INFO: gradient_checkpointing_kwargs : None
COMET INFO: greater_is_better : True
COMET INFO: group_by_length : False
COMET INFO: half_precision_backend : auto
COMET INFO: hidden_act : gelu
COMET INFO: hidden_dropout_prob : 0.1
COMET INFO: hidden_size : 768
COMET INFO: hub_always_push : False
COMET INFO: hub_model_id : None
COMET INFO: hub_private_repo : False
COMET INFO: hub_strategy : every_save
COMET INFO: hub_token : <HUB_TOKEN>
COMET INFO: id2label : {"0": "1_legislation", "1": "10_journaux", "2": "12_presentations", "3": "13_lettres", "4": "2_rapport_evaluation", "5": "3_rapport_comptes", "6": "4_rapport_activite", "7": "5_rapport_risque", "8": "6_plan", "9": "7_charte", "10": "__index_level_0__"}
COMET INFO: ignore_data_skip : False
COMET INFO: include_inputs_for_metrics : False
COMET INFO: include_num_input_tokens_seen : False
COMET INFO: include_tokens_per_second : False
COMET INFO: initializer_range : 0.02
COMET INFO: intermediate_size : 3072
COMET INFO: is_decoder : False
COMET INFO: is_encoder_decoder : False
COMET INFO: jit_mode_eval : False
COMET INFO: label2id : {"10_journaux": 1, "12_presentations": 2, "13_lettres": 3, "1_legislation": 0, "2_rapport_evaluation": 4, "3_rapport_comptes": 5, "4_rapport_activite": 6, "5_rapport_risque": 7, "6_plan": 8, "7_charte": 9, "__index_level_0__": 10}
COMET INFO: label_names : None
COMET INFO: label_smoothing_factor : 0.0
COMET INFO: layer_norm_eps : 1e-07
COMET INFO: learning_rate : 1e-05
COMET INFO: length_column_name : length
COMET INFO: length_penalty : 1.0
COMET INFO: load_best_model_at_end : True
COMET INFO: local_rank : 0
COMET INFO: log_level : passive
COMET INFO: log_level_replica : warning
COMET INFO: log_on_each_node : True
COMET INFO: logging_dir : deberta-classification-dila/runs/May06_14-31-32_r6i2n2
COMET INFO: logging_first_step : False
COMET INFO: logging_nan_inf_filter : True
COMET INFO: logging_steps : 500
COMET INFO: logging_strategy : steps
COMET INFO: lr_scheduler_kwargs : {}
COMET INFO: lr_scheduler_type : linear
COMET INFO: max_grad_norm : 1.0
COMET INFO: max_length : 20
COMET INFO: max_position_embeddings : 512
COMET INFO: max_relative_positions : -1
COMET INFO: max_steps : -1
COMET INFO: metric_for_best_model : f1
COMET INFO: min_length : 0
COMET INFO: model_type : deberta-v2
COMET INFO: mp_parameters :
COMET INFO: neftune_noise_alpha : None
COMET INFO: no_cuda : False
COMET INFO: no_repeat_ngram_size : 0
COMET INFO: norm_rel_ebd : layer_norm
COMET INFO: num_attention_heads : 12
COMET INFO: num_beam_groups : 1
COMET INFO: num_beams : 1
COMET INFO: num_hidden_layers : 12
COMET INFO: num_return_sequences : 1
COMET INFO: num_train_epochs : 4
COMET INFO: optim : adamw_torch
COMET INFO: optim_args : None
COMET INFO: output_attentions : False
COMET INFO: output_dir : deberta-classification-dila
COMET INFO: output_hidden_states : False
COMET INFO: output_scores : False
COMET INFO: overwrite_output_dir : False
COMET INFO: pad_token_id : 0
COMET INFO: past_index : -1
COMET INFO: per_device_eval_batch_size : 8
COMET INFO: per_device_train_batch_size : 8
COMET INFO: per_gpu_eval_batch_size : None
COMET INFO: per_gpu_train_batch_size : None
COMET INFO: pooler_dropout : 0
COMET INFO: pooler_hidden_act : gelu
COMET INFO: pooler_hidden_size : 768
COMET INFO: pos_att_type : ['p2c', 'c2p']
COMET INFO: position_biased_input : False
COMET INFO: position_buckets : 256
COMET INFO: prediction_loss_only : False
COMET INFO: prefix : None
COMET INFO: problem_type : multi_label_classification
COMET INFO: pruned_heads : {}
COMET INFO: push_to_hub : False
COMET INFO: push_to_hub_model_id : None
COMET INFO: push_to_hub_organization : None
COMET INFO: push_to_hub_token : <PUSH_TO_HUB_TOKEN>
COMET INFO: ray_scope : last
COMET INFO: relative_attention : True
COMET INFO: remove_invalid_values : False
COMET INFO: remove_unused_columns : True
COMET INFO: repetition_penalty : 1.0
COMET INFO: report_to : ['mlflow', 'tensorboard']
COMET INFO: resume_from_checkpoint : None
COMET INFO: return_dict : True
COMET INFO: return_dict_in_generate : False
COMET INFO: run_name : deberta-classification-dila
COMET INFO: save_on_each_node : False
COMET INFO: save_only_model : False
COMET INFO: save_safetensors : True
COMET INFO: save_steps : 500
COMET INFO: save_strategy : epoch
COMET INFO: save_total_limit : None
COMET INFO: seed : 42
COMET INFO: sep_token_id : None
COMET INFO: share_att_key : True
COMET INFO: skip_memory_metrics : True
COMET INFO: split_batches : False
COMET INFO: suppress_tokens : None
COMET INFO: task_specific_params : None
COMET INFO: temperature : 1.0
COMET INFO: tf32 : None
COMET INFO: tf_legacy_loss : False
COMET INFO: tie_encoder_decoder : False
COMET INFO: tie_word_embeddings : True
COMET INFO: tokenizer_class : None
COMET INFO: top_k : 50
COMET INFO: top_p : 1.0
COMET INFO: torch_compile : False
COMET INFO: torch_compile_backend : None
COMET INFO: torch_compile_mode : None
COMET INFO: torch_dtype : None
COMET INFO: torchdynamo : None
COMET INFO: torchscript : False
COMET INFO: tpu_metrics_debug : False
COMET INFO: tpu_num_cores : None
COMET INFO: transformers_version : 4.38.0.dev0
COMET INFO: type_vocab_size : 0
COMET INFO: typical_p : 1.0
COMET INFO: use_bfloat16 : False
COMET INFO: use_cpu : False
COMET INFO: use_ipex : False
COMET INFO: use_legacy_prediction_loop : False
COMET INFO: use_mps_device : False
COMET INFO: vocab_size : 251000
COMET INFO: warmup_ratio : 0.0
COMET INFO: warmup_steps : 0
COMET INFO: weight_decay : 0.01
COMET INFO: Uploads:
COMET INFO: conda-environment-definition : 1
COMET INFO: conda-info : 1
COMET INFO: conda-specification : 1
COMET INFO: environment details : 1
COMET INFO: filename : 1
COMET INFO: installed packages : 1
COMET INFO: source_code : 1 (4.46 KB)
COMET INFO:
COMET WARNING: To get all data logged automatically, import comet_ml before the following modules: sklearn, torch.
COMET INFO: Still saving offline stats to messages file before program termination (may take up to 120 seconds)
COMET INFO: Starting saving the offline archive
COMET INFO: To upload this offline experiment, run:
comet upload /gpfsdswork/projects/rech/fmr/uft12cr/classification/.cometml-runs/a7c8e67565944e0b877cc72ae9023b53.zip
Exception ignored in: <function tqdm.__del__ at 0x150062d53880>
Traceback (most recent call last):
File "/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/tqdm/std.py", line 1149, in __del__
File "/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/tqdm/std.py", line 1303, in close
File "/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/tqdm/std.py", line 1496, in display
File "/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/tqdm/std.py", line 1152, in __str__
File "/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/tqdm/std.py", line 1454, in format_dict
File "/linkhome/rech/genrug01/uft12cr/.local/lib/python3.11/site-packages/tqdm/utils.py", line 335, in _screen_shape_linux
File "<frozen importlib._bootstrap>", line 1173, in _find_and_load
File "<frozen importlib._bootstrap>", line 170, in __enter__
File "<frozen importlib._bootstrap>", line 196, in _get_module_lock
File "<frozen importlib._bootstrap>", line 72, in __init__
TypeError: 'NoneType' object is not callable
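The final `TypeError` from `tqdm.__del__` is a secondary artifact: the interpreter is already tearing modules down after the `NameError` crash, so tqdm's destructor fails mid-shutdown. It is noise, not an independent bug.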