Accuracy?
#1 by ayazdan
I am using the following script to measure model accuracy on imagenet-1k:
python3 transformer-sparsity/examples/pytorch/image-classification/run_image_classification.py \
--model_name_or_path ${model} \
--dataset_name imagenet-1k \
--do_eval \
--remove_unused_columns false \
--per_device_eval_batch_size 32 \
--use_auth_token true \
--overwrite_output_dir \
--save_steps 1 \
--ignore_mismatched_sizes true \
--output_dir ~/${base_dir} 2>&1 | tee ~/${base_dir}/${model_simplified}_test.txt
The eval_accuracy is 0.0011. Is this expected?
It could be because of the mismatch in the number of classes (the checkpoint has 1001 labels while imagenet-1k has 1000), which means the classifier head gets randomly re-initialized; 0.0011 is roughly chance level over 1000 classes. Loading the model gives the following warning:
[WARNING|modeling_utils.py:2628] 2022-12-07 03:53:36,920 >> Some weights of MobileNetV1ForImageClassification were not initialized from the model checkpoint at Matthijs/mobilenet_v1_0.75_192 and are newly initialized because the shapes did not match:
- classifier.weight: found shape torch.Size([1001, 768]) in the checkpoint and torch.Size([1000, 768]) in the model instantiated
- classifier.bias: found shape torch.Size([1001]) in the checkpoint and torch.Size([1000]) in the model instantiated
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
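To double-check where the 1001 comes from, here is a minimal sketch that only inspects the checkpoint's stored config (only config.json is downloaded, no weights; the expected values are my assumption based on the warning above):

from transformers import AutoConfig

# Read the configuration stored with the checkpoint; no model weights are loaded here.
config = AutoConfig.from_pretrained("Matthijs/mobilenet_v1_0.75_192")

# The converted MobileNet checkpoints appear to keep 1001 labels
# (an extra "background" class on top of the 1000 ImageNet classes),
# while imagenet-1k has 1000 labels, hence the shape mismatch.
print(config.num_labels)                    # presumably 1001, matching the warning
print(list(config.id2label.items())[:3])    # first few (index, label) entries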
I have also tried fine-tuning with the following script, but so far the accuracy is still very low (eval_accuracy around 0.001):
lr=0.01
epochs=20
optim=rmsprop
lr_schedule=constant
model=${1}
# NOTE: eval_steps must also be set; it is referenced below but not defined in this snippet
base_dir=mobilenetv1_lr${lr}_epochs_${epochs}_optim_${optim}_lr_schedule_${lr_schedule}
rm -rf ~/${base_dir}
mkdir -p ~/${base_dir}
python3 transformer-sparsity/examples/pytorch/image-classification/run_image_classification.py \
--model_name_or_path ${model} \
--dataset_name imagenet-1k \
--remove_unused_columns false \
--per_device_train_batch_size 256 \
--per_device_eval_batch_size 256 \
--do_train \
--do_eval \
--lr_scheduler_type constant \
--weight_decay 0.00004 \
--load_best_model_at_end \
--eval_steps ${eval_steps} \
--save_steps ${eval_steps} \
--evaluation_strategy steps \
--logging_steps ${eval_steps} \
--logging_strategy steps \
--num_train_epochs ${epochs} \
--learning_rate ${lr} \
--optim rmsprop \
--ignore_mismatched_sizes true \
--save_total_limit 2 \
--overwrite_output_dir \
--output_dir ~/${base_dir} 2>&1 | tee ~/${base_dir}/mobilenetv1_finetune.txt
I still don't understand why the accuracy is so poor.
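As a sanity check, I am thinking of running the checkpoint with its original 1001-class head (without --ignore_mismatched_sizes) on a few validation images, roughly like the sketch below. This is untested; it assumes the pipeline picks up the checkpoint's own label names and that the gated imagenet-1k dataset is accessible after huggingface-cli login:

from datasets import load_dataset
from transformers import pipeline

# Stream a handful of validation images from the gated imagenet-1k dataset.
ds = load_dataset("imagenet-1k", split="validation", streaming=True)
samples = list(ds.take(8))
label_names = ds.features["label"].names

# Load the checkpoint with its original 1001-class head; nothing gets re-initialized here.
clf = pipeline("image-classification", model="Matthijs/mobilenet_v1_0.75_192")

for s in samples:
    pred = clf(s["image"])[0]["label"]
    true = label_names[s["label"]]
    # The checkpoint's label strings may not match the dataset's names exactly;
    # the point is only to eyeball whether the predictions look sensible.
    print(f"pred: {pred!r}   true: {true!r}")

If the predictions look reasonable there, the near-zero eval_accuracy would point at the re-initialized classifier head rather than at the backbone.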