---
license: apache-2.0
base_model: google/vit-base-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
  - precision
  - recall
model-index:
  - name: vit-base-patch16-224
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: validation
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.79
          - name: Precision
            type: precision
            value: 0.7955164222268126
          - name: Recall
            type: recall
            value: 0.79
---

# vit-base-patch16-224

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 0.6740
- Accuracy: 0.79
- Precision: 0.7955
- Recall: 0.79
- F1 Score: 0.7923
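
A minimal inference sketch is shown below. The repo id `HorcruxNo13/vit-base-patch16-224` is inferred from this card and may need adjusting; `example.jpg` is a placeholder input.

```python
from transformers import pipeline
from PIL import Image

# Load the fine-tuned checkpoint through the image-classification pipeline.
# The repo id is an assumption inferred from this card.
classifier = pipeline("image-classification", model="HorcruxNo13/vit-base-patch16-224")

# Classify a local image; ViT expects RGB input.
image = Image.open("example.jpg").convert("RGB")
predictions = classifier(image)
print(predictions)  # list of {"label": ..., "score": ...} dicts, best first
```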

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
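
These settings map directly onto `transformers.TrainingArguments`; a sketch under that assumption is below. The `output_dir` and the per-epoch evaluation strategy are assumptions not stated in the card.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters above.
# output_dir and evaluation_strategy are assumptions; the card does not state them.
training_args = TrainingArguments(
    output_dir="vit-base-patch16-224",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,      # 64 x 4 = 256 effective train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    seed=42,                            # Adam betas/epsilon above are the defaults
    evaluation_strategy="epoch",        # assumed; the results table logs once per epoch
)
```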

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:--------:|
| No log        | 1.0   | 4    | 0.5895          | 0.725    | 0.5256    | 0.725  | 0.6094   |
| No log        | 2.0   | 8    | 0.5737          | 0.725    | 0.5256    | 0.725  | 0.6094   |
| No log        | 3.0   | 12   | 0.5746          | 0.7333   | 0.6978    | 0.7333 | 0.6589   |
| No log        | 4.0   | 16   | 0.5449          | 0.7292   | 0.7126    | 0.7292 | 0.6263   |
| No log        | 5.0   | 20   | 0.5943          | 0.7208   | 0.7362    | 0.7208 | 0.7270   |
| No log        | 6.0   | 24   | 0.5124          | 0.75     | 0.7360    | 0.75   | 0.6895   |
| No log        | 7.0   | 28   | 0.6057          | 0.6625   | 0.7301    | 0.6625 | 0.6797   |
| No log        | 8.0   | 32   | 0.5059          | 0.7583   | 0.7376    | 0.7583 | 0.7214   |
| No log        | 9.0   | 36   | 0.5734          | 0.7125   | 0.7474    | 0.7125 | 0.7237   |
| No log        | 10.0  | 40   | 0.5069          | 0.7458   | 0.7182    | 0.7458 | 0.7116   |
| No log        | 11.0  | 44   | 0.5135          | 0.775    | 0.7659    | 0.775  | 0.7689   |
| No log        | 12.0  | 48   | 0.4943          | 0.775    | 0.7601    | 0.775  | 0.7610   |
| 0.5275        | 13.0  | 52   | 0.5654          | 0.7458   | 0.7790    | 0.7458 | 0.7557   |
| 0.5275        | 14.0  | 56   | 0.5257          | 0.7625   | 0.7636    | 0.7625 | 0.7631   |
| 0.5275        | 15.0  | 60   | 0.5107          | 0.7875   | 0.7813    | 0.7875 | 0.7836   |
| 0.5275        | 16.0  | 64   | 0.5514          | 0.7333   | 0.7655    | 0.7333 | 0.7434   |
| 0.5275        | 17.0  | 68   | 0.5004          | 0.7833   | 0.7698    | 0.7833 | 0.7699   |
| 0.5275        | 18.0  | 72   | 0.5999          | 0.7125   | 0.7738    | 0.7125 | 0.7269   |
| 0.5275        | 19.0  | 76   | 0.4975          | 0.7667   | 0.7554    | 0.7667 | 0.7589   |
| 0.5275        | 20.0  | 80   | 0.5120          | 0.7917   | 0.7981    | 0.7917 | 0.7944   |
| 0.5275        | 21.0  | 84   | 0.5203          | 0.7833   | 0.7876    | 0.7833 | 0.7853   |
| 0.5275        | 22.0  | 88   | 0.5304          | 0.8042   | 0.8051    | 0.8042 | 0.8046   |
| 0.5275        | 23.0  | 92   | 0.5475          | 0.825    | 0.825     | 0.825  | 0.8250   |
| 0.5275        | 24.0  | 96   | 0.5757          | 0.7458   | 0.7661    | 0.7458 | 0.7531   |
| 0.2422        | 25.0  | 100  | 0.5669          | 0.7875   | 0.7829    | 0.7875 | 0.7848   |
| 0.2422        | 26.0  | 104  | 0.5489          | 0.7958   | 0.7931    | 0.7958 | 0.7943   |
| 0.2422        | 27.0  | 108  | 0.5372          | 0.8      | 0.7982    | 0.8    | 0.7990   |
| 0.2422        | 28.0  | 112  | 0.5500          | 0.8208   | 0.8160    | 0.8208 | 0.8176   |
| 0.2422        | 29.0  | 116  | 0.5682          | 0.8042   | 0.8033    | 0.8042 | 0.8037   |
| 0.2422        | 30.0  | 120  | 0.5899          | 0.8083   | 0.8050    | 0.8083 | 0.8064   |
| 0.2422        | 31.0  | 124  | 0.6217          | 0.8      | 0.8063    | 0.8    | 0.8026   |
| 0.2422        | 32.0  | 128  | 0.6063          | 0.8125   | 0.8053    | 0.8125 | 0.8068   |
| 0.2422        | 33.0  | 132  | 0.5843          | 0.8042   | 0.8033    | 0.8042 | 0.8037   |
| 0.2422        | 34.0  | 136  | 0.6020          | 0.8125   | 0.8073    | 0.8125 | 0.8091   |
| 0.2422        | 35.0  | 140  | 0.6180          | 0.8042   | 0.8092    | 0.8042 | 0.8063   |
| 0.2422        | 36.0  | 144  | 0.6287          | 0.8208   | 0.8171    | 0.8208 | 0.8186   |
| 0.2422        | 37.0  | 148  | 0.6231          | 0.825    | 0.8234    | 0.825  | 0.8242   |
| 0.0631        | 38.0  | 152  | 0.6260          | 0.8292   | 0.8300    | 0.8292 | 0.8296   |
| 0.0631        | 39.0  | 156  | 0.6278          | 0.8333   | 0.8294    | 0.8333 | 0.8308   |
| 0.0631        | 40.0  | 160  | 0.6325          | 0.8208   | 0.8200    | 0.8208 | 0.8204   |
| 0.0631        | 41.0  | 164  | 0.6370          | 0.8083   | 0.8013    | 0.8083 | 0.8032   |
| 0.0631        | 42.0  | 168  | 0.6371          | 0.8125   | 0.8100    | 0.8125 | 0.8111   |
| 0.0631        | 43.0  | 172  | 0.6404          | 0.8042   | 0.8016    | 0.8042 | 0.8027   |
| 0.0631        | 44.0  | 176  | 0.6640          | 0.8292   | 0.8227    | 0.8292 | 0.8229   |
| 0.0631        | 45.0  | 180  | 0.6636          | 0.8208   | 0.8185    | 0.8208 | 0.8195   |
| 0.0631        | 46.0  | 184  | 0.6826          | 0.8083   | 0.8122    | 0.8083 | 0.8100   |
| 0.0631        | 47.0  | 188  | 0.6756          | 0.8208   | 0.8185    | 0.8208 | 0.8195   |
| 0.0631        | 48.0  | 192  | 0.6695          | 0.8292   | 0.8246    | 0.8292 | 0.8261   |
| 0.0631        | 49.0  | 196  | 0.6669          | 0.825    | 0.8198    | 0.825  | 0.8213   |
| 0.0264        | 50.0  | 200  | 0.6658          | 0.825    | 0.8198    | 0.825  | 0.8213   |
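
The card does not state how precision, recall, and F1 were aggregated, but the numbers are consistent with weighted averaging (recall equals accuracy in every row). A plausible `compute_metrics` sketch under that assumption:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Plausible metric computation for the table above. The "weighted" average is
# an assumption, consistent with recall equalling accuracy in every row.
def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```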

### Framework versions

- Transformers 4.33.3
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3