---
license: apache-2.0
base_model: ansilmbabl/vit-base-patch16-224-in21k-cards-june-07-cropping-filtered-preprocess-change-test
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vit-base-patch16-224-in21k-cards-june-08-cropping-filtered-preprocess-change-test-2
  results: []
---

vit-base-patch16-224-in21k-cards-june-08-cropping-filtered-preprocess-change-test-2

This model is a fine-tuned version of ansilmbabl/vit-base-patch16-224-in21k-cards-june-07-cropping-filtered-preprocess-change-test on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 2.5958
  • Accuracy: 0.5147
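
A minimal inference sketch using the Transformers pipeline API is shown below. The checkpoint id is taken from this card; the example image path and the top_k value are placeholders, not part of the original card.

```python
# Minimal inference sketch (assumes transformers and Pillow are installed).
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="ansilmbabl/vit-base-patch16-224-in21k-cards-june-08-cropping-filtered-preprocess-change-test-2",
)

# "card.jpg" is a placeholder path to a local image of the kind this model classifies.
predictions = classifier("card.jpg", top_k=5)
print(predictions)  # list of {"label": ..., "score": ...} dicts
```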

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 512
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 10
  • mixed_precision_training: Native AMP
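
The sketch below maps the hyperparameters above onto a TrainingArguments object. It is an approximation, not the original training script: output_dir is an assumption, and dataset loading, image preprocessing, and the Trainer call are omitted.

```python
# Sketch of TrainingArguments mirroring the listed hyperparameters
# (output_dir is assumed; data loading and the Trainer call are omitted).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-in21k-cards-june-08-cropping-filtered-preprocess-change-test-2",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=8,  # 64 x 8 = 512 effective train batch size
    num_train_epochs=10,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    fp16=True,                      # native AMP mixed-precision training
)
```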

Training results

| Training Loss | Epoch  | Step  | Validation Loss | Accuracy |
|:-------------:|:------:|:-----:|:---------------:|:--------:|
| 1.0182        | 0.9998 | 1298  | 1.5280          | 0.4287   |
| 0.9583        | 1.9996 | 2596  | 1.4878          | 0.4475   |
| 0.8452        | 2.9998 | 3894  | 1.4847          | 0.4716   |
| 0.6887        | 3.9996 | 5192  | 1.5848          | 0.4736   |
| 0.5269        | 4.9994 | 6490  | 1.6689          | 0.493    |
| 0.4018        | 6.0    | 7789  | 1.8483          | 0.4986   |
| 0.2909        | 6.9998 | 9087  | 2.0319          | 0.5079   |
| 0.1823        | 7.9996 | 10385 | 2.2540          | 0.5127   |
| 0.1056        | 8.9994 | 11683 | 2.4652          | 0.511    |
| 0.0767        | 9.9985 | 12980 | 2.5958          | 0.5147   |

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.0.1+cu117
  • Datasets 2.19.2
  • Tokenizers 0.19.1