---
license: apache-2.0
base_model: microsoft/swin-base-patch4-window7-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: swin-base-patch4-window7-224-finetuned-st-wsdmhar
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.7937443336355394
---

# swin-base-patch4-window7-224-finetuned-st-wsdmhar

This model is a fine-tuned version of [microsoft/swin-base-patch4-window7-224](https://huggingface.co/microsoft/swin-base-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.8737
- Accuracy: 0.7937
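
The card does not include usage code; the sketch below shows one way to run inference with the Transformers `pipeline` API. The repository id is an assumption inferred from the uploader's namespace, and the image path is hypothetical:

```python
# Minimal inference sketch (not from the card). The repo id below is
# assumed from the uploader's namespace; adjust it to the actual checkpoint.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="ayubkfupm/swin-base-patch4-window7-224-finetuned-st-wsdmhar",
)

# Accepts a file path, URL, or PIL.Image; "example.jpg" is a placeholder.
predictions = classifier("example.jpg")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```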

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
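
The card does not describe the underlying data, but the metadata names the generic `imagefolder` builder from 🤗 Datasets. A minimal loading sketch, with a hypothetical directory path:

```python
# Sketch only: the metadata says training used the generic "imagefolder"
# builder, which infers labels from subdirectory names. The data_dir
# below is hypothetical.
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="path/to/st-wsdmhar-images")
print(dataset["train"][0])  # {"image": <PIL.Image>, "label": <int>}
```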

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
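
These values map directly onto 🤗 `TrainingArguments`; a minimal sketch, in which only the listed hyperparameters come from the card and everything else (notably `output_dir`) is an assumption:

```python
# Sketch of TrainingArguments matching the hyperparameters above.
# The Adam betas/epsilon listed on the card match the Trainer defaults,
# so no extra optimizer arguments are needed here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-base-patch4-window7-224-finetuned-st-wsdmhar",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=32,  # effective batch: 32 * 4 accumulation = 128
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    seed=42,
)
```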

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.7172        | 0.9938  | 40   | 1.6300          | 0.3908   |
| 1.2025        | 1.9876  | 80   | 1.0530          | 0.5290   |
| 0.9391        | 2.9814  | 120  | 0.8234          | 0.5997   |
| 0.858         | 4.0     | 161  | 0.7816          | 0.6179   |
| 0.7816        | 4.9938  | 201  | 0.7162          | 0.6328   |
| 0.7094        | 5.9876  | 241  | 0.6275          | 0.6936   |
| 0.7139        | 6.9814  | 281  | 0.5860          | 0.7203   |
| 0.6836        | 8.0     | 322  | 0.6788          | 0.6777   |
| 0.7075        | 8.9938  | 362  | 0.6239          | 0.7135   |
| 0.6309        | 9.9876  | 402  | 0.6282          | 0.6995   |
| 0.6596        | 10.9814 | 442  | 0.6014          | 0.6995   |
| 0.6589        | 12.0    | 483  | 0.5762          | 0.7208   |
| 0.5947        | 12.9938 | 523  | 0.5150          | 0.7588   |
| 0.5929        | 13.9876 | 563  | 0.5373          | 0.7403   |
| 0.5775        | 14.9814 | 603  | 0.5327          | 0.7584   |
| 0.5465        | 16.0    | 644  | 0.5778          | 0.7421   |
| 0.5614        | 16.9938 | 684  | 0.4826          | 0.7747   |
| 0.5535        | 17.9876 | 724  | 0.4898          | 0.7788   |
| 0.4965        | 18.9814 | 764  | 0.5204          | 0.7539   |
| 0.5177        | 20.0    | 805  | 0.4869          | 0.7693   |
| 0.513         | 20.9938 | 845  | 0.4848          | 0.7797   |
| 0.4824        | 21.9876 | 885  | 0.4821          | 0.7788   |
| 0.5241        | 22.9814 | 925  | 0.4995          | 0.7715   |
| 0.4363        | 24.0    | 966  | 0.5120          | 0.7806   |
| 0.4549        | 24.9938 | 1006 | 0.5031          | 0.7811   |
| 0.4445        | 25.9876 | 1046 | 0.5340          | 0.7665   |
| 0.4337        | 26.9814 | 1086 | 0.5099          | 0.7783   |
| 0.4272        | 28.0    | 1127 | 0.5166          | 0.7824   |
| 0.3732        | 28.9938 | 1167 | 0.5644          | 0.7620   |
| 0.4252        | 29.9876 | 1207 | 0.5171          | 0.7856   |
| 0.3844        | 30.9814 | 1247 | 0.5678          | 0.7756   |
| 0.3663        | 32.0    | 1288 | 0.5667          | 0.7743   |
| 0.3456        | 32.9938 | 1328 | 0.5701          | 0.7688   |
| 0.3809        | 33.9876 | 1368 | 0.5687          | 0.7729   |
| 0.3499        | 34.9814 | 1408 | 0.5615          | 0.7820   |
| 0.3443        | 36.0    | 1449 | 0.5977          | 0.7806   |
| 0.2993        | 36.9938 | 1489 | 0.6292          | 0.7806   |
| 0.3333        | 37.9876 | 1529 | 0.6492          | 0.7715   |
| 0.3586        | 38.9814 | 1569 | 0.6130          | 0.7756   |
| 0.2979        | 40.0    | 1610 | 0.5870          | 0.7806   |
| 0.266         | 40.9938 | 1650 | 0.6225          | 0.7833   |
| 0.2585        | 41.9876 | 1690 | 0.6603          | 0.7765   |
| 0.2741        | 42.9814 | 1730 | 0.6642          | 0.7752   |
| 0.2674        | 44.0    | 1771 | 0.6706          | 0.7851   |
| 0.2619        | 44.9938 | 1811 | 0.6730          | 0.7715   |
| 0.252         | 45.9876 | 1851 | 0.7346          | 0.7811   |
| 0.2417        | 46.9814 | 1891 | 0.6707          | 0.7829   |
| 0.2406        | 48.0    | 1932 | 0.6497          | 0.7833   |
| 0.2348        | 48.9938 | 1972 | 0.6786          | 0.7833   |
| 0.2265        | 49.9876 | 2012 | 0.7158          | 0.7906   |
| 0.1967        | 50.9814 | 2052 | 0.7403          | 0.7947   |
| 0.1932        | 52.0    | 2093 | 0.7282          | 0.7869   |
| 0.2047        | 52.9938 | 2133 | 0.6987          | 0.7842   |
| 0.1966        | 53.9876 | 2173 | 0.7779          | 0.7851   |
| 0.1824        | 54.9814 | 2213 | 0.7815          | 0.7910   |
| 0.1963        | 56.0    | 2254 | 0.6768          | 0.7933   |
| 0.1984        | 56.9938 | 2294 | 0.7527          | 0.7833   |
| 0.1777        | 57.9876 | 2334 | 0.7672          | 0.7865   |
| 0.1666        | 58.9814 | 2374 | 0.7881          | 0.7901   |
| 0.1649        | 60.0    | 2415 | 0.7903          | 0.7856   |
| 0.1785        | 60.9938 | 2455 | 0.7483          | 0.7874   |
| 0.167         | 61.9876 | 2495 | 0.7278          | 0.7915   |
| 0.1514        | 62.9814 | 2535 | 0.8130          | 0.7761   |
| 0.1423        | 64.0    | 2576 | 0.8144          | 0.7829   |
| 0.1476        | 64.9938 | 2616 | 0.8000          | 0.7815   |
| 0.1742        | 65.9876 | 2656 | 0.7660          | 0.7815   |
| 0.1362        | 66.9814 | 2696 | 0.8117          | 0.7888   |
| 0.126         | 68.0    | 2737 | 0.8394          | 0.7874   |
| 0.1278        | 68.9938 | 2777 | 0.8493          | 0.7847   |
| 0.1181        | 69.9876 | 2817 | 0.7959          | 0.7937   |
| 0.1457        | 70.9814 | 2857 | 0.8036          | 0.7860   |
| 0.1322        | 72.0    | 2898 | 0.8474          | 0.8001   |
| 0.1312        | 72.9938 | 2938 | 0.8026          | 0.7774   |
| 0.1146        | 73.9876 | 2978 | 0.8388          | 0.8064   |
| 0.141         | 74.9814 | 3018 | 0.8053          | 0.7987   |
| 0.1396        | 76.0    | 3059 | 0.8439          | 0.7937   |
| 0.113         | 76.9938 | 3099 | 0.9004          | 0.7919   |
| 0.1219        | 77.9876 | 3139 | 0.8423          | 0.7951   |
| 0.1132        | 78.9814 | 3179 | 0.8309          | 0.7937   |
| 0.119         | 80.0    | 3220 | 0.8210          | 0.8015   |
| 0.111         | 80.9938 | 3260 | 0.8238          | 0.7983   |
| 0.0973        | 81.9876 | 3300 | 0.8422          | 0.7983   |
| 0.1118        | 82.9814 | 3340 | 0.8389          | 0.8010   |
| 0.1296        | 84.0    | 3381 | 0.8178          | 0.8019   |
| 0.089         | 84.9938 | 3421 | 0.8456          | 0.7987   |
| 0.1003        | 85.9876 | 3461 | 0.8626          | 0.8001   |
| 0.1123        | 86.9814 | 3501 | 0.8494          | 0.7928   |
| 0.1038        | 88.0    | 3542 | 0.8584          | 0.8064   |
| 0.1055        | 88.9938 | 3582 | 0.8513          | 0.7933   |
| 0.1031        | 89.9876 | 3622 | 0.8592          | 0.7978   |
| 0.1028        | 90.9814 | 3662 | 0.8452          | 0.7969   |
| 0.0998        | 92.0    | 3703 | 0.8605          | 0.7983   |
| 0.1005        | 92.9938 | 3743 | 0.8805          | 0.7947   |
| 0.0936        | 93.9876 | 3783 | 0.8735          | 0.7969   |
| 0.0779        | 94.9814 | 3823 | 0.8776          | 0.7960   |
| 0.0972        | 96.0    | 3864 | 0.8784          | 0.7951   |
| 0.0973        | 96.9938 | 3904 | 0.8782          | 0.7933   |
| 0.0932        | 97.9876 | 3944 | 0.8779          | 0.7924   |
| 0.0863        | 98.9814 | 3984 | 0.8741          | 0.7919   |
| 0.0827        | 99.3789 | 4000 | 0.8737          | 0.7937   |

### Framework versions

- Transformers 4.44.0
- PyTorch 1.12.1+cu113
- Datasets 2.21.0
- Tokenizers 0.19.1