dit-base_tobacco-small_tobacco3482_kd_CEKD_t2.5_a0.5

This model is a fine-tuned version of WinKawaks/vit-small-patch16-224 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal loading sketch follows the metrics list):

  • Loss: 0.6146
  • Accuracy: 0.8
  • Brier Loss: 0.2784
  • Nll: 1.4268
  • F1 Micro: 0.8000
  • F1 Macro: 0.7846
  • Ece: 0.1626
  • Aurc: 0.0474
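
As a minimal, non-authoritative sketch of how this checkpoint could be loaded for image classification with the Hugging Face Auto classes; the hub repository id (jordyvl/dit-base_tobacco-small_tobacco3482_kd_CEKD_t2.5_a0.5) and the example image path are assumptions:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "jordyvl/dit-base_tobacco-small_tobacco3482_kd_CEKD_t2.5_a0.5"  # assumed hub id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("document.png").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = logits.softmax(dim=-1)[0]
print(model.config.id2label[int(probs.argmax())], float(probs.max()))
```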

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 0.0001
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
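
As a rough sketch, the settings above map onto transformers.TrainingArguments as shown below; the output directory is a placeholder, single-device training is assumed (so batch size is taken as per-device), and the Adam betas/epsilon are simply the library defaults that match the card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./checkpoints",        # placeholder, not from the card
    learning_rate=1e-4,
    per_device_train_batch_size=128,   # assuming a single device
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    adam_beta1=0.9,                    # Adam defaults, as listed in the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```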

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc
No log 1.0 7 4.1581 0.18 0.8974 4.2254 0.18 0.1559 0.2651 0.8061
No log 2.0 14 3.2929 0.355 0.7710 4.0541 0.3550 0.2167 0.2742 0.4326
No log 3.0 21 2.2155 0.55 0.5837 2.0462 0.55 0.4296 0.2323 0.2481
No log 4.0 28 1.5197 0.7 0.4370 1.7716 0.7 0.6411 0.2342 0.1327
No log 5.0 35 1.2831 0.715 0.4289 1.7142 0.715 0.6859 0.2047 0.1211
No log 6.0 42 1.2204 0.72 0.3989 1.6102 0.72 0.6999 0.1961 0.1066
No log 7.0 49 0.9767 0.755 0.3317 1.5919 0.755 0.7148 0.1724 0.0775
No log 8.0 56 0.8875 0.785 0.3049 1.4209 0.785 0.7633 0.1478 0.0716
No log 9.0 63 0.9311 0.79 0.3185 1.5420 0.79 0.7474 0.1645 0.0741
No log 10.0 70 0.8116 0.835 0.2672 1.5127 0.835 0.8232 0.1463 0.0563
No log 11.0 77 0.8315 0.805 0.3054 1.6275 0.805 0.7897 0.1695 0.0618
No log 12.0 84 0.7678 0.815 0.2917 1.5009 0.815 0.8012 0.1469 0.0542
No log 13.0 91 0.7249 0.81 0.2816 1.4685 0.81 0.7880 0.1437 0.0576
No log 14.0 98 0.8116 0.815 0.2894 1.5975 0.815 0.7941 0.1481 0.0604
No log 15.0 105 0.7985 0.81 0.3098 1.4721 0.81 0.7819 0.1646 0.0662
No log 16.0 112 0.6839 0.815 0.2781 1.4357 0.815 0.7992 0.1589 0.0529
No log 17.0 119 0.6590 0.82 0.2670 1.4487 0.82 0.8061 0.1336 0.0461
No log 18.0 126 0.7253 0.81 0.2938 1.5163 0.81 0.7951 0.1617 0.0558
No log 19.0 133 0.6935 0.795 0.2949 1.4516 0.795 0.7758 0.1736 0.0531
No log 20.0 140 0.6991 0.795 0.2875 1.3932 0.795 0.7735 0.1584 0.0519
No log 21.0 147 0.7059 0.815 0.2966 1.5011 0.815 0.7927 0.1579 0.0565
No log 22.0 154 0.6754 0.79 0.2896 1.4549 0.79 0.7742 0.1534 0.0531
No log 23.0 161 0.6981 0.785 0.2989 1.4261 0.785 0.7705 0.1490 0.0530
No log 24.0 168 0.6503 0.805 0.2842 1.4998 0.805 0.7885 0.1415 0.0512
No log 25.0 175 0.6680 0.79 0.2891 1.4228 0.79 0.7742 0.1504 0.0519
No log 26.0 182 0.6835 0.81 0.2948 1.4400 0.81 0.7944 0.1545 0.0516
No log 27.0 189 0.6495 0.81 0.2846 1.4433 0.81 0.7868 0.1552 0.0503
No log 28.0 196 0.6450 0.81 0.2851 1.4037 0.81 0.7913 0.1476 0.0498
No log 29.0 203 0.6634 0.815 0.2861 1.4186 0.815 0.7966 0.1397 0.0521
No log 30.0 210 0.6212 0.805 0.2739 1.4265 0.805 0.7902 0.1444 0.0482
No log 31.0 217 0.6271 0.815 0.2800 1.4392 0.815 0.7986 0.1370 0.0494
No log 32.0 224 0.6256 0.8 0.2786 1.3677 0.8000 0.7811 0.1454 0.0496
No log 33.0 231 0.6219 0.805 0.2779 1.4276 0.805 0.7857 0.1580 0.0465
No log 34.0 238 0.6203 0.81 0.2779 1.4392 0.81 0.7914 0.1275 0.0470
No log 35.0 245 0.6193 0.81 0.2793 1.4258 0.81 0.7934 0.1438 0.0483
No log 36.0 252 0.6261 0.83 0.2743 1.4227 0.83 0.8098 0.1482 0.0501
No log 37.0 259 0.6190 0.815 0.2776 1.4301 0.815 0.7977 0.1446 0.0484
No log 38.0 266 0.6210 0.805 0.2867 1.4958 0.805 0.7878 0.1477 0.0496
No log 39.0 273 0.5974 0.805 0.2771 1.5068 0.805 0.7901 0.1381 0.0476
No log 40.0 280 0.6224 0.8 0.2869 1.4325 0.8000 0.7869 0.1443 0.0472
No log 41.0 287 0.6178 0.805 0.2796 1.4316 0.805 0.7912 0.1454 0.0471
No log 42.0 294 0.6194 0.825 0.2765 1.5001 0.825 0.8059 0.1401 0.0474
No log 43.0 301 0.6224 0.805 0.2769 1.4268 0.805 0.7888 0.1398 0.0493
No log 44.0 308 0.6265 0.8 0.2819 1.4401 0.8000 0.7846 0.1422 0.0481
No log 45.0 315 0.6275 0.8 0.2819 1.4206 0.8000 0.7847 0.1465 0.0487
No log 46.0 322 0.6173 0.805 0.2806 1.3618 0.805 0.7870 0.1383 0.0478
No log 47.0 329 0.6177 0.81 0.2804 1.4988 0.81 0.7906 0.1468 0.0488
No log 48.0 336 0.6175 0.81 0.2788 1.4356 0.81 0.7917 0.1460 0.0476
No log 49.0 343 0.6209 0.81 0.2775 1.4290 0.81 0.7925 0.1603 0.0478
No log 50.0 350 0.6244 0.815 0.2780 1.3662 0.815 0.7974 0.1322 0.0480
No log 51.0 357 0.6176 0.81 0.2777 1.4307 0.81 0.7941 0.1258 0.0478
No log 52.0 364 0.6150 0.805 0.2774 1.4310 0.805 0.7896 0.1369 0.0477
No log 53.0 371 0.6164 0.81 0.2772 1.4298 0.81 0.7941 0.1391 0.0479
No log 54.0 378 0.6137 0.81 0.2766 1.4291 0.81 0.7928 0.1358 0.0474
No log 55.0 385 0.6163 0.81 0.2776 1.4298 0.81 0.7928 0.1278 0.0475
No log 56.0 392 0.6148 0.81 0.2776 1.4286 0.81 0.7928 0.1480 0.0471
No log 57.0 399 0.6154 0.81 0.2773 1.4290 0.81 0.7928 0.1485 0.0474
No log 58.0 406 0.6143 0.8 0.2781 1.4281 0.8000 0.7852 0.1405 0.0473
No log 59.0 413 0.6158 0.805 0.2785 1.4295 0.805 0.7899 0.1455 0.0473
No log 60.0 420 0.6146 0.805 0.2774 1.4310 0.805 0.7899 0.1346 0.0472
No log 61.0 427 0.6154 0.805 0.2780 1.4292 0.805 0.7899 0.1451 0.0472
No log 62.0 434 0.6148 0.805 0.2780 1.4304 0.805 0.7905 0.1543 0.0473
No log 63.0 441 0.6150 0.8 0.2783 1.4284 0.8000 0.7846 0.1502 0.0473
No log 64.0 448 0.6143 0.805 0.2780 1.4294 0.805 0.7899 0.1453 0.0470
No log 65.0 455 0.6152 0.805 0.2782 1.4298 0.805 0.7899 0.1373 0.0469
No log 66.0 462 0.6148 0.8 0.2781 1.4287 0.8000 0.7852 0.1492 0.0475
No log 67.0 469 0.6134 0.805 0.2776 1.4286 0.805 0.7899 0.1526 0.0470
No log 68.0 476 0.6150 0.8 0.2785 1.4270 0.8000 0.7846 0.1497 0.0474
No log 69.0 483 0.6145 0.8 0.2783 1.4281 0.8000 0.7846 0.1483 0.0471
No log 70.0 490 0.6145 0.805 0.2778 1.4292 0.805 0.7899 0.1472 0.0471
No log 71.0 497 0.6143 0.805 0.2779 1.4284 0.805 0.7899 0.1529 0.0470
0.2616 72.0 504 0.6148 0.805 0.2780 1.4276 0.805 0.7899 0.1414 0.0471
0.2616 73.0 511 0.6147 0.8 0.2781 1.4285 0.8000 0.7852 0.1400 0.0473
0.2616 74.0 518 0.6147 0.8 0.2783 1.4281 0.8000 0.7846 0.1501 0.0473
0.2616 75.0 525 0.6150 0.8 0.2784 1.4269 0.8000 0.7846 0.1417 0.0473
0.2616 76.0 532 0.6143 0.805 0.2782 1.4273 0.805 0.7899 0.1524 0.0470
0.2616 77.0 539 0.6147 0.805 0.2782 1.4277 0.805 0.7899 0.1526 0.0470
0.2616 78.0 546 0.6149 0.8 0.2785 1.4277 0.8000 0.7846 0.1572 0.0474
0.2616 79.0 553 0.6147 0.805 0.2782 1.4276 0.805 0.7899 0.1529 0.0471
0.2616 80.0 560 0.6145 0.805 0.2783 1.4278 0.805 0.7899 0.1527 0.0471
0.2616 81.0 567 0.6147 0.8 0.2783 1.4277 0.8000 0.7846 0.1483 0.0472
0.2616 82.0 574 0.6146 0.8 0.2783 1.4275 0.8000 0.7846 0.1623 0.0473
0.2616 83.0 581 0.6145 0.8 0.2783 1.4274 0.8000 0.7846 0.1571 0.0473
0.2616 84.0 588 0.6146 0.8 0.2782 1.4276 0.8000 0.7846 0.1538 0.0473
0.2616 85.0 595 0.6146 0.805 0.2783 1.4274 0.805 0.7899 0.1493 0.0471
0.2616 86.0 602 0.6147 0.8 0.2784 1.4269 0.8000 0.7846 0.1627 0.0473
0.2616 87.0 609 0.6146 0.8 0.2783 1.4270 0.8000 0.7846 0.1623 0.0472
0.2616 88.0 616 0.6145 0.805 0.2783 1.4272 0.805 0.7899 0.1579 0.0470
0.2616 89.0 623 0.6146 0.8 0.2784 1.4272 0.8000 0.7846 0.1627 0.0474
0.2616 90.0 630 0.6147 0.8 0.2783 1.4270 0.8000 0.7846 0.1536 0.0473
0.2616 91.0 637 0.6147 0.8 0.2784 1.4268 0.8000 0.7846 0.1627 0.0475
0.2616 92.0 644 0.6145 0.805 0.2783 1.4268 0.805 0.7899 0.1582 0.0471
0.2616 93.0 651 0.6145 0.8 0.2784 1.4269 0.8000 0.7846 0.1626 0.0474
0.2616 94.0 658 0.6146 0.8 0.2784 1.4268 0.8000 0.7846 0.1626 0.0473
0.2616 95.0 665 0.6147 0.8 0.2784 1.4268 0.8000 0.7846 0.1626 0.0473
0.2616 96.0 672 0.6146 0.8 0.2784 1.4269 0.8000 0.7846 0.1626 0.0474
0.2616 97.0 679 0.6146 0.8 0.2784 1.4269 0.8000 0.7846 0.1626 0.0474
0.2616 98.0 686 0.6146 0.8 0.2784 1.4269 0.8000 0.7846 0.1626 0.0474
0.2616 99.0 693 0.6146 0.8 0.2784 1.4268 0.8000 0.7846 0.1626 0.0474
0.2616 100.0 700 0.6146 0.8 0.2784 1.4268 0.8000 0.7846 0.1626 0.0474
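
The Ece column above presumably reports an expected calibration error. A minimal sketch of the common equal-width-binning formulation over predicted probabilities and true labels (the 10-bin scheme is an assumption, not necessarily what was used for this card):

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE with equal-width confidence bins (one common formulation)."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)

    ece = 0.0
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            # gap between mean accuracy and mean confidence, weighted by bin mass
            ece += in_bin.mean() * abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
    return ece
```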

Framework versions

  • Transformers 4.36.0.dev0
  • Pytorch 2.2.0.dev20231112+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1