ayubkfupm committed
Commit c752d44
1 parent: 9e150aa

Model save
README.md ADDED
@@ -0,0 +1,276 @@
---
license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: swin-tiny-patch4-window7-224-finetuned-st-ucihar
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8363553943789664
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# swin-tiny-patch4-window7-224-finetuned-st-ucihar

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7803
- Accuracy: 0.8364

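As a minimal usage sketch, the checkpoint can be loaded with the `transformers` image-classification pipeline. Note the repo id below is an assumption inferred from the committer name and model name on this card; replace it with the actual Hub path if it differs.

```python
# Assumed Hub repo id -- inferred from this card, not confirmed by it.
MODEL_ID = "ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-ucihar"

if __name__ == "__main__":
    # Import lazily so the sketch has no hard dependency at module level.
    from transformers import pipeline

    # Build an image-classification pipeline from the fine-tuned checkpoint
    # (downloads weights on first use) and classify a local image file.
    classifier = pipeline("image-classification", model=MODEL_ID)
    print(classifier("example.png"))  # top predicted labels with scores
```

The pipeline applies the model's own image processor (resize to 224x224 and normalization), so no manual preprocessing is needed.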
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 200

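The derived quantities in this list can be cross-checked with a little arithmetic: the effective (total) train batch size is the per-device batch size times the gradient accumulation steps, and with a warmup ratio of 0.1 over the 8000 optimizer steps reached in the results below, the linear schedule warms up for the first 800 steps. A minimal sketch:

```python
# Hyperparameters as listed above.
train_batch_size = 32
gradient_accumulation_steps = 4
lr_scheduler_warmup_ratio = 0.1
total_optimizer_steps = 8000  # final step in the training-results table

# One optimizer update accumulates gradients over this many examples.
total_train_batch_size = train_batch_size * gradient_accumulation_steps

# Linear schedule: LR ramps up over the first warmup_ratio of all steps.
warmup_steps = int(lr_scheduler_warmup_ratio * total_optimizer_steps)

print(total_train_batch_size)  # 128, matching total_train_batch_size above
print(warmup_steps)            # 800
```
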
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:--------:|:----:|:---------------:|:--------:|
| 1.6826 | 0.9938 | 40 | 1.6510 | 0.3504 |
| 1.3306 | 1.9876 | 80 | 1.2070 | 0.4710 |
| 1.0249 | 2.9814 | 120 | 0.9149 | 0.5512 |
| 0.8922 | 4.0 | 161 | 0.7313 | 0.6587 |
| 0.791 | 4.9938 | 201 | 0.6777 | 0.6881 |
| 0.7267 | 5.9876 | 241 | 0.6398 | 0.7026 |
| 0.7106 | 6.9814 | 281 | 0.6299 | 0.7035 |
| 0.7038 | 8.0 | 322 | 0.6062 | 0.7058 |
| 0.6694 | 8.9938 | 362 | 0.5736 | 0.7226 |
| 0.6489 | 9.9876 | 402 | 0.5858 | 0.7248 |
| 0.6587 | 10.9814 | 442 | 0.5437 | 0.7434 |
| 0.6376 | 12.0 | 483 | 0.5259 | 0.7520 |
| 0.6494 | 12.9938 | 523 | 0.5807 | 0.7307 |
| 0.6236 | 13.9876 | 563 | 0.5236 | 0.7430 |
| 0.5931 | 14.9814 | 603 | 0.5463 | 0.7384 |
| 0.605 | 16.0 | 644 | 0.5360 | 0.7371 |
| 0.5422 | 16.9938 | 684 | 0.4954 | 0.7788 |
| 0.5485 | 17.9876 | 724 | 0.4910 | 0.7756 |
| 0.5586 | 18.9814 | 764 | 0.5052 | 0.7738 |
| 0.6011 | 20.0 | 805 | 0.5036 | 0.7679 |
| 0.5726 | 20.9938 | 845 | 0.5308 | 0.7588 |
| 0.514 | 21.9876 | 885 | 0.5064 | 0.7720 |
| 0.5431 | 22.9814 | 925 | 0.4680 | 0.7892 |
| 0.5348 | 24.0 | 966 | 0.6083 | 0.7430 |
| 0.532 | 24.9938 | 1006 | 0.4889 | 0.7815 |
| 0.5263 | 25.9876 | 1046 | 0.5268 | 0.7634 |
| 0.497 | 26.9814 | 1086 | 0.5057 | 0.7779 |
| 0.4929 | 28.0 | 1127 | 0.5560 | 0.7570 |
| 0.4828 | 28.9938 | 1167 | 0.4701 | 0.7928 |
| 0.4604 | 29.9876 | 1207 | 0.4656 | 0.8001 |
| 0.4472 | 30.9814 | 1247 | 0.4619 | 0.8083 |
| 0.455 | 32.0 | 1288 | 0.4920 | 0.7969 |
| 0.4467 | 32.9938 | 1328 | 0.4698 | 0.7987 |
| 0.4327 | 33.9876 | 1368 | 0.4489 | 0.8055 |
| 0.3918 | 34.9814 | 1408 | 0.5249 | 0.7906 |
| 0.4222 | 36.0 | 1449 | 0.4380 | 0.8069 |
| 0.3864 | 36.9938 | 1489 | 0.4695 | 0.8128 |
| 0.3865 | 37.9876 | 1529 | 0.5208 | 0.7811 |
| 0.3759 | 38.9814 | 1569 | 0.4972 | 0.8019 |
| 0.375 | 40.0 | 1610 | 0.4667 | 0.8105 |
| 0.4016 | 40.9938 | 1650 | 0.4632 | 0.8178 |
| 0.3788 | 41.9876 | 1690 | 0.4670 | 0.8078 |
| 0.3517 | 42.9814 | 1730 | 0.4745 | 0.8101 |
| 0.3707 | 44.0 | 1771 | 0.4526 | 0.8187 |
| 0.3422 | 44.9938 | 1811 | 0.4893 | 0.8005 |
| 0.3661 | 45.9876 | 1851 | 0.4748 | 0.8087 |
| 0.3522 | 46.9814 | 1891 | 0.4923 | 0.7992 |
| 0.385 | 48.0 | 1932 | 0.4335 | 0.8273 |
| 0.306 | 48.9938 | 1972 | 0.4823 | 0.8228 |
| 0.3163 | 49.9876 | 2012 | 0.5157 | 0.8069 |
| 0.3199 | 50.9814 | 2052 | 0.5283 | 0.8015 |
| 0.3248 | 52.0 | 2093 | 0.4652 | 0.8191 |
| 0.2969 | 52.9938 | 2133 | 0.5260 | 0.8024 |
| 0.3138 | 53.9876 | 2173 | 0.4862 | 0.8069 |
| 0.2665 | 54.9814 | 2213 | 0.4781 | 0.8250 |
| 0.2932 | 56.0 | 2254 | 0.5094 | 0.8214 |
| 0.2819 | 56.9938 | 2294 | 0.4921 | 0.8178 |
| 0.2677 | 57.9876 | 2334 | 0.5372 | 0.8087 |
| 0.2623 | 58.9814 | 2374 | 0.4847 | 0.8286 |
| 0.2584 | 60.0 | 2415 | 0.5754 | 0.8069 |
| 0.2637 | 60.9938 | 2455 | 0.5297 | 0.8182 |
| 0.2391 | 61.9876 | 2495 | 0.5187 | 0.8214 |
| 0.2426 | 62.9814 | 2535 | 0.5719 | 0.8137 |
| 0.2405 | 64.0 | 2576 | 0.5118 | 0.8232 |
| 0.2132 | 64.9938 | 2616 | 0.5691 | 0.8123 |
| 0.2572 | 65.9876 | 2656 | 0.5452 | 0.8209 |
| 0.2255 | 66.9814 | 2696 | 0.5650 | 0.8073 |
| 0.2614 | 68.0 | 2737 | 0.5387 | 0.8214 |
| 0.2284 | 68.9938 | 2777 | 0.6056 | 0.8141 |
| 0.2371 | 69.9876 | 2817 | 0.5906 | 0.8128 |
| 0.2089 | 70.9814 | 2857 | 0.5550 | 0.8119 |
| 0.2276 | 72.0 | 2898 | 0.5511 | 0.8214 |
| 0.2192 | 72.9938 | 2938 | 0.6162 | 0.8259 |
| 0.2076 | 73.9876 | 2978 | 0.5663 | 0.8237 |
| 0.1938 | 74.9814 | 3018 | 0.6118 | 0.8191 |
| 0.2274 | 76.0 | 3059 | 0.5603 | 0.8268 |
| 0.2271 | 76.9938 | 3099 | 0.6312 | 0.8128 |
| 0.2023 | 77.9876 | 3139 | 0.6300 | 0.8123 |
| 0.1792 | 78.9814 | 3179 | 0.5776 | 0.8268 |
| 0.1796 | 80.0 | 3220 | 0.6266 | 0.8209 |
| 0.1994 | 80.9938 | 3260 | 0.5468 | 0.8228 |
| 0.1857 | 81.9876 | 3300 | 0.6080 | 0.8205 |
| 0.1636 | 82.9814 | 3340 | 0.7066 | 0.8160 |
| 0.1665 | 84.0 | 3381 | 0.6064 | 0.8277 |
| 0.183 | 84.9938 | 3421 | 0.6019 | 0.8273 |
| 0.1761 | 85.9876 | 3461 | 0.6420 | 0.8196 |
| 0.1673 | 86.9814 | 3501 | 0.6287 | 0.8255 |
| 0.1946 | 88.0 | 3542 | 0.6024 | 0.8228 |
| 0.1511 | 88.9938 | 3582 | 0.6774 | 0.8169 |
| 0.1828 | 89.9876 | 3622 | 0.6015 | 0.8255 |
| 0.1758 | 90.9814 | 3662 | 0.5969 | 0.8300 |
| 0.1797 | 92.0 | 3703 | 0.6464 | 0.8200 |
| 0.176 | 92.9938 | 3743 | 0.6287 | 0.8173 |
| 0.1616 | 93.9876 | 3783 | 0.6914 | 0.8209 |
| 0.1783 | 94.9814 | 3823 | 0.6511 | 0.8218 |
| 0.1492 | 96.0 | 3864 | 0.6382 | 0.8264 |
| 0.1578 | 96.9938 | 3904 | 0.6391 | 0.8241 |
| 0.1574 | 97.9876 | 3944 | 0.6505 | 0.8255 |
| 0.1556 | 98.9814 | 3984 | 0.6302 | 0.8241 |
| 0.1396 | 100.0 | 4025 | 0.6634 | 0.8155 |
| 0.1246 | 100.9938 | 4065 | 0.6633 | 0.8264 |
| 0.1592 | 101.9876 | 4105 | 0.6815 | 0.8160 |
| 0.1393 | 102.9814 | 4145 | 0.6418 | 0.8237 |
| 0.1722 | 104.0 | 4186 | 0.6322 | 0.8318 |
| 0.1499 | 104.9938 | 4226 | 0.6901 | 0.8196 |
| 0.1282 | 105.9876 | 4266 | 0.6544 | 0.8309 |
| 0.1428 | 106.9814 | 4306 | 0.6581 | 0.8291 |
| 0.1478 | 108.0 | 4347 | 0.6825 | 0.8291 |
| 0.1453 | 108.9938 | 4387 | 0.6873 | 0.8237 |
| 0.1216 | 109.9876 | 4427 | 0.7075 | 0.8223 |
| 0.1449 | 110.9814 | 4467 | 0.6929 | 0.8232 |
| 0.137 | 112.0 | 4508 | 0.7139 | 0.8205 |
| 0.1177 | 112.9938 | 4548 | 0.6981 | 0.8305 |
| 0.1005 | 113.9876 | 4588 | 0.6840 | 0.8205 |
| 0.1305 | 114.9814 | 4628 | 0.6747 | 0.8273 |
| 0.1192 | 116.0 | 4669 | 0.6886 | 0.8259 |
| 0.1067 | 116.9938 | 4709 | 0.6612 | 0.8209 |
| 0.1122 | 117.9876 | 4749 | 0.6500 | 0.8259 |
| 0.1295 | 118.9814 | 4789 | 0.6948 | 0.8232 |
| 0.1304 | 120.0 | 4830 | 0.6651 | 0.8309 |
| 0.1334 | 120.9938 | 4870 | 0.7304 | 0.8187 |
| 0.1104 | 121.9876 | 4910 | 0.7365 | 0.8205 |
| 0.1132 | 122.9814 | 4950 | 0.7270 | 0.8300 |
| 0.1115 | 124.0 | 4991 | 0.7062 | 0.8228 |
| 0.1079 | 124.9938 | 5031 | 0.7579 | 0.8268 |
| 0.1192 | 125.9876 | 5071 | 0.7321 | 0.8205 |
| 0.0994 | 126.9814 | 5111 | 0.7219 | 0.8291 |
| 0.111 | 128.0 | 5152 | 0.7064 | 0.8273 |
| 0.1089 | 128.9938 | 5192 | 0.7056 | 0.8282 |
| 0.1062 | 129.9876 | 5232 | 0.6814 | 0.8323 |
| 0.1046 | 130.9814 | 5272 | 0.6843 | 0.8309 |
| 0.1013 | 132.0 | 5313 | 0.6807 | 0.8327 |
| 0.0879 | 132.9938 | 5353 | 0.7080 | 0.8336 |
| 0.1114 | 133.9876 | 5393 | 0.7129 | 0.8241 |
| 0.1133 | 134.9814 | 5433 | 0.7376 | 0.8264 |
| 0.1067 | 136.0 | 5474 | 0.7579 | 0.8259 |
| 0.1104 | 136.9938 | 5514 | 0.7178 | 0.8291 |
| 0.0893 | 137.9876 | 5554 | 0.7315 | 0.8300 |
| 0.1074 | 138.9814 | 5594 | 0.7312 | 0.8318 |
| 0.0983 | 140.0 | 5635 | 0.7362 | 0.8286 |
| 0.1093 | 140.9938 | 5675 | 0.7493 | 0.8286 |
| 0.1166 | 141.9876 | 5715 | 0.7205 | 0.8286 |
| 0.0969 | 142.9814 | 5755 | 0.7494 | 0.8291 |
| 0.1174 | 144.0 | 5796 | 0.6960 | 0.8336 |
| 0.1044 | 144.9938 | 5836 | 0.7111 | 0.8282 |
| 0.0866 | 145.9876 | 5876 | 0.7152 | 0.8364 |
| 0.092 | 146.9814 | 5916 | 0.7078 | 0.8327 |
| 0.0883 | 148.0 | 5957 | 0.7182 | 0.8341 |
| 0.0824 | 148.9938 | 5997 | 0.7095 | 0.8359 |
| 0.0953 | 149.9876 | 6037 | 0.7324 | 0.8354 |
| 0.0896 | 150.9814 | 6077 | 0.7032 | 0.8400 |
| 0.1025 | 152.0 | 6118 | 0.6938 | 0.8323 |
| 0.0966 | 152.9938 | 6158 | 0.6991 | 0.8404 |
| 0.0891 | 153.9876 | 6198 | 0.7346 | 0.8354 |
| 0.0733 | 154.9814 | 6238 | 0.7340 | 0.8350 |
| 0.0944 | 156.0 | 6279 | 0.7525 | 0.8277 |
| 0.0934 | 156.9938 | 6319 | 0.7683 | 0.8305 |
| 0.0768 | 157.9876 | 6359 | 0.7692 | 0.8286 |
| 0.0918 | 158.9814 | 6399 | 0.7387 | 0.8413 |
| 0.0886 | 160.0 | 6440 | 0.7705 | 0.8327 |
| 0.0836 | 160.9938 | 6480 | 0.7491 | 0.8327 |
| 0.0968 | 161.9876 | 6520 | 0.7663 | 0.8246 |
| 0.0748 | 162.9814 | 6560 | 0.7460 | 0.8305 |
| 0.0696 | 164.0 | 6601 | 0.7491 | 0.8332 |
| 0.0853 | 164.9938 | 6641 | 0.7788 | 0.8327 |
| 0.0726 | 165.9876 | 6681 | 0.7440 | 0.8382 |
| 0.0715 | 166.9814 | 6721 | 0.7518 | 0.8373 |
| 0.0699 | 168.0 | 6762 | 0.7574 | 0.8354 |
| 0.0749 | 168.9938 | 6802 | 0.7564 | 0.8323 |
| 0.0842 | 169.9876 | 6842 | 0.7829 | 0.8286 |
| 0.0822 | 170.9814 | 6882 | 0.7753 | 0.8327 |
| 0.0807 | 172.0 | 6923 | 0.7611 | 0.8359 |
| 0.0752 | 172.9938 | 6963 | 0.7673 | 0.8345 |
| 0.075 | 173.9876 | 7003 | 0.7815 | 0.8364 |
| 0.0845 | 174.9814 | 7043 | 0.7745 | 0.8382 |
| 0.0827 | 176.0 | 7084 | 0.7683 | 0.8373 |
| 0.0883 | 176.9938 | 7124 | 0.7842 | 0.8327 |
| 0.0774 | 177.9876 | 7164 | 0.7736 | 0.8368 |
| 0.0817 | 178.9814 | 7204 | 0.7852 | 0.8341 |
| 0.0804 | 180.0 | 7245 | 0.7686 | 0.8314 |
| 0.0671 | 180.9938 | 7285 | 0.7767 | 0.8359 |
| 0.076 | 181.9876 | 7325 | 0.7715 | 0.8350 |
| 0.0572 | 182.9814 | 7365 | 0.7740 | 0.8286 |
| 0.0823 | 184.0 | 7406 | 0.7757 | 0.8341 |
| 0.0662 | 184.9938 | 7446 | 0.7720 | 0.8336 |
| 0.0805 | 185.9876 | 7486 | 0.7696 | 0.8368 |
| 0.0763 | 186.9814 | 7526 | 0.7768 | 0.8377 |
| 0.0711 | 188.0 | 7567 | 0.7720 | 0.8350 |
| 0.0576 | 188.9938 | 7607 | 0.7845 | 0.8314 |
| 0.0667 | 189.9876 | 7647 | 0.7749 | 0.8336 |
| 0.0631 | 190.9814 | 7687 | 0.7774 | 0.8350 |
| 0.0744 | 192.0 | 7728 | 0.7778 | 0.8327 |
| 0.0672 | 192.9938 | 7768 | 0.7862 | 0.8323 |
| 0.0738 | 193.9876 | 7808 | 0.7843 | 0.8345 |
| 0.0754 | 194.9814 | 7848 | 0.7850 | 0.8368 |
| 0.0887 | 196.0 | 7889 | 0.7835 | 0.8364 |
| 0.0898 | 196.9938 | 7929 | 0.7810 | 0.8373 |
| 0.0543 | 197.9876 | 7969 | 0.7801 | 0.8364 |
| 0.0605 | 198.7578 | 8000 | 0.7803 | 0.8364 |

### Framework versions

- Transformers 4.44.0
- Pytorch 1.12.1+cu113
- Datasets 2.21.0
- Tokenizers 0.19.1
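To reproduce this environment, the versions above can be pinned at install time. This is a sketch assuming a CUDA 11.3 machine; the `+cu113` PyTorch build is served from PyTorch's cu113 wheel index rather than PyPI.

```shell
# Pin the exact framework versions reported in this card.
pip install "transformers==4.44.0" "datasets==2.21.0" "tokenizers==0.19.1"
# PyTorch 1.12.1 built for CUDA 11.3 (assumes a matching NVIDIA driver).
pip install "torch==1.12.1+cu113" --extra-index-url https://download.pytorch.org/whl/cu113
```
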
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:46a76a83a3cfb334c7f0c65ee6b8a255648c95d549f97d6a9f76f90860fdfd69
+oid sha256:37bbdc80cb88e7c5896e4264ae4859bc93f29b8007596a030038fcadae116097
 size 110355136
runs/Aug28_17-53-01_jrcai27/events.out.tfevents.1724856787.jrcai27.14540.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:f8a203cd3569f8a9963d499e16830f46eb823e590d5101142756447b16978abc
-size 238208
+oid sha256:db0372de5b736d2719f4b62b222e279ecdad00f99f1ddf85a8e395aec68daa2d
+size 238885