wav2vec2-large-ln-10hr-v1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are typically computed follows the list):

  • Loss: 0.5339
  • Model Preparation Time: 0.0079
  • WER: 0.2025
  • CER: 0.0593
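
For context, WER (word error rate) and CER (character error rate) are edit-distance rates over words and characters respectively. A minimal sketch of how such scores are typically computed with the Hugging Face `evaluate` library; the transcript strings below are placeholders, not data from this model's evaluation set:

```python
# Minimal sketch: computing WER/CER with the Hugging Face `evaluate` library.
# The reference and prediction strings are placeholders, not evaluation data.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["the cat sat on the mat"]   # hypothetical ground-truth transcript
predictions = ["the cat sat on a mat"]    # hypothetical model output

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```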

Model description

More information needed

Intended uses & limitations

More information needed
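
Although intended uses are not documented, the checkpoint is a standard CTC speech-recognition model and can be loaded with the usual `transformers` API. A hedged sketch with greedy decoding; the audio path is a placeholder, and wav2vec2 models expect 16 kHz mono input:

```python
# Sketch: greedy-decoding transcription with this checkpoint.
# "example.wav" is a placeholder; audio is resampled to the expected 16 kHz.
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "KasuleTrevor/wav2vec2-large-ln-10hr-v1"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

speech, _ = librosa.load("example.wav", sr=16_000)  # placeholder audio file
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```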

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 100
  • mixed_precision_training: Native AMP
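
For reference, a sketch of how these values map onto `transformers.TrainingArguments`; only the hyperparameters listed above are taken from this card, and `output_dir` is a placeholder:

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
# Values are copied from the list; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-ln-10hr-v1",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective total train batch size: 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=500,
    num_train_epochs=100,
    fp16=True,                       # Native AMP mixed-precision training
)
```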

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Model Preparation Time | WER    | CER    |
|:-------------:|:-------:|:-----:|:---------------:|:----------------------:|:------:|:------:|
| 12.3976       | 0.9956  | 112   | 4.7441          | 0.0079                 | 1.0    | 1.0    |
| 3.5953        | 2.0     | 225   | 2.9942          | 0.0079                 | 1.0    | 1.0    |
| 2.8828        | 2.9956  | 337   | 2.7593          | 0.0079                 | 1.0    | 1.0    |
| 2.5824        | 4.0     | 450   | 1.6271          | 0.0079                 | 1.0    | 0.4698 |
| 0.9071        | 4.9956  | 562   | 0.5144          | 0.0079                 | 0.4170 | 0.1076 |
| 0.5255        | 6.0     | 675   | 0.3454          | 0.0079                 | 0.2987 | 0.0815 |
| 0.3912        | 6.9956  | 787   | 0.2978          | 0.0079                 | 0.2514 | 0.0704 |
| 0.309         | 8.0     | 900   | 0.2725          | 0.0079                 | 0.2530 | 0.0682 |
| 0.258         | 8.9956  | 1012  | 0.2571          | 0.0079                 | 0.2393 | 0.0645 |
| 0.2186        | 10.0    | 1125  | 0.2590          | 0.0079                 | 0.2319 | 0.0618 |
| 0.1888        | 10.9956 | 1237  | 0.2425          | 0.0079                 | 0.2171 | 0.0589 |
| 0.1575        | 12.0    | 1350  | 0.2335          | 0.0079                 | 0.2085 | 0.0545 |
| 0.1427        | 12.9956 | 1462  | 0.2418          | 0.0079                 | 0.2013 | 0.0554 |
| 0.1242        | 14.0    | 1575  | 0.2564          | 0.0079                 | 0.1966 | 0.0533 |
| 0.1137        | 14.9956 | 1687  | 0.2448          | 0.0079                 | 0.1865 | 0.0512 |
| 0.1044        | 16.0    | 1800  | 0.2428          | 0.0079                 | 0.1957 | 0.0539 |
| 0.0901        | 16.9956 | 1912  | 0.2610          | 0.0079                 | 0.1913 | 0.0519 |
| 0.085         | 18.0    | 2025  | 0.2505          | 0.0079                 | 0.1807 | 0.0495 |
| 0.0789        | 18.9956 | 2137  | 0.2709          | 0.0079                 | 0.1871 | 0.0522 |
| 0.0703        | 20.0    | 2250  | 0.2533          | 0.0079                 | 0.1820 | 0.0505 |
| 0.0674        | 20.9956 | 2362  | 0.2647          | 0.0079                 | 0.1792 | 0.0491 |
| 0.0625        | 22.0    | 2475  | 0.2555          | 0.0079                 | 0.1772 | 0.0504 |
| 0.0596        | 22.9956 | 2587  | 0.2550          | 0.0079                 | 0.1795 | 0.0496 |
| 0.0539        | 24.0    | 2700  | 0.2657          | 0.0079                 | 0.1797 | 0.0509 |
| 0.0538        | 24.9956 | 2812  | 0.2602          | 0.0079                 | 0.1913 | 0.0516 |
| 0.0537        | 26.0    | 2925  | 0.2546          | 0.0079                 | 0.1740 | 0.0475 |
| 0.0471        | 26.9956 | 3037  | 0.2634          | 0.0079                 | 0.1813 | 0.0499 |
| 0.0447        | 28.0    | 3150  | 0.2699          | 0.0079                 | 0.1751 | 0.0480 |
| 0.0425        | 28.9956 | 3262  | 0.2721          | 0.0079                 | 0.1701 | 0.0470 |
| 0.0404        | 30.0    | 3375  | 0.2705          | 0.0079                 | 0.1733 | 0.0484 |
| 0.0372        | 30.9956 | 3487  | 0.2754          | 0.0079                 | 0.1749 | 0.0482 |
| 0.0345        | 32.0    | 3600  | 0.2923          | 0.0079                 | 0.1747 | 0.0480 |
| 0.0328        | 32.9956 | 3712  | 0.2876          | 0.0079                 | 0.1680 | 0.0473 |
| 0.034         | 34.0    | 3825  | 0.2867          | 0.0079                 | 0.1704 | 0.0479 |
| 0.032         | 34.9956 | 3937  | 0.2732          | 0.0079                 | 0.1626 | 0.0450 |
| 0.0306        | 36.0    | 4050  | 0.2827          | 0.0079                 | 0.1632 | 0.0456 |
| 0.0338        | 36.9956 | 4162  | 0.2774          | 0.0079                 | 0.1707 | 0.0467 |
| 0.0305        | 38.0    | 4275  | 0.2900          | 0.0079                 | 0.1726 | 0.0475 |
| 0.0299        | 38.9956 | 4387  | 0.2825          | 0.0079                 | 0.1614 | 0.0459 |
| 0.0268        | 40.0    | 4500  | 0.2827          | 0.0079                 | 0.1599 | 0.0449 |
| 0.0275        | 40.9956 | 4612  | 0.3024          | 0.0079                 | 0.1599 | 0.0449 |
| 0.0244        | 42.0    | 4725  | 0.2887          | 0.0079                 | 0.1594 | 0.0441 |
| 0.0208        | 42.9956 | 4837  | 0.2908          | 0.0079                 | 0.1587 | 0.0441 |
| 0.02          | 44.0    | 4950  | 0.2938          | 0.0079                 | 0.1595 | 0.0447 |
| 0.0216        | 44.9956 | 5062  | 0.2907          | 0.0079                 | 0.1563 | 0.0443 |
| 0.0213        | 46.0    | 5175  | 0.2965          | 0.0079                 | 0.1608 | 0.0444 |
| 0.0202        | 46.9956 | 5287  | 0.2920          | 0.0079                 | 0.1577 | 0.0435 |
| 0.0201        | 48.0    | 5400  | 0.3040          | 0.0079                 | 0.1631 | 0.0451 |
| 0.021         | 48.9956 | 5512  | 0.2833          | 0.0079                 | 0.1594 | 0.0446 |
| 0.0212        | 50.0    | 5625  | 0.2892          | 0.0079                 | 0.1547 | 0.0428 |
| 0.0183        | 50.9956 | 5737  | 0.2885          | 0.0079                 | 0.1549 | 0.0431 |
| 0.019         | 52.0    | 5850  | 0.2776          | 0.0079                 | 0.1586 | 0.0444 |
| 0.0189        | 52.9956 | 5962  | 0.2799          | 0.0079                 | 0.1541 | 0.0423 |
| 0.0187        | 54.0    | 6075  | 0.3060          | 0.0079                 | 0.1540 | 0.0429 |
| 0.0157        | 54.9956 | 6187  | 0.2955          | 0.0079                 | 0.1505 | 0.0424 |
| 0.0173        | 56.0    | 6300  | 0.2911          | 0.0079                 | 0.1551 | 0.0427 |
| 0.0153        | 56.9956 | 6412  | 0.2895          | 0.0079                 | 0.1595 | 0.0434 |
| 0.0161        | 58.0    | 6525  | 0.2899          | 0.0079                 | 0.1518 | 0.0417 |
| 0.0149        | 58.9956 | 6637  | 0.2862          | 0.0079                 | 0.1526 | 0.0421 |
| 0.0154        | 60.0    | 6750  | 0.2953          | 0.0079                 | 0.1467 | 0.0412 |
| 0.014         | 60.9956 | 6862  | 0.2970          | 0.0079                 | 0.1487 | 0.0417 |
| 0.011         | 62.0    | 6975  | 0.3068          | 0.0079                 | 0.1481 | 0.0418 |
| 0.0137        | 62.9956 | 7087  | 0.3010          | 0.0079                 | 0.1507 | 0.0422 |
| 0.0122        | 64.0    | 7200  | 0.2963          | 0.0079                 | 0.1508 | 0.0419 |
| 0.0118        | 64.9956 | 7312  | 0.2989          | 0.0079                 | 0.1524 | 0.0421 |
| 0.0116        | 66.0    | 7425  | 0.3032          | 0.0079                 | 0.1492 | 0.0410 |
| 0.0111        | 66.9956 | 7537  | 0.3148          | 0.0079                 | 0.1470 | 0.0417 |
| 0.0112        | 68.0    | 7650  | 0.3003          | 0.0079                 | 0.1495 | 0.0415 |
| 0.0103        | 68.9956 | 7762  | 0.3057          | 0.0079                 | 0.1497 | 0.0415 |
| 0.0096        | 70.0    | 7875  | 0.3050          | 0.0079                 | 0.1467 | 0.0409 |
| 0.0097        | 70.9956 | 7987  | 0.3085          | 0.0079                 | 0.1475 | 0.0408 |
| 0.0091        | 72.0    | 8100  | 0.3008          | 0.0079                 | 0.1422 | 0.0404 |
| 0.0077        | 72.9956 | 8212  | 0.3051          | 0.0079                 | 0.1437 | 0.0405 |
| 0.0091        | 74.0    | 8325  | 0.3055          | 0.0079                 | 0.1445 | 0.0410 |
| 0.008         | 74.9956 | 8437  | 0.3016          | 0.0079                 | 0.1444 | 0.0407 |
| 0.0098        | 76.0    | 8550  | 0.3001          | 0.0079                 | 0.1422 | 0.0403 |
| 0.0092        | 76.9956 | 8662  | 0.3062          | 0.0079                 | 0.1439 | 0.0403 |
| 0.0074        | 78.0    | 8775  | 0.3063          | 0.0079                 | 0.1428 | 0.0401 |
| 0.0083        | 78.9956 | 8887  | 0.3064          | 0.0079                 | 0.1448 | 0.0407 |
| 0.0076        | 80.0    | 9000  | 0.3033          | 0.0079                 | 0.1436 | 0.0402 |
| 0.0078        | 80.9956 | 9112  | 0.3058          | 0.0079                 | 0.1429 | 0.0402 |
| 0.0078        | 82.0    | 9225  | 0.3078          | 0.0079                 | 0.1416 | 0.0399 |
| 0.0071        | 82.9956 | 9337  | 0.3098          | 0.0079                 | 0.1438 | 0.0402 |
| 0.0075        | 84.0    | 9450  | 0.3101          | 0.0079                 | 0.1462 | 0.0405 |
| 0.0071        | 84.9956 | 9562  | 0.3073          | 0.0079                 | 0.1457 | 0.0404 |
| 0.0086        | 86.0    | 9675  | 0.3063          | 0.0079                 | 0.1436 | 0.0402 |
| 0.0068        | 86.9956 | 9787  | 0.3058          | 0.0079                 | 0.1432 | 0.0402 |
| 0.0069        | 88.0    | 9900  | 0.3037          | 0.0079                 | 0.1434 | 0.0401 |
| 0.0073        | 88.9956 | 10012 | 0.3053          | 0.0079                 | 0.1425 | 0.0400 |
| 0.0073        | 90.0    | 10125 | 0.3050          | 0.0079                 | 0.1425 | 0.0398 |
| 0.0063        | 90.9956 | 10237 | 0.3074          | 0.0079                 | 0.1436 | 0.0400 |
| 0.0072        | 92.0    | 10350 | 0.3067          | 0.0079                 | 0.1436 | 0.0400 |
| 0.0066        | 92.9956 | 10462 | 0.3065          | 0.0079                 | 0.1432 | 0.0399 |
| 0.0065        | 94.0    | 10575 | 0.3069          | 0.0079                 | 0.1437 | 0.0400 |
| 0.0075        | 94.9956 | 10687 | 0.3068          | 0.0079                 | 0.1430 | 0.0399 |
| 0.0064        | 96.0    | 10800 | 0.3068          | 0.0079                 | 0.1430 | 0.0398 |
| 0.0064        | 96.9956 | 10912 | 0.3070          | 0.0079                 | 0.1434 | 0.0399 |
| 0.0067        | 98.0    | 11025 | 0.3069          | 0.0079                 | 0.1431 | 0.0398 |
| 0.0064        | 98.9956 | 11137 | 0.3069          | 0.0079                 | 0.1434 | 0.0399 |
| 0.0076        | 99.5556 | 11200 | 0.3069          | 0.0079                 | 0.1434 | 0.0399 |

Framework versions

  • Transformers 4.43.1
  • PyTorch 2.2.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1