minhtriphan committed
Commit c4b66fa • Parent(s): 1563254
Update README.md

README.md CHANGED
```diff
@@ -37,9 +37,9 @@ The experiments are implemented with an NVIDIA A100-SXM4-40GB. Batch size of 1.
 https://github.com/minhtriphan/LongFinBERT-base/tree/main
 
 # Training configuration
-* The
+* The models are trained with 8 epochs using the Masked Language Modeling (MLM) task;
 * The masking probability is 15%;
-* Details about the training configuration are given in the log
+* Details about the training configuration are given in the log files named `train_v1_0803_1144_seed_1.log` and `train_v2_0831_1829_seed_1.log`;
 
 # Versions
 There are 2 versions of the pre-trained model,
```
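The updated README states the models are pre-trained with the MLM task at a 15% masking probability. A minimal sketch of that masking step is shown below, assuming the standard BERT-style 80/10/10 split (80% replaced by `[MASK]`, 10% by a random token, 10% left unchanged); the function and parameter names are illustrative and not taken from the repository.

```python
import random

def mask_tokens(token_ids, mask_id, vocab_size, mlm_prob=0.15, seed=0):
    """Illustrative MLM masking (hypothetical helper, not from the repo):
    each position becomes a prediction target with probability mlm_prob;
    targeted tokens follow the standard 80/10/10 BERT recipe."""
    rng = random.Random(seed)
    inputs = list(token_ids)
    labels = [-100] * len(token_ids)  # -100 = position ignored by the loss
    for i, tok in enumerate(token_ids):
        if rng.random() < mlm_prob:
            labels[i] = tok           # remember the original token as the target
            r = rng.random()
            if r < 0.8:
                inputs[i] = mask_id   # 80%: replace with [MASK]
            elif r < 0.9:
                inputs[i] = rng.randrange(vocab_size)  # 10%: random token
            # remaining 10%: keep the original token unchanged
    return inputs, labels

ids = list(range(100, 120))
masked, labels = mask_tokens(ids, mask_id=103, vocab_size=30000)
```

With a 15% probability, roughly 3 of the 20 tokens above become prediction targets on average; all other positions keep their original token and carry the ignore label.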