# 02f96e631041e28f81661cdc92545008
This model is a fine-tuned version of google/mt5-large on the Helsinki-NLP/opus_books [es-fr] dataset. It achieves the following results on the evaluation set:
- Loss: 1.1543
- Data Size: 1.0
- Epoch Runtime: 563.5054 s
- Bleu: 14.6551
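The checkpoint can be loaded with the standard `transformers` seq2seq API. Below is a minimal inference sketch, assuming the model is published under `contemmcm/02f96e631041e28f81661cdc92545008`; the card does not document the expected input format (e.g. whether a task prefix was used during preprocessing), so plain Spanish source text is assumed.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "contemmcm/02f96e631041e28f81661cdc92545008"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate Spanish -> French (assumption: raw sentence pairs, no task prefix).
inputs = tokenizer("La vida es sueño.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```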
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
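Although the card gives no details, the dataset named above can be inspected directly. A minimal sketch, assuming the default layout of `Helsinki-NLP/opus_books` (a single `train` split of `translation` dicts keyed by language code); how the evaluation set was split off is not documented here.

```python
from datasets import load_dataset

# opus_books ships only a "train" split; the evaluation split used for the
# results above was presumably carved out manually (details not given).
ds = load_dataset("Helsinki-NLP/opus_books", "es-fr")
pair = ds["train"][0]["translation"]
print(pair["es"], "->", pair["fr"])
```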
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a sketch mapping them onto `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
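As a reconstruction (not the original training script), these settings map onto `transformers.Seq2SeqTrainingArguments` roughly as follows; `output_dir` is a hypothetical placeholder, and the 4-GPU total batch size of 32 comes from launching with `torchrun` or `accelerate launch` rather than from any argument below.

```python
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="mt5-large-opus-books-es-fr",  # hypothetical path
    learning_rate=5e-5,
    per_device_train_batch_size=8,   # x 4 GPUs = total train batch size 32
    per_device_eval_batch_size=8,    # x 4 GPUs = total eval batch size 32
    seed=42,
    lr_scheduler_type="constant",
    num_train_epochs=50,
    optim="adamw_torch",             # betas=(0.9, 0.999), eps=1e-08 are the defaults
    predict_with_generate=True,      # generate during eval so BLEU can be computed
)
```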
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size (fraction) | Epoch Runtime (s) | Bleu |
|---|---|---|---|---|---|---|
| No log | 0 | 0 | 25.1392 | 0 | 43.4763 | 0.0162 |
| No log | 1 | 1407 | 22.3288 | 0.0078 | 47.4389 | 0.0117 |
| No log | 2 | 2814 | 7.0068 | 0.0156 | 53.4653 | 0.0579 |
| 0.3836 | 3 | 4221 | 4.1310 | 0.0312 | 62.7488 | 0.2417 |
| 3.1645 | 4 | 5628 | 1.8861 | 0.0625 | 80.6597 | 1.2357 |
| 2.1444 | 5 | 7035 | 1.5645 | 0.125 | 113.7415 | 8.2465 |
| 1.8525 | 6 | 8442 | 1.4423 | 0.25 | 178.3995 | 9.7675 |
| 1.7316 | 7 | 9849 | 1.3651 | 0.5 | 308.0960 | 11.2873 |
| 1.5217 | 8 | 11256 | 1.2787 | 1.0 | 576.8079 | 12.5978 |
| 1.3982 | 9 | 12663 | 1.2206 | 1.0 | 563.1243 | 13.0822 |
| 1.3671 | 10 | 14070 | 1.1951 | 1.0 | 575.6702 | 13.5539 |
| 1.26 | 11 | 15477 | 1.1747 | 1.0 | 593.7553 | 13.6432 |
| 1.1869 | 12 | 16884 | 1.1576 | 1.0 | 565.6605 | 14.0193 |
| 1.1627 | 13 | 18291 | 1.1565 | 1.0 | 563.8033 | 14.1323 |
| 1.1296 | 14 | 19698 | 1.1461 | 1.0 | 562.7275 | 14.1882 |
| 1.0725 | 15 | 21105 | 1.1423 | 1.0 | 566.5551 | 14.4175 |
| 1.0496 | 16 | 22512 | 1.1379 | 1.0 | 564.1802 | 14.5031 |
| 0.9903 | 17 | 23919 | 1.1422 | 1.0 | 570.8221 | 14.4280 |
| 0.9572 | 18 | 25326 | 1.1473 | 1.0 | 564.4789 | 14.6660 |
| 0.923 | 19 | 26733 | 1.1616 | 1.0 | 564.9559 | 14.5670 |
| 0.8914 | 20 | 28140 | 1.1543 | 1.0 | 563.5054 | 14.6551 |
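Training was configured for 50 epochs, but the logged results stop at epoch 20, whose metrics match the headline evaluation results above. The card does not say which BLEU implementation produced these scores; a common choice, assumed here, is sacreBLEU via the `evaluate` library, computed on detokenized generations:

```python
import evaluate

bleu = evaluate.load("sacrebleu")
# predictions: a list of generated translations;
# references: one list of reference translations per prediction
predictions = ["Il est parti tôt ce matin."]
references = [["Il est parti tôt ce matin."]]
print(bleu.compute(predictions=predictions, references=references)["score"])
```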
### Framework versions
- Transformers 4.57.0
- PyTorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.1