# Af-En_update_pc
This model is a fine-tuned version of Helsinki-NLP/opus-mt-af-en on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.8319
- Model Preparation Time: 0.0
- Bleu: 50.6655
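
The snippet below is a minimal usage sketch, assuming the checkpoint is published on the Hugging Face Hub under the model id `kabelomalapane/Af-En_update_pc` and loaded with the standard `transformers` auto classes:

```python
# Minimal inference sketch; the model id is assumed from this repository's name.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "kabelomalapane/Af-En_update_pc"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate an Afrikaans sentence into English.
inputs = tokenizer("Ek is lief vir jou.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```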
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
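
As a hedged sketch (not the exact training script), the hyperparameters above roughly correspond to the following `Seq2SeqTrainingArguments`; `output_dir`, the evaluation strategy, and `predict_with_generate` are assumptions inferred from the per-epoch metrics below:

```python
# Approximate training configuration; output_dir and evaluation settings are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="Af-En_update_pc",      # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",               # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                         # native AMP mixed precision
    eval_strategy="epoch",             # assumed; metrics are reported once per epoch
    predict_with_generate=True,        # assumed; required to compute BLEU during evaluation
)
```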
### Training results
| Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Bleu |
|---|---|---|---|---|---|
| 1.5372 | 1.0 | 5105 | 1.9631 | 0.0 | 36.5658 |
| 1.2529 | 2.0 | 10210 | 1.8745 | 0.0 | 46.3922 |
| 1.0324 | 3.0 | 15315 | 1.7872 | 0.0 | 47.6135 |
| 0.8459 | 4.0 | 20420 | 1.6956 | 0.0 | 49.7908 |
| 0.6545 | 5.0 | 25525 | 1.6746 | 0.0 | 50.3903 |
| 0.5743 | 6.0 | 30630 | 1.6951 | 0.0 | 50.8569 |
| 0.4897 | 7.0 | 35735 | 1.6863 | 0.0 | 50.6211 |
| 0.4328 | 8.0 | 40840 | 1.7113 | 0.0 | 50.6402 |
| 0.3955 | 9.0 | 45945 | 1.7482 | 0.0 | 50.7567 |
| 0.3565 | 10.0 | 51050 | 1.7655 | 0.0 | 50.8257 |
| 0.3275 | 11.0 | 56155 | 1.7988 | 0.0 | 50.5494 |
| 0.2696 | 12.0 | 61260 | 1.8035 | 0.0 | 50.6981 |
| 0.2635 | 13.0 | 66365 | 1.8194 | 0.0 | 50.6104 |
| 0.2526 | 14.0 | 71470 | 1.8190 | 0.0 | 50.6222 |
| 0.2338 | 15.0 | 76575 | 1.8319 | 0.0 | 50.6658 |
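
The BLEU column could be produced during evaluation along the lines of the sketch below, which scores decoded predictions against decoded references with the `evaluate` library's sacreBLEU metric; the helper name and the -100 label-padding convention are assumptions, not taken from the original training script:

```python
# Hedged sketch of BLEU computation for generated translations.
import evaluate
import numpy as np

bleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds, tokenizer):
    """Decode generated ids and reference ids, then score with sacreBLEU.
    In practice this would be bound to the tokenizer (e.g. via functools.partial)
    before being passed to Seq2SeqTrainer as compute_metrics."""
    preds, labels = eval_preds
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    # Labels are padded with -100 for loss masking; restore pad tokens before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = bleu.compute(
        predictions=decoded_preds,
        references=[[ref] for ref in decoded_labels],
    )
    return {"bleu": result["score"]}
```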
### Framework versions
- Transformers 4.51.3
- Pytorch 2.6.0
- Datasets 3.5.0
- Tokenizers 0.21.1