Efik ↔ English Translation Model

This model provides machine translation between English and Efik. It was fine-tuned from an NLLB checkpoint on more than 18,000 parallel English–Efik sentence pairs and can be used for direct translation or as a component in multilingual NLP pipelines.
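For pipeline integration, the checkpoint can also be loaded through the transformers translation pipeline. The snippet below is a minimal sketch, not taken from the original card: it assumes the tokenizer keeps the standard NLLB src_lang/tgt_lang handling and that ibo_Latn is the tag associated with Efik in this fine-tune, as in the quick-start example further down.

from transformers import pipeline

# Minimal sketch: load the checkpoint as a translation pipeline (English → Efik).
# src_lang / tgt_lang follow the standard NLLB convention; whether this fine-tune
# honours them, and whether "ibo_Latn" is the right tag for Efik, are assumptions.
translator = pipeline(
    "translation",
    model="offiongbassey/efik-mt",
    src_lang="eng_Latn",
    tgt_lang="ibo_Latn",
)

result = translator("Good morning, how are you?", max_length=128)
print(result[0]["translation_text"])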

Uses

  • Translate text between English and Efik.
  • Assist in educational or localization projects involving Efik.
  • Support research in low-resource language NLP.

Limitations

  • Due to limited training data, performance may decrease for long, complex, or domain-specific text; a simple sentence-level workaround is sketched below.
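One practical mitigation, not part of the original card, is to split long input into sentences and translate them one at a time. A minimal sketch, reusing the source-tag convention from the quick-start example in the next section (the sentence splitter here is a naive placeholder):

import re

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("offiongbassey/efik-mt")
model = AutoModelForSeq2SeqLM.from_pretrained("offiongbassey/efik-mt")

def translate_long_text(text, src_tag="eng_Latn"):
    # Naive sentence split on end punctuation; use a proper splitter in practice.
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    translations = []
    for sentence in sentences:
        inputs = tokenizer(f"{src_tag} {sentence}", return_tensors="pt")
        generated = model.generate(**inputs, max_length=128)
        translations.append(tokenizer.decode(generated[0], skip_special_tokens=True))
    return " ".join(translations)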

How to Get Started

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("offiongbassey/efik-mt")
model = AutoModelForSeq2SeqLM.from_pretrained("offiongbassey/efik-mt")

# English → Efik
# The input is prefixed with its source-language tag, as in the original card.
text = "My child is very sick and I need to take him to the hospital for treatment."
inputs = tokenizer(f"eng_Latn {text}", return_tensors="pt")
outputs = model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

# Efik → English
# "ibo_Latn" is the tag the card uses for Efik-side input; it is presumably the
# code this fine-tune was trained with.
text = "Okon ama adaha utom tọñọ usenubọk."
inputs = tokenizer(f"ibo_Latn {text}", return_tensors="pt")
outputs = model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
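Stock NLLB checkpoints normally control the translation direction through the tokenizer's src_lang setting and a forced target-language token at generation time, rather than a tag prepended to the text. Whether this fine-tune still responds to that mechanism is an assumption; a sketch of the conventional pattern, again using ibo_Latn as the Efik tag, would look like this:

# Conventional NLLB-style direction control (sketch; assumes standard behaviour).
tokenizer = AutoTokenizer.from_pretrained("offiongbassey/efik-mt", src_lang="eng_Latn")

text = "My child is very sick and I need to take him to the hospital for treatment."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("ibo_Latn"),  # target language
    max_length=128,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))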

Training Details

  • Architecture: NLLB
  • Model size: 0.6B parameters (F32)
  • Epochs trained: 8
  • Learning rate: 5e-05
  • BLEU scores:
    • EN → EF: 29.58
    • EF → EN: 32.14
  • chrF scores:
    • EN → EF: 54.29
    • EF → EN: 48.78
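The BLEU and chrF figures above are corpus-level scores; given a held-out test set (not distributed with the model), they could be recomputed with sacrebleu along these lines (the hypothesis and reference lists below are hypothetical placeholders):

import sacrebleu

# Hypothetical placeholders: one model output and one gold reference per segment.
hypotheses = ["model output for segment 1", "model output for segment 2"]
references = [["gold reference for segment 1", "gold reference for segment 2"]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
chrf = sacrebleu.corpus_chrf(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}  chrF: {chrf.score:.2f}")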