- dataset: opus
- model: transformer
- source language(s): bul bul_Latn mkd slv
- target language(s): eng
- pre-processing: normalization + SentencePiece (spm32k,spm32k)
- download: opus-2020-06-28.zip
- test set translations: opus-2020-06-28.test.txt
- test set scores: opus-2020-06-28.eval.txt
| testset | BLEU | chr-F |
|---------|------|-------|
| Tatoeba-test.bul-eng.bul.eng | 54.7 | 0.693 |
| Tatoeba-test.mkd-eng.mkd.eng | 54.0 | 0.676 |
| Tatoeba-test.multi.eng | 52.8 | 0.671 |
| Tatoeba-test.slv-eng.slv.eng | 25.3 | 0.410 |
- dataset: opus
- model: transformer
- source language(s): bul bul_Latn mkd slv
- target language(s): eng
- pre-processing: normalization + SentencePiece (spm32k,spm32k)
- download: opus-2020-07-04.zip
- test set translations: opus-2020-07-04.test.txt
- test set scores: opus-2020-07-04.eval.txt
| testset | BLEU | chr-F |
|---------|------|-------|
| Tatoeba-test.bul-eng.bul.eng | 53.9 | 0.686 |
| Tatoeba-test.mkd-eng.mkd.eng | 52.5 | 0.662 |
| Tatoeba-test.multi.eng | 50.5 | 0.648 |
| Tatoeba-test.slv-eng.slv.eng | 21.8 | 0.374 |
- dataset: opus
- model: transformer
- source language(s): bos_Latn bul bul_Latn hrv mkd slv srp_Cyrl srp_Latn
- target language(s): eng
- pre-processing: normalization + SentencePiece (spm32k,spm32k)
- download: opus-2020-07-27.zip
- test set translations: opus-2020-07-27.test.txt
- test set scores: opus-2020-07-27.eval.txt
| testset | BLEU | chr-F |
|---------|------|-------|
| Tatoeba-test.bul-eng.bul.eng | 53.9 | 0.686 |
| Tatoeba-test.hbs-eng.hbs.eng | 54.8 | 0.693 |
| Tatoeba-test.mkd-eng.mkd.eng | 53.4 | 0.672 |
| Tatoeba-test.multi.eng | 52.5 | 0.668 |
| Tatoeba-test.slv-eng.slv.eng | 24.9 | 0.405 |
- dataset: opus2m
- model: transformer
- source language(s): bos_Latn bul bul_Latn hrv mkd slv srp_Cyrl srp_Latn
- target language(s): eng
- pre-processing: normalization + SentencePiece (spm32k,spm32k)
- download: opus2m-2020-08-01.zip
- test set translations: opus2m-2020-08-01.test.txt
- test set scores: opus2m-2020-08-01.eval.txt
| testset | BLEU | chr-F |
|---------|------|-------|
| Tatoeba-test.bul-eng.bul.eng | 54.9 | 0.693 |
| Tatoeba-test.hbs-eng.hbs.eng | 55.7 | 0.700 |
| Tatoeba-test.mkd-eng.mkd.eng | 54.6 | 0.681 |
| Tatoeba-test.multi.eng | 53.6 | 0.676 |
| Tatoeba-test.slv-eng.slv.eng | 25.6 | 0.407 |
- dataset: opus4m
- model: transformer
- source language(s): bos_Latn bul bul_Latn hrv mkd slv srp_Cyrl srp_Latn
- target language(s): eng
- pre-processing: normalization + SentencePiece (spm32k,spm32k)
- download: opus4m-2020-08-12.zip
- test set translations: opus4m-2020-08-12.test.txt
- test set scores: opus4m-2020-08-12.eval.txt
| testset | BLEU | chr-F |
|---------|------|-------|
| Tatoeba-test.bul-eng.bul.eng | 55.4 | 0.697 |
| Tatoeba-test.hbs-eng.hbs.eng | 55.6 | 0.701 |
| Tatoeba-test.mkd-eng.mkd.eng | 54.7 | 0.682 |
| Tatoeba-test.multi.eng | 53.8 | 0.677 |
| Tatoeba-test.slv-eng.slv.eng | 25.0 | 0.408 |
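With five releases listed above, the Tatoeba-test.multi.eng row is the most direct way to compare them, since it covers all source languages at once. A minimal stdlib-only sketch (the release names and BLEU scores are copied verbatim from the tables above; the table layout in the string is just an illustration, not the format of the `.eval.txt` files):

```python
# Multi-language Tatoeba BLEU per release, copied from the tables above.
scores = """\
opus-2020-06-28 | 52.8
opus-2020-07-04 | 50.5
opus-2020-07-27 | 52.5
opus2m-2020-08-01 | 53.6
opus4m-2020-08-12 | 53.8
"""

def best_release(table: str) -> tuple[str, float]:
    """Return (release, BLEU) for the highest multi-test BLEU score."""
    best = ("", float("-inf"))
    for line in table.strip().splitlines():
        release, bleu = (field.strip() for field in line.split("|"))
        if float(bleu) > best[1]:
            best = (release, float(bleu))
    return best

print(best_release(scores))  # -> ('opus4m-2020-08-12', 53.8)
```

On these numbers the opus4m release is the strongest overall, although slv-eng peaks slightly earlier (25.6 BLEU in opus2m).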