art-eng

Translation models from constructed/artificial languages (art) to English (eng), trained on OPUS data. Three releases are listed below, each with its preprocessing setup, download artifacts, and Tatoeba benchmark scores.

opus-2020-06-28.zip

  • dataset: opus
  • model: transformer
  • source language(s): afh_Latn avk_Latn bzt_Latn dws_Latn epo ido ido_Latn ile_Latn ina_Latn jbo jbo_Cyrl jbo_Latn ldn_Latn lfn_Cyrl lfn_Latn nov_Latn qya qya_Latn sjn_Latn tlh_Latn tzl tzl_Latn vol_Latn
  • target language(s): eng
  • pre-processing: normalization + SentencePiece (spm32k,spm32k); a usage sketch follows this list
  • download: opus-2020-06-28.zip
  • test set translations: opus-2020-06-28.test.txt
  • test set scores: opus-2020-06-28.eval.txt
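
The bullets above describe the full pipeline for this release: normalized input, spm32k SentencePiece segmentation on both sides, and a Marian transformer. As a minimal usage sketch, a converted Hugging Face port of this language pair can be driven as follows; the model id Helsinki-NLP/opus-mt-art-en, and the existence of such a port for this exact release, are assumptions rather than facts stated in this card.

```python
# Minimal sketch, assuming a converted Hugging Face port of an art-eng release
# is published as "Helsinki-NLP/opus-mt-art-en" (id and release are assumptions).
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-art-en"               # assumed id of the converted checkpoint
tokenizer = MarianTokenizer.from_pretrained(model_name)  # wraps the bundled SentencePiece models
model = MarianMTModel.from_pretrained(model_name)

# Esperanto (epo) is one of the listed source languages; the tokenizer applies
# the spm32k segmentation before decoding.
batch = tokenizer(["Mi amas lerni lingvojn."], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```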

Benchmarks

testset BLEU chr-F
Tatoeba-test.afh-eng.afh.eng 1.1 0.097
Tatoeba-test.avk-eng.avk.eng 0.6 0.108
Tatoeba-test.bzt-eng.bzt.eng 0.8 0.109
Tatoeba-test.dws-eng.dws.eng 0.7 0.039
Tatoeba-test.epo-eng.epo.eng 34.7 0.529
Tatoeba-test.ido-eng.ido.eng 13.8 0.318
Tatoeba-test.ile-eng.ile.eng 5.7 0.234
Tatoeba-test.ina-eng.ina.eng 5.7 0.251
Tatoeba-test.jbo-eng.jbo.eng 0.2 0.113
Tatoeba-test.ldn-eng.ldn.eng 0.3 0.082
Tatoeba-test.lfn-eng.lfn.eng 1.5 0.169
Tatoeba-test.multi.eng 11.9 0.291
Tatoeba-test.nov-eng.nov.eng 3.9 0.209
Tatoeba-test.qya-eng.qya.eng 0.3 0.076
Tatoeba-test.sjn-eng.sjn.eng 1.0 0.081
Tatoeba-test.tlh-eng.tlh.eng 0.2 0.124
Tatoeba-test.tzl-eng.tzl.eng 1.1 0.125
Tatoeba-test.vol-eng.vol.eng 0.6 0.115
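
The scores in the table above can be approximately recomputed from the released test-set translations. A minimal sketch with sacrebleu, assuming you have already extracted hypothesis and reference segments from opus-2020-06-28.test.txt (the exact scoring options behind this card may differ from sacrebleu's defaults, so small deviations are possible):

```python
import sacrebleu

hypotheses = ["I love learning languages."]     # system outputs, one string per test segment
references = [["I love to learn languages."]]   # one reference stream (list of lists)

bleu = sacrebleu.corpus_bleu(hypotheses, references)
chrf = sacrebleu.corpus_chrf(hypotheses, references)
print(f"BLEU  = {bleu.score:.1f}")
print(f"chr-F = {chrf.score / 100:.3f}")  # the tables here report chr-F on a 0-1 scale
```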

opus-2020-07-26.zip

  • dataset: opus
  • model: transformer
  • source language(s): afh_Latn avk_Latn dws_Latn epo ido ido_Latn ile_Latn ina_Latn jbo jbo_Cyrl jbo_Latn ldn_Latn lfn_Cyrl lfn_Latn nov_Latn qya qya_Latn sjn_Latn tlh_Latn tzl tzl_Latn vol_Latn
  • target language(s): eng
  • pre-processing: normalization + SentencePiece (spm32k,spm32k)
  • download: opus-2020-07-26.zip
  • test set translations: opus-2020-07-26.test.txt
  • test set scores: opus-2020-07-26.eval.txt

Benchmarks

testset BLEU chr-F
Tatoeba-test.afh-eng.afh.eng 1.5 0.084
Tatoeba-test.avk-eng.avk.eng 0.4 0.104
Tatoeba-test.dws-eng.dws.eng 0.5 0.060
Tatoeba-test.epo-eng.epo.eng 34.8 0.529
Tatoeba-test.ido-eng.ido.eng 13.0 0.310
Tatoeba-test.ile-eng.ile.eng 5.2 0.227
Tatoeba-test.ina-eng.ina.eng 5.5 0.250
Tatoeba-test.jbo-eng.jbo.eng 0.2 0.111
Tatoeba-test.ldn-eng.ldn.eng 0.3 0.075
Tatoeba-test.lfn-eng.lfn.eng 1.8 0.171
Tatoeba-test.multi.eng 11.6 0.284
Tatoeba-test.nov-eng.nov.eng 4.2 0.210
Tatoeba-test.qya-eng.qya.eng 0.3 0.099
Tatoeba-test.sjn-eng.sjn.eng 0.5 0.091
Tatoeba-test.tlh-eng.tlh.eng 0.2 0.124
Tatoeba-test.tzl-eng.tzl.eng 1.0 0.115
Tatoeba-test.vol-eng.vol.eng 0.6 0.118

opus2m-2020-08-12.zip

  • dataset: opus2m
  • model: transformer
  • source language(s): afh_Latn avk_Latn dws_Latn epo ido ido_Latn ile_Latn ina_Latn jbo jbo_Cyrl jbo_Latn ldn_Latn lfn_Cyrl lfn_Latn nov_Latn qya qya_Latn sjn_Latn tlh_Latn tzl tzl_Latn vol_Latn
  • target language(s): eng
  • pre-processing: normalization + SentencePiece (spm32k,spm32k); a segmentation sketch follows this list
  • download: opus2m-2020-08-12.zip
  • test set translations: opus2m-2020-08-12.test.txt
  • test set scores: opus2m-2020-08-12.eval.txt
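
This release uses the same spm32k preprocessing as the earlier ones. Below is a minimal sketch of that segmentation step with the sentencepiece library, assuming the unpacked archive provides a source-side model named source.spm (the file name is an assumption about the archive layout, so check the zip's contents):

```python
import sentencepiece as spm

# Load the source-side SentencePiece model bundled with the release
# ("source.spm" is an assumed file name inside the unpacked zip).
sp = spm.SentencePieceProcessor(model_file="source.spm")

# Segment a normalized source sentence into subword pieces; this is the
# input stream a Marian decoder for this release expects.
pieces = sp.encode("Mi amas lerni lingvojn.", out_type=str)
print(" ".join(pieces))
```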

Benchmarks

testset BLEU chr-F
Tatoeba-test.afh-eng.afh.eng 1.2 0.099
Tatoeba-test.avk-eng.avk.eng 0.4 0.105
Tatoeba-test.dws-eng.dws.eng 1.6 0.076
Tatoeba-test.epo-eng.epo.eng 34.6 0.530
Tatoeba-test.ido-eng.ido.eng 12.7 0.310
Tatoeba-test.ile-eng.ile.eng 4.6 0.218
Tatoeba-test.ina-eng.ina.eng 5.8 0.254
Tatoeba-test.jbo-eng.jbo.eng 0.2 0.115
Tatoeba-test.ldn-eng.ldn.eng 0.7 0.083
Tatoeba-test.lfn-eng.lfn.eng 1.8 0.172
Tatoeba-test.multi.eng 11.6 0.287
Tatoeba-test.nov-eng.nov.eng 5.1 0.215
Tatoeba-test.qya-eng.qya.eng 0.7 0.113
Tatoeba-test.sjn-eng.sjn.eng 0.9 0.090
Tatoeba-test.tlh-eng.tlh.eng 0.2 0.124
Tatoeba-test.tzl-eng.tzl.eng 1.4 0.109
Tatoeba-test.vol-eng.vol.eng 0.5 0.115
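
For a quick comparison of the three releases, the aggregate Tatoeba-test.multi.eng rows from the tables above can be collated as follows; the numbers are copied from the tables, nothing is recomputed.

```python
# "Tatoeba-test.multi.eng" scores copied from the three benchmark tables above,
# collated for a side-by-side view of the releases.
multi_eng = {
    "opus-2020-06-28":   {"BLEU": 11.9, "chr-F": 0.291},
    "opus-2020-07-26":   {"BLEU": 11.6, "chr-F": 0.284},
    "opus2m-2020-08-12": {"BLEU": 11.6, "chr-F": 0.287},
}
for release, scores in multi_eng.items():
    print(f"{release:<18}  BLEU {scores['BLEU']:4.1f}  chr-F {scores['chr-F']:.3f}")
```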