# Attention-based Neural Machine Translation

This is an implementation of the attention mechanism used in "Effective Approaches to Attention-based Neural Machine Translation" by Minh-Thang Luong, Hieu Pham, and Christopher D. Manning. The paper can be found here.
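For reference, the simplest of the scoring functions from that paper, the "dot" score for global attention, can be sketched as follows (a minimal NumPy illustration, not the code in this repository):

```python
import numpy as np

def luong_dot_attention(decoder_state, encoder_states):
    """Global attention with the 'dot' score from Luong et al. (2015).

    decoder_state: (hidden,) current target hidden state h_t
    encoder_states: (src_len, hidden) source-side hidden states h_s
    Returns the context vector and the attention weights.
    """
    # score(h_t, h_s) = h_t . h_s for every source position
    scores = encoder_states @ decoder_state
    # softmax over source positions (shifted for numerical stability)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # context vector: attention-weighted sum of the source states
    context = weights @ encoder_states
    return context, weights

h_t = np.array([1.0, 0.0])
h_s = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
context, weights = luong_dot_attention(h_t, h_s)
```

The paper also describes "general" and "concat" scores, which add learned parameters to this scoring step.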

The datasets can be downloaded from here. To run the models as-is, you will need to rename the dataset files to match the filenames expected in main.py. You will also need to add the "&lt;pad&gt;" token to the vocab file.
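Adding the pad token by hand is error-prone, so a small helper like the following can do it (a sketch only; the filename `vocab.en` and the one-token-per-line format are assumptions, so substitute the vocab filenames that main.py expects):

```python
from pathlib import Path
import tempfile

def add_pad_token(vocab_path, pad_token="<pad>"):
    """Prepend pad_token to a one-token-per-line vocab file if it is absent."""
    path = Path(vocab_path)
    tokens = path.read_text(encoding="utf-8").splitlines()
    if pad_token not in tokens:
        # Putting the pad token first gives it index 0, a common convention;
        # adjust if main.py expects it at a different position.
        path.write_text("\n".join([pad_token] + tokens) + "\n", encoding="utf-8")

# Demo on a throwaway file; point this at the real vocab file instead.
demo = Path(tempfile.mkdtemp()) / "vocab.en"
demo.write_text("the\ncat\nsat\n", encoding="utf-8")
add_pad_token(demo)
print(demo.read_text().splitlines()[0])  # → <pad>
```

The check for an existing pad token makes the helper safe to run more than once.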