Zero

A neural machine translation system implemented in Python 2 and TensorFlow.

Features

  1. Multi-Process Data Loading/Processing (known issues remain)
  2. Multi-GPU Training/Decoding
  3. Gradient Aggregation
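The gradient-aggregation feature above can be sketched in plain Python. This is a minimal illustration of the general idea (averaging gradients from several micro-batches before one optimizer step), not this repository's actual implementation; the function names are hypothetical:

```python
# Illustrative sketch of gradient aggregation (not this repo's code):
# gradients from several micro-batches are averaged and applied as a
# single update, emulating a larger effective batch size.

def aggregate_gradients(grad_batches):
    """Average a list of gradient vectors, one per micro-batch."""
    n = len(grad_batches)
    dim = len(grad_batches[0])
    return [sum(g[i] for g in grad_batches) / n for i in range(dim)]

def sgd_step(params, grads, lr=0.1):
    """Apply one plain SGD update using the aggregated gradient."""
    return [p - lr * g for p, g in zip(params, grads)]

params = [1.0, -2.0]
micro_grads = [[0.2, -0.4], [0.4, -0.8]]  # gradients from two micro-batches
agg = aggregate_gradients(micro_grads)    # averaged gradient
params = sgd_step(params, agg)            # one optimizer step
```

In a multi-GPU setting the same pattern applies, with each entry of `micro_grads` computed on a different device before the averaged update.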

Papers

Supported Models

Requirements

Usage

How to use this toolkit for machine translation?

TODO:

  1. organize the parameters and their descriptions in config.
  2. reformat and complete code comments
  3. simplify the code and remove unnecessary parts
  4. improve the RNN models

Citation

If you use the source code, please consider citing the following paper:

@InProceedings{D18-1459,
  author =    "Zhang, Biao and Xiong, Deyi and Su, Jinsong and Lin, Qian and Zhang, Huiji",
  title =     "Simplifying Neural Machine Translation with Addition-Subtraction Twin-Gated Recurrent Networks",
  booktitle = "Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing",
  year =      "2018",
  publisher = "Association for Computational Linguistics",
  pages =     "4273--4283",
  location =  "Brussels, Belgium",
  url =       "http://aclweb.org/anthology/D18-1459"
}

If you are interested in the CAEncoder model, please consider citing our TASLP paper:

@article{Zhang:2017:CRE:3180104.3180106,
 author = {Zhang, Biao and Xiong, Deyi and Su, Jinsong and Duan, Hong},
 title = {A Context-Aware Recurrent Encoder for Neural Machine Translation},
 journal = {IEEE/ACM Trans. Audio, Speech and Lang. Proc.},
 issue_date = {December 2017},
 volume = {25},
 number = {12},
 month = dec,
 year = {2017},
 issn = {2329-9290},
 pages = {2424--2432},
 numpages = {9},
 url = {https://doi.org/10.1109/TASLP.2017.2751420},
 doi = {10.1109/TASLP.2017.2751420},
 acmid = {3180106},
 publisher = {IEEE Press},
 address = {Piscataway, NJ, USA},
}

Reference

When developing this repository, I referred to the following projects:

Contact

For any questions or suggestions, please feel free to contact Biao Zhang.