- Empirical Study of Dropout Scheme for Neural Machine Translation. Xiaolin Wang, Masao Utiyama, Eiichiro Sumita. 1-14.
- A Target Attention Model for Neural Machine Translation. Hideya Mino, Andrew M. Finch, Eiichiro Sumita. 15-26.
- Neural Pre-Translation for Hybrid Machine Translation. Jinhua Du, Andy Way. 27-40.
- Neural and Statistical Methods for Leveraging Meta-information in Machine Translation. Shahram Khadivi, Patrick Wilken, Leonard Dahlmann, Evgeny Matusov. 41-54.
- Translation Quality and Productivity: A Study on Rich Morphology Languages. Lucia Specia, Kim Harris, Frédéric Blain, Aljoscha Burchardt, Vivien Macketanz, Inguna Skadina, Matteo Negri, Marco Turchi. 55-71.
- The Microsoft Speech Language Translation (MSLT) Corpus for Chinese and Japanese: Conversational Test Data for Machine Translation and Speech Recognition. Christian Federmann, William D. Lewis. 72-85.
- Paying Attention to Multi-Word Expressions in Neural Machine Translation. Matiss Rikters, Ondrej Bojar. 86-95.
- Enabling Multi-Source Neural Machine Translation by Concatenating Source Sentences in Multiple Languages. Raj Dabre, Fabien Cromières, Sadao Kurohashi. 96-107.
- Learning an Interactive Attention Policy for Neural Machine Translation. Samee Ibraheem, Nicholas Altieri, John DeNero. 108-115.
- A Comparative Quality Evaluation of PBSMT and NMT Using Professional Translators. Sheila Castilho, Joss Moorkens, Federico Gaspari, Rico Sennrich, Vilelmini Sosoni, Panayota Georgakopoulou, Pintu Lohar, Andy Way, Antonio Valerio Miceli Barone, Maria Gialama. 116-131.
- One-Parameter Models for Sentence-Level Post-editing Effort Estimation. Mikel L. Forcada, Miquel Esplà-Gomis, Felipe Sánchez-Martínez, Lucia Specia. 132-143.
- A Minimal Cognitive Model for Translating and Post-editing. Moritz Jonas Schaeffer, Michael Carl. 144-155.
- Fine-Tuning for Neural Machine Translation with Limited Degradation across In- and Out-of-Domain Data. Praveen Dakwale, Christof Monz. 156-169.
- Exploiting Relative Frequencies for Data Selection. Thierry Etchegoyhen, Andoni Azpeitia, Eva Martínez García. 170-184.
- Low Resourced Machine Translation via Morpho-syntactic Modeling: The Case of Dialectal Arabic. Alexander Erdmann, Nizar Habash, Dima Taji, Houda Bouamor. 185-200.
- Elastic-Substitution Decoding for Hierarchical SMT: Efficiency, Richer Search and Double Labels. Gideon Maillette de Buy Wenniger, Khalil Sima'an, Andy Way. 201-215.
- Development of a Classifiers/Quantifiers Dictionary towards French-Japanese MT. Mutsuko Tomokiyo, Mathieu Mangeot, Christian Boitet. 216-226.
- Neural Machine Translation Model with a Large Vocabulary Selected by Branching Entropy. Zi Long, Ryuichiro Kimura, Takehito Utsuro, Tomoharu Mitsuhashi, Mikio Yamamoto. 227-240.
- Usefulness of MT Output for Comprehension: An Analysis from the Point of View of Linguistic Intercomprehension. Kenneth Jordan Núñez, Mikel L. Forcada, Esteve Clua. 241-253.
- Machine Translation as an Academic Writing Aid for Medical Practitioners. Carla Parra Escartín, Sharon O'Brien, Marie-Josée Goulet, Michel Simard. 254-267.
- A Multilingual Parallel Corpus for Improving Machine Translation on Southeast Asian Languages. Hai-Long Trieu, Le Minh Nguyen. 268-281.
- Exploring Hypotheses Spaces in Neural Machine Translation. Frédéric Blain, Lucia Specia, Pranava Madhyastha. 282-298.
- Confidence through Attention. Matiss Rikters, Mark Fishel. 299-311.
- Disentangling ASR and MT Errors in Speech Translation. Ngoc-Tien Le, Benjamin Lecouteux, Laurent Besacier. 312-323.
- Temporality as Seen through Translation: A Case Study on Hindi Texts. Sabyasachi Kamila, Sukanta Sen, Mohammad Hasanuzzaman, Asif Ekbal, Andy Way, Pushpak Bhattacharyya. 324-336.
- A Neural Network Transliteration Model in Low Resource Settings. Tan Le, Fatiha Sadat. 337-345.