- Overview of the 8th Workshop on Asian Translation. Toshiaki Nakazawa, Hideki Nakayama, Chenchen Ding, Raj Dabre, Shohei Higashiyama, Hideya Mino, Isao Goto, Win Pa Pa, Anoop Kunchukuttan, Shantipriya Parida, Ondrej Bojar, Chenhui Chu, Akiko Eriguchi, Kaori Abe, Yusuke Oda, Sadao Kurohashi. 1-45
- NHK's Lexically-Constrained Neural Machine Translation at WAT 2021. Hideya Mino, Kazutaka Kinugawa, Hitoshi Ito, Isao Goto, Ichiro Yamada, Takenobu Tokunaga. 46-52
- Input Augmentation Improves Constrained Beam Search for Neural Machine Translation: NTT at WAT 2021. Katsuki Chousa, Makoto Morishita. 53-61
- NICT's Neural Machine Translation Systems for the WAT21 Restricted Translation Task. Zuchao Li, Masao Utiyama, Eiichiro Sumita, Hai Zhao. 62-67
- Machine Translation with Pre-specified Target-side Words Using a Semi-autoregressive Model. Seiichiro Kondo, Aomi Koyama, Tomoshige Kiyuna, Tosho Hirasawa, Mamoru Komachi. 68-73
- NECTEC's Participation in WAT-2021. Zar Zar Hlaing, Ye Kyaw Thu, Thazin Myint Oo, Mya Ei San, Sasiporn Usanavasin, Ponrudee Netisopakul, Thepchai Supnithi. 74-82
- Hybrid Statistical Machine Translation for English-Myanmar: UTYCC Submission to WAT-2021. Ye Kyaw Thu, Thazin Myint Oo, Hlaing Myat Nwe, Khaing Zar Mon, Nang Aeindray Kyaw, Naing Linn Phyo, Nann Hwan Khun, Hnin Aye Thant. 83-89
- NICT-2 Translation System at WAT-2021: Applying a Pretrained Multilingual Encoder-Decoder Model to Low-resource Language Pairs. Kenji Imamura, Eiichiro Sumita. 90-95
- Rakuten's Participation in WAT 2021: Examining the Effectiveness of Pre-trained Models for Multilingual and Multimodal Machine Translation. Raymond Hendy Susanto, Dongzhe Wang, Sunil Yadav, Mausam Jain, Ohnmar Htun. 96-105
- BTS: Back TranScription for Speech-to-Text Post-Processor using Text-to-Speech-to-Text. Chanjun Park, Jaehyung Seo, Seolhwa Lee, Chanhee Lee, Hyeonseok Moon, Sugyeong Eo, HeuiSeok Lim. 106-116
- Zero-pronoun Data Augmentation for Japanese-to-English Translation. Ryokan Ri, Toshiaki Nakazawa, Yoshimasa Tsuruoka. 117-123
- Evaluation Scheme of Focal Translation for Japanese Partially Amended Statutes. Takahiro Yamakoshi, Takahiro Komamizu, Yasuhiro Ogawa, Katsuhiko Toyama. 124-132
- TMU NMT System with Japanese BART for the Patent task of WAT 2021. Hwichan Kim, Mamoru Komachi. 133-137
- System Description for Transperfect. Wiktor Stribizew, Fred Bane, José Conceição, Anna Zaretskaya. 138-140
- Bering Lab's Submissions on WAT 2021 Shared Task. Heesoo Park, Dongjun Lee. 141-145
- NLPHut's Participation at WAT2021. Shantipriya Parida, Subhadarshi Panda, Ketan Kotwal, Amulya Ratna Dash, Satya Ranjan Dash, Yashvardhan Sharma, Petr Motlícek, Ondrej Bojar. 146-154
- Improved English to Hindi Multimodal Neural Machine Translation. Sahinur Rahman Laskar, Abdullah Faiz Ur Rahman Khilji, Darsh Kaushik, Partha Pakray, Sivaji Bandyopadhyay. 155-160
- IITP at WAT 2021: System description for English-Hindi Multimodal Translation Task. Baban Gain, Dibyanayan Bandyopadhyay, Asif Ekbal. 161-165
- ViTA: Visual-Linguistic Translation by Aligning Object Tags. Kshitij Gupta, Devansh Gautam, Radhika Mamidi. 166-173
- TMEKU System for the WAT2021 Multimodal Translation Task. Yuting Zhao, Mamoru Komachi, Tomoyuki Kajiwara, Chenhui Chu. 174-180
- Optimal Word Segmentation for Neural Machine Translation into Dravidian Languages. Prajit Dhar, Arianna Bisazza, Gertjan van Noord. 181-190
- Itihasa: A large-scale corpus for Sanskrit to English translation. Rahul Aralikatte, Miryam de Lhoneux, Anoop Kunchukuttan, Anders Søgaard. 191-197
- NICT-5's Submission To WAT 2021: MBART Pre-training And In-Domain Fine Tuning For Indic Languages. Raj Dabre, Abhisek Chakrabarty. 198-204
- How far can we get with one GPU in 100 hours? CoAStaL at MultiIndicMT Shared Task. Rahul Aralikatte, Héctor Ricardo Murrieta Bello, Daniel Hershcovich, Marcel Bollmann, Anders Søgaard. 205-211
- IIIT Hyderabad Submission To WAT 2021: Efficient Multilingual NMT systems for Indian languages. Sourav Kumar, Salil Aggarwal, Dipti Sharma. 212-216
- Language Relatedness and Lexical Closeness can help Improve Multilingual NMT: IITBombay@MultiIndicNMT WAT2021. Jyotsana Khatri, Nikhil Saini, Pushpak Bhattacharyya. 217-223
- Samsung R&D Institute Poland submission to WAT 2021 Indic Language Multilingual Task. Adam Dobrowolski, Marcin Szymanski, Marcin Chochowski, Pawel Przybysz. 224-232
- Multilingual Machine Translation Systems at WAT 2021: One-to-Many and Many-to-One Transformer based NMT. Shivam Mhaskar, Aditya Jain, Aakash Banerjee, Pushpak Bhattacharyya. 233-237
- IITP-MT at WAT2021: Indic-English Multilingual Neural Machine Translation using Romanized Vocabulary. Ramakrishna Appicharla, Kamal Kumar Gupta, Asif Ekbal, Pushpak Bhattacharyya. 238-243
- ANVITA Machine Translation System for WAT 2021 MultiIndicMT Shared Task. Pavanpankaj Vegi, Sivabhavani J., Biswajit Paul, Chitra Viswanathan, K. R. Prasanna Kumar. 244-249