Hyperparameter Optimization on Neural Machine Translation
Abstract
With the growth of deep learning in recent years, many models have been created to tackle real-world tasks autonomously. One such task is the automatic translation of text from one language to another, commonly known as Neural Machine Translation (NMT). NMT has proved to be a significant challenge, given the fluidity of human language. Most NMT models rely on Recurrent Neural Networks (RNNs) and deep Long Short-Term Memory (LSTM) networks. In this study, we explore the Sequence to Sequence Learning with Neural Networks model (Sutskever et al. (5)) and perform an ablation study of the model on two data sets: the English-Vietnamese parallel corpus from the IWSLT Evaluation Campaign and the German-English parallel corpus from the WMT Evaluation Campaign.
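Since the model family discussed above is built on LSTMs, a minimal sketch of a single LSTM cell step may help make the recurrence concrete. This is a plain-NumPy illustration of the standard LSTM equations with hypothetical names (`lstm_step`, `W`, `b`), not the thesis's actual implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One step of a standard LSTM cell.

    x:      input vector at this time step
    h_prev: previous hidden state (size H)
    c_prev: previous cell state (size H)
    W, b:   stacked gate parameters, shape (4H, H + X) and (4H,)
    """
    H = h_prev.size
    # All four gate pre-activations computed from [h_prev; x] at once.
    z = W @ np.concatenate([h_prev, x]) + b
    i = sigmoid(z[0:H])        # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    o = sigmoid(z[2*H:3*H])    # output gate
    g = np.tanh(z[3*H:4*H])    # candidate cell update
    c = f * c_prev + i * g     # new cell state
    h = o * np.tanh(c)         # new hidden state
    return h, c
```

In an encoder-decoder NMT model of the kind studied here, one such cell (stacked into deep layers) is unrolled over the source sentence to produce a fixed-size state, which then initializes a second unrolled LSTM that emits the target sentence token by token.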