Neural Machine Translation with Minimal Parallel Resources
We are interested in improving the quality of Neural Machine Translation (NMT) for language pairs that lack sizable parallel corpora, in settings where pivoting through a third language is not an option. We will pursue two strategies concurrently:
- Learning the translation relation from monolingual corpora by bootstrapping from a small parallel corpus or dictionary (a sketch of one such approach follows this list).
- Developing NMT architectures capable of exploiting the linguistic knowledge expressed in richly annotated corpora (a second sketch follows).
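
To make the first strategy concrete, below is a minimal sketch of iterative back-translation, one common way to bootstrap from a small parallel seed plus monolingual data. The train_model and translate helpers are hypothetical stand-ins for a real NMT toolkit, and the team's actual method may differ; only the shape of the bootstrap loop is intended.

```python
# A minimal sketch of iterative back-translation, assuming hypothetical
# train_model/translate helpers that stand in for a real NMT toolkit.
# Only the control flow of the bootstrap loop is the point here.
from typing import List, Tuple

ParallelCorpus = List[Tuple[str, str]]  # (source sentence, target sentence)

def train_model(parallel: ParallelCorpus) -> ParallelCorpus:
    """Hypothetical trainer; returns an opaque 'model' (here, its data)."""
    return parallel

def translate(model: ParallelCorpus, sentences: List[str]) -> List[str]:
    """Hypothetical decoder; identity translation as a placeholder."""
    return sentences

def bootstrap(seed: ParallelCorpus, mono_src: List[str],
              mono_tgt: List[str], rounds: int = 3) -> ParallelCorpus:
    """Grow a synthetic parallel corpus from a small seed by alternately
    training source->target and target->source models and back-translating
    the monolingual sides."""
    fwd_data = list(seed)                 # source -> target training data
    bwd_data = [(t, s) for s, t in seed]  # target -> source training data
    for _ in range(rounds):
        # Back-translate target monolingual text into synthetic sources.
        bwd_model = train_model(bwd_data)
        synthetic_src = translate(bwd_model, mono_tgt)
        fwd_data = list(seed) + list(zip(synthetic_src, mono_tgt))
        # Symmetrically, grow the backward model's training data.
        fwd_model = train_model(fwd_data)
        synthetic_tgt = translate(fwd_model, mono_src)
        bwd_data = [(t, s) for s, t in seed] + list(zip(synthetic_tgt, mono_src))
    return fwd_data
```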
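
For the second strategy, one simple way to feed linguistic annotation into an NMT encoder is a factored input layer that concatenates word and feature (e.g. POS tag) embeddings, so each token vector carries both lexical and syntactic signal. The layer sizes and the PyTorch framing below are illustrative assumptions, not the team's actual architecture.

```python
# A minimal sketch of a "factored" NMT input layer: linguistic annotations
# (here, POS tags) are embedded and concatenated with word embeddings.
# Vocabulary and dimension choices are illustrative assumptions.
import torch
import torch.nn as nn

class FactoredEmbedding(nn.Module):
    """Embed a word together with one linguistic factor."""

    def __init__(self, vocab_size: int, pos_size: int,
                 word_dim: int = 256, pos_dim: int = 32):
        super().__init__()
        self.word = nn.Embedding(vocab_size, word_dim)
        self.pos = nn.Embedding(pos_size, pos_dim)

    def forward(self, word_ids: torch.Tensor,
                pos_ids: torch.Tensor) -> torch.Tensor:
        # Concatenate along the feature dimension; the encoder then sees
        # one vector per token combining lexical and syntactic information.
        return torch.cat([self.word(word_ids), self.pos(pos_ids)], dim=-1)

# Usage: a batch of 2 sentences, 5 tokens each.
emb = FactoredEmbedding(vocab_size=10000, pos_size=20)
words = torch.randint(0, 10000, (2, 5))
tags = torch.randint(0, 20, (2, 5))
print(emb(words, tags).shape)  # torch.Size([2, 5, 288])
```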
See our research statement for more details.
JSALT 2017 MT Team
Senior Members
- Colin Cherry, NRC Canada
- George Foster, Google Research
- Reza Haffari, Monash
- Patrick Littell, CMU
- David Mortensen, CMU

Graduate Students
- Daniel Beck, Melbourne
- Anna Currey, Edinburgh
- Vu (Cong Duy) Hoang, Melbourne
- Gaurav Kumar, JHU

Undergraduate Students
- Dylan Lewis, JHU
- Ji Xin, Tsinghua

Affiliate Members
- Michael Denkowski, Amazon
- Kevin Duh, JHU
- Deepak Gopinath, Facebook
- Philipp Koehn, JHU
- Felix Hieber, Amazon