I am currently a Principal Member of Technical Staff at Oracle Digital Assistant, Oracle Corporation. My focus is on intelligent chatbots and digital assistants, built with advanced techniques in NLP, NLU, and Deep Learning.
From 2015 to 2018, I pursued my PhD at the University of Melbourne (Australia) under the joint supervision of Assoc. Prof. Trevor Cohn and Assoc. Prof. Reza Haffari. My PhD research lies in Natural Language Processing and Applied Machine Learning, particularly Deep Learning models (e.g., sequence-to-sequence learning and inference) applied to neural machine translation.
Prior to coming to Melbourne, I spent about seven years (2008-2015) in Singapore, first studying for a Masters at the National University of Singapore (NUS) and then working as an R&D engineer in the HLT department of the Institute for Infocomm Research (I2R), A*STAR. Before that, I was a student, teaching & research assistant, and lecturer at the University of Science, Vietnam National University, Ho Chi Minh City, Vietnam. During this time, I was also a research intern at the National Institute of Informatics in Tokyo, Japan, working under Dr. Nigel Collier on a bio-text mining project (BioCaster).
*** My PhD viva is complete, and my thesis is now available online.
*** On 10 May 2019, I joined Oracle Corporation, working for Oracle Digital Assistant.
*** On 15 Feb 2019, I submitted my thesis for final examination.
*** In Sep 2018, I joined Speak.AI (a startup headquartered in WA, USA) as an AI scientist, developing solutions for on-device conversational AI. Since May 2019, Speak.AI has been part of Oracle Corporation.
*** I was a research intern at NAVER LABS Europe (formerly Xerox Research Centre Europe) from Mar 2018 to June 2018, working with Marc Dymetman on the project "Globally-driven Training Techniques for Neural Machine Translation".
*** Transformer-DyNet is my latest *humble* neural sequence-to-sequence toolkit (written in C++ with a DyNet backend). It implements Google's state-of-the-art Transformer architecture in a simplified manner. It is fast and efficient, and produces results consistent with Google's tensor2tensor and Amazon's Sockeye. To my knowledge, this is the first C++ implementation of the Transformer in DyNet (correct me if I am wrong!).
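To give a flavour of what such a toolkit builds, here is a minimal sketch of the Transformer's core operation, scaled dot-product attention, written against DyNet's C++ expression API. This is an illustration only, not the actual Transformer-DyNet code; the function name and the column-wise shape conventions are my own assumptions.

#include <cmath>
#include <dynet/dynet.h>
#include <dynet/expr.h>

using dynet::Expression;

// Scaled dot-product attention, the core building block of the Transformer:
//   Attention(Q, K, V) = softmax(K^T Q / sqrt(d_k)) applied to V.
// DyNet lays sequences out column-wise, so Q is (d_k x T_q), K is (d_k x T_k),
// and V is (d_v x T_k); dynet::softmax normalizes each column.
// (Illustrative sketch; not taken from the Transformer-DyNet source.)
Expression scaled_dot_product_attention(const Expression& Q,
                                        const Expression& K,
                                        const Expression& V,
                                        unsigned d_k) {
  Expression scores = dynet::transpose(K) * Q;           // (T_k x T_q)
  scores = scores / std::sqrt(static_cast<float>(d_k));  // scale by sqrt(d_k)
  Expression weights = dynet::softmax(scores);           // per-query weights over keys
  return V * weights;                                    // (d_v x T_q)
}

Because DyNet builds the computation graph dynamically per sentence, expressions like these can be composed per batch without padding tricks, which is one reason a lean C++ implementation can stay fast.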
*** I received the Google Australia PhD Travel Scholarship for my trip to EMNLP 2017. Special thanks to Google Australia.
*** I participated in the 2017 Jelinek Summer Workshop on Speech and Language Technology (JSALT) at CMU, June-August 2017, Pittsburgh, PA, USA. My main research focus was Neural Machine Translation under low- and zero-resource conditions.