[Seminar] Linguistic Knowledge Transfer for Enriching Vector Representations

Joo-Kyung Kim
Ph.D.
Amazon Alexa
Date: 
Tuesday, December 26th, 2017, 11:00am - 12:00pm
Location: 
302-308

■Host: Prof. Byoung-Tak Zhang (x1833, 880-1833)
■Inquiries: Biointelligence Laboratory (880-1847)

Summary

Many neural network models require a large number of labeled training examples for sufficient training. This talk focuses on transfer learning methods, which can improve performance on a target task in such situations by leveraging resources or models from other tasks. Specifically, we introduce transfer learning methods that enrich the word or sentence vector representations of neural network models by transferring linguistic knowledge. For word-level knowledge transfer, we show that enriching word embeddings with semantic lexicons such as thesauri and semantic intensity orderings can improve performance on both word-level and sentence-level NLP tasks. For sentence-level knowledge transfer, we introduce cross-domain and cross-lingual transfer learning methods that utilize common/private representations, adversarial training, and other auxiliary objectives to improve performance on sequence tagging tasks without domain- or language-specific resources.
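To give a concrete sense of lexicon-based embedding enrichment, the sketch below implements a common retrofitting-style update (in the style of Faruqui et al., 2015): each word vector is iteratively pulled toward the average of its lexicon neighbors while staying close to its original embedding. This is a minimal illustration of the general idea, not necessarily the speaker's exact method; the `alpha`/`beta` weights and the toy vocabulary are assumptions for the example.

```python
import numpy as np

def retrofit(embeddings, lexicon, alpha=1.0, beta=1.0, iters=10):
    """Enrich word vectors with a semantic lexicon.

    embeddings: dict mapping word -> original vector (np.ndarray)
    lexicon:    dict mapping word -> list of neighbor words
                (e.g., synonyms from a thesaurus)
    Returns a new dict of retrofitted vectors.
    """
    q = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iters):
        for w, neighbors in lexicon.items():
            nbrs = [n for n in neighbors if n in q]
            if w not in q or not nbrs:
                continue
            # Closed-form coordinate update: a weighted mean of the
            # original vector and the current neighbor vectors.
            num = alpha * embeddings[w] + beta * sum(q[n] for n in nbrs)
            q[w] = num / (alpha + beta * len(nbrs))
    return q

# Toy example: "glad" and "happy" are thesaurus synonyms, so their
# vectors are pulled together; "table" has no lexicon entry and is
# left unchanged.
emb = {"happy": np.array([1.0, 0.0]),
       "glad":  np.array([0.0, 1.0]),
       "table": np.array([0.0, -1.0])}
lex = {"glad": ["happy"], "happy": ["glad"]}
new = retrofit(emb, lex)
```

After retrofitting, the cosine similarity between "glad" and "happy" increases relative to the original embeddings, while words absent from the lexicon keep their original vectors.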

Speaker Bio

Joo-Kyung Kim is an applied scientist at Amazon Alexa working on deep learning for large-scale dialog systems. He received his Ph.D. from The Ohio State University, advised by Eric Fosler-Lussier. During his Ph.D., he interned at Microsoft, NEC Laboratories, and Nuance, working on deep learning for natural language understanding. He obtained an M.S. from Seoul National University, advised by Byoung-Tak Zhang, and a B.S. from Sogang University.