[Seminar] Recurrent Neural Networks: Recent Advances and the Future

Title: 
Ph.D.
Affiliation: 
University of Montreal
Date: 
Thursday, January 12th, 2017, 11:00am - 12:00pm
Location: 
302-309/1

■Host: Prof. Byoung-Tak Zhang (ext. 1833, 880-1833)

Summary

Although the recent resurgence of Recurrent Neural Networks (RNNs) has brought remarkable advances in various sequence modeling problems, RNNs still lack some key abilities needed to model more challenging yet important natural phenomena. In this talk, as a step toward such abilities, I introduce a new RNN architecture called the Hierarchical Multiscale Recurrent Neural Network (HM-RNN). In an HM-RNN, each layer of a multi-layered RNN learns a different time-scale, adapting to the inputs provided by the layer below. I discuss the advantages of HM-RNNs in terms of computational efficiency, capturing longer-term dependencies, and discovering latent structures inherent in the sequence. I conclude the talk with a discussion of the key challenges that lie ahead.
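
To give a rough feel for the mechanism described above, the toy NumPy sketch below shows a two-layer recurrence in which the lower layer emits a binary boundary signal that decides whether the upper layer updates or simply copies its state, so the upper layer ticks at a slower, input-dependent rate. This is an illustrative simplification under assumed names and shapes, not the paper's formulation: the gated cell, the per-layer UPDATE/COPY/FLUSH operations, and the straight-through training of the discrete boundary variable are reduced or omitted here.

    # Toy sketch of adaptive multiscale gating (illustrative, not the paper's HM-RNN).
    import numpy as np

    rng = np.random.default_rng(0)

    def simple_cell(x, h, W, U, b):
        """One vanilla tanh RNN step; stands in for the paper's gated cell."""
        return np.tanh(W @ x + U @ h + b)

    T, x_dim, h_dim = 20, 8, 16  # assumed toy sizes
    W1 = rng.normal(0, 0.1, (h_dim, x_dim))
    U1 = rng.normal(0, 0.1, (h_dim, h_dim))
    W2 = rng.normal(0, 0.1, (h_dim, h_dim))
    U2 = rng.normal(0, 0.1, (h_dim, h_dim))
    b1, b2 = np.zeros(h_dim), np.zeros(h_dim)
    w_z = rng.normal(0, 0.1, h_dim)  # boundary detector (assumed linear + threshold)

    h1, h2 = np.zeros(h_dim), np.zeros(h_dim)
    for t in range(T):
        x_t = rng.normal(size=x_dim)
        h1 = simple_cell(x_t, h1, W1, U1, b1)   # lower layer runs every step
        z_t = float(w_z @ h1 > 0.0)             # hard binary boundary decision
        if z_t == 1.0:
            # Boundary detected: the upper layer consumes the lower layer's
            # summary and updates; the lower layer is then reset (flushed).
            h2 = simple_cell(h1, h2, W2, U2, b2)
            h1 = np.zeros(h_dim)
        # Otherwise the upper layer copies its previous state unchanged,
        # i.e. it operates on a coarser, adaptively chosen time-scale.
    print("upper-layer state after", T, "steps:", h2[:4])

The point of the sketch is the control flow, not the cell: because the upper layer only performs computation at detected boundaries, it can be cheaper per step and can carry information across long spans of the input sequence.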

Speaker Bio

Dr. Sungjin Ahn is currently a postdoctoral researcher at the University of Montreal, working with Prof. Yoshua Bengio on deep learning and its applications. He received his Ph.D. in Computer Science from the University of California, Irvine, under the supervision of Prof. Max Welling. During his Ph.D., Dr. Ahn co-developed Stochastic Gradient MCMC algorithms and received two best paper awards, at the International Conference on Machine Learning in 2012 and at ParLearning in 2016. His research interests include deep learning, approximate Bayesian inference, and reinforcement learning.