Xiang Ruan's Homepage

Anything I want to share with this world.

10 Deep Learning Trends at NIPS 2015

Brad Neuberg, a full stack product engineer at Dropbox, summarized 10 deep learning trends at NIPS 2015 on his personal blog.

You can check his article here.

The 10 trends are:

  1. Neural network architectures are getting more complex and sophisticated
  2. All the cool kids are using LSTMs
  3. Attention models are showing up
  4. Neural Turing Machines remain interesting but aren't being leveraged yet for real work
  5. Computer vision and NLP aren't separate silos anymore — deep learning for computer vision and NLP are cross-hybridizing each other
  6. Symbolic differentiation is becoming even more important
  7. Surprising results are happening with neural network model compression
  8. The intersection of deep and reinforcement learning continues
  9. If you aren't using batch normalization you should
  10. Neural network research and productionisation go hand in hand
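To make trend 6 a bit more concrete: symbolic differentiation means a framework builds an expression graph for your model and derives gradient expressions mechanically, rather than you coding backpropagation by hand (this is what libraries like Theano popularized). Here is a minimal toy sketch of the idea, with expressions as nested tuples; all names (`diff`, `evaluate`) are illustrative, not from any real framework.

```python
# Toy symbolic differentiation: expressions are nested tuples like
# ("+", a, b), ("*", a, b), ("^", base, const_exponent); variables are strings.

def diff(expr, var):
    """Return the symbolic derivative of `expr` with respect to `var`."""
    if isinstance(expr, (int, float)):          # constant rule: dc/dx = 0
        return 0
    if isinstance(expr, str):                   # variable rule: dx/dx = 1
        return 1 if expr == var else 0
    op, a, b = expr
    if op == "+":                               # sum rule
        return ("+", diff(a, var), diff(b, var))
    if op == "*":                               # product rule
        return ("+", ("*", diff(a, var), b), ("*", a, diff(b, var)))
    if op == "^":                               # power rule (constant exponent)
        return ("*", ("*", b, ("^", a, b - 1)), diff(a, var))
    raise ValueError("unknown operator: %s" % op)

def evaluate(expr, env):
    """Numerically evaluate a tuple-based expression given variable values."""
    if isinstance(expr, (int, float)):
        return expr
    if isinstance(expr, str):
        return env[expr]
    op, a, b = expr
    x, y = evaluate(a, env), evaluate(b, env)
    return {"+": x + y, "*": x * y, "^": x ** y}[op]

# d/dx (x^2 + 3x) = 2x + 3, which at x = 4 gives 11
f = ("+", ("^", "x", 2), ("*", 3, "x"))
print(evaluate(diff(f, "x"), {"x": 4.0}))  # → 11.0
```

Real systems additionally simplify the derived expressions and compile them to fast (often GPU) code, which is exactly where the engineering effort in trend 10 comes in.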

These trends have already appeared at many top-tier conferences and workshops, and Brad really gives a very good summary of them. I am personally most interested in trends 6, 8, and 10.

DL is no longer a purely academic topic; it is becoming more and more an engineering discipline. I think, to some extent, DL is a signal of how much hardware now matters compared to software and algorithms. Well, you know, that is kind of sad for guys like me, a software researcher/engineer. :-(
