10 Deep Learning Trends at NIPS 2015
Brad Neuberg, a full stack product engineer at Dropbox, summarized 10 deep learning trends at NIPS 2015 on his personal blog.
You can check his article here.
The 10 trends are:
- Neural network architectures are getting more complex and sophisticated
- All the cool kids are using LSTMs
- Attention models are showing up
- Neural Turing Machines remain interesting but aren't being leveraged yet for real work
- Computer vision and NLP aren't separate silos anymore — deep learning for computer vision and NLP are cross-hybridizing each other
- Symbolic differentiation is becoming even more important
- Surprising results are happening with neural network model compression
- The intersection of deep and reinforcement learning continues
- If you aren't using batch normalization you should
- Neural network research and productionisation go hand in hand
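Trend 9 is the most directly actionable of the list. As a rough illustration of what batch normalization does, here is a minimal NumPy sketch of the training-mode forward pass; the function name and toy data are my own illustration, not from Brad's article, and a real implementation would also track running statistics for inference.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch normalization, training-mode forward pass (illustrative sketch):
    normalize each feature over the batch, then apply a learnable scale/shift."""
    mean = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance per feature
    return gamma * x_hat + beta              # learnable scale (gamma) and shift (beta)

# A toy batch of 4 samples with 3 features (hypothetical data).
x = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, 6.0, 9.0],
              [4.0, 8.0, 12.0]])
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
# Each column of `out` now has (approximately) zero mean and unit variance.
```

Keeping each layer's activations normalized this way is what lets deeper networks train faster and with higher learning rates, which is why the trend reads as an imperative.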
These trends had already appeared at many top-tier conferences and workshops; Brad really gave a very good summary. I am personally most interested in trends 6, 8, and 10.
DL is no longer a purely academic topic; it is becoming more and more an engineering discipline. I think, to some extent, DL is a signal of how much hardware matters relative to software and algorithms. Well, you know, that is kind of sad for guys like me, a software researcher/engineer. :-(