The Math (Enthusiast) Series invites everyone to consider math's role in optimizing deep neural networks on Tuesday, Jan 23rd, at 3:15 in Joy 114. Dr. Jacob Nelson, visiting from Microsoft Research, will give a talk titled "Training Deep Neural Networks: Optimization at Scale".

Abstract: Today's deep neural networks provide excellent accuracy on problems like automatic speech recognition, image recognition, and natural language processing. At the heart of the process for training a new network is a nonlinear optimization problem whose properties make it hard to solve efficiently for large datasets. In this talk, I'll give an overview of how deep neural networks work and discuss how they are trained at scale today. Then I'll describe some of our research into making this training process faster and more efficient.
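For a feel of the optimization problem the abstract refers to, here is a minimal sketch (mine, not from the talk): plain gradient descent on a toy nonconvex objective f(w) = (w² − 2)², standing in for the vastly larger nonlinear loss minimized when training a real network.

```python
# Toy nonconvex objective and its derivative.
def f(w):
    return (w ** 2 - 2) ** 2

def df(w):
    # Derivative of f by the chain rule.
    return 4 * w * (w ** 2 - 2)

w = 1.0    # initial parameter guess
lr = 0.05  # learning rate (step size)

# Repeatedly step downhill along the gradient.
for _ in range(200):
    w -= lr * df(w)

print(round(w, 3))  # -> 1.414, i.e. sqrt(2), a minimizer of f
```

Real training replaces this single parameter with millions, and the exact gradient with stochastic estimates over batches of data, which is where the "at scale" challenges of the talk arise.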