We usually meet every Sunday at 8 pm Bangladesh time.
Tutorial
Deep Learning
Tutorial 3 - Optimization in Deep Learning
Toshiba Zaman, Shahad Mahmud, Zannatul Naim Shanto
- Gradient descent: batch, stochastic, and mini-batch variants (see the first sketch after this list)
- Optimization algorithms: SGD, Momentum, Adagrad, Adadelta, and Adam (see the second sketch after this list)
- Notebook comparison of optimization algorithms
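A minimal sketch of the first topic, under assumed data and hyperparameters (synthetic linear-regression data, learning rate 0.1): batch, stochastic, and mini-batch gradient descent differ only in how many examples are used to estimate the gradient for each parameter update.

```python
# Illustrative NumPy sketch (not the tutorial notebook): the three gradient
# descent variants only change the batch size used per update.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                 # 200 examples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)   # noisy linear targets

def gradient_descent(batch_size, lr=0.1, epochs=50):
    w = np.zeros(3)
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)              # shuffle examples each epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Gradient of the mean squared error on the current batch.
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)
            w -= lr * grad
    return w

print("batch      :", gradient_descent(batch_size=len(X)))  # full-batch GD
print("stochastic :", gradient_descent(batch_size=1))       # one example per step
print("mini-batch :", gradient_descent(batch_size=32))
```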
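For the second topic, a minimal PyTorch sketch along the lines of the notebook comparison (the loss, learning rates, and step count here are assumptions for illustration, not the notebook's actual setup): each optimizer minimizes the same simple quadratic, so their final values can be compared directly.

```python
# Illustrative comparison of SGD, Momentum, Adagrad, Adadelta, and Adam on a
# convex loss with its minimum at x = 3.
import torch

def run(make_opt, steps=200):
    x = torch.tensor([0.0], requires_grad=True)
    opt = make_opt([x])
    for _ in range(steps):
        opt.zero_grad()
        loss = (x - 3.0) ** 2          # simple quadratic loss
        loss.backward()
        opt.step()
    return x.item()

optimizers = {
    "SGD":      lambda p: torch.optim.SGD(p, lr=0.1),
    "Momentum": lambda p: torch.optim.SGD(p, lr=0.1, momentum=0.9),
    "Adagrad":  lambda p: torch.optim.Adagrad(p, lr=0.5),
    "Adadelta": lambda p: torch.optim.Adadelta(p, lr=1.0),
    "Adam":     lambda p: torch.optim.Adam(p, lr=0.1),
}
for name, make_opt in optimizers.items():
    print(f"{name:>8}: x after 200 steps = {run(make_opt):.4f}")
```

The same pattern scales to real models: only the optimizer constructor changes, while the forward pass, loss, and backward pass stay identical.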