Improving Deep Neural Networks
Hyperparameter tuning, Regularization and Optimization


Below are the top discussions from Reddit that mention this online Coursera course from DeepLearning.AI.

This course will teach you the "magic" of getting deep learning to work well.

Tags: Hyperparameter, Tensorflow, Hyperparameter Optimization, Deep Learning


Taught by
Andrew Ng
Instructor
and 2 more instructors

Offered by
DeepLearning.AI

Reddit Posts and Comments

0 posts • 4 mentions • top 2 shown below

r/Physics • comment
1 points • akanimax

Please refer to this video -> https://www.coursera.org/learn/deep-neural-network/lecture/y0m1f/gradient-descent-with-momentum for the concept of momentum in machine learning. Momentum simply uses a running average of the gradients to make the updates. This equation differs from momentum in two respects: 1) it doesn't aggregate the gradients; and 2) it has a cost term in the update rule, which is not present in momentum or, for that matter, in any other gradient descent variant.
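The running-average idea the comment describes can be sketched as follows. This is a minimal illustration, not code from the course; the function name, learning rate, and toy objective are all made up for the example.

```python
import numpy as np

def momentum_update(w, grad, v, lr=0.01, beta=0.9):
    """One gradient-descent-with-momentum step.

    v is an exponentially weighted (running) average of past gradients;
    the parameters move along v rather than along the raw gradient."""
    v = beta * v + (1 - beta) * grad
    w = w - lr * v
    return w, v

# Toy example: minimize f(w) = w^2, whose gradient is 2w.
w, v = np.array([5.0]), np.array([0.0])
for _ in range(500):
    w, v = momentum_update(w, 2 * w, v)
# w has been driven close to the minimum at 0
```

The `beta * v + (1 - beta) * grad` line is exactly the running average the comment refers to: each step blends the new gradient into an exponentially decaying memory of earlier ones, which damps oscillations across steps.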

Thank you for the feedback.

r/MachineLearning • comment
1 points • chalupapa

https://www.coursera.org/learn/deep-neural-network/lecture/XjuhD/bias-correction-in-exponentially-weighted-averages
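The linked lecture covers bias correction for exponentially weighted averages. A minimal sketch of the idea (the function name and inputs here are illustrative, not from the lecture): since the average is initialized at zero, early values of v are biased low, and dividing by 1 - beta^t compensates.

```python
def ewa_with_bias_correction(xs, beta=0.9):
    """Exponentially weighted average with bias correction.

    Raw update:      v_t = beta * v_{t-1} + (1 - beta) * x_t
    Corrected value: v_t / (1 - beta**t), which removes the
    startup bias caused by initializing v at 0."""
    v = 0.0
    corrected = []
    for t, x in enumerate(xs, start=1):
        v = beta * v + (1 - beta) * x
        corrected.append(v / (1 - beta ** t))
    return corrected

# For a constant signal, the corrected average equals the signal from the
# very first step, whereas the raw average v would start near zero.
vals = ewa_with_bias_correction([10.0] * 5)
```

For t = 1, the raw average is only (1 - beta) * x, but dividing by 1 - beta^1 = 1 - beta recovers x exactly; as t grows, beta^t vanishes and the correction factor approaches 1.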