Improving Deep Neural Networks
Hyperparameter Tuning, Regularization and Optimization

Below are the top discussions from Reddit that mention this online Coursera course from DeepLearning.AI.

Offered by DeepLearning.AI. In the second course of the Deep Learning Specialization, you will open the deep learning black box to ... Enroll for free.

Taught by: Andrew Ng (Instructor) and 2 more instructors

Offered by: DeepLearning.AI

Reddit Posts and Comments

0 posts • 5 mentions • top 3 shown below

r/Physics • comment
1 points • akanimax

Please refer to this video -> https://www.coursera.org/learn/deep-neural-network/lecture/y0m1f/gradient-descent-with-momentum for the concept of momentum in machine learning. Momentum simply uses a running average of the gradients to make the updates. The equation in question differs from momentum in two respects: 1) it doesn't aggregate the gradients, and 2) it has a cost term in the update rule, which is not present in momentum or any other gradient descent variant.

Thank you for the feedback.
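
For readers who want to see the idea from the linked lecture concretely, here is a minimal sketch of gradient descent with momentum, where an exponentially weighted running average of the gradients (rather than the raw gradient) drives each update. The toy quadratic objective, learning rate, and beta value below are illustrative assumptions, not values from the comment or the lecture.

import numpy as np

def grad(w):
    # Gradient of the toy objective f(w) = 0.5 * ||w||^2 (assumed for illustration).
    return w

w = np.array([5.0, -3.0])   # initial parameters
v = np.zeros_like(w)        # running average of gradients ("velocity")
alpha, beta = 0.1, 0.9      # assumed learning rate and momentum coefficient

for _ in range(100):
    g = grad(w)
    v = beta * v + (1 - beta) * g   # exponentially weighted average of the gradients
    w = w - alpha * v               # update uses the average, not the raw gradient

print(w)  # moves toward the minimum at [0, 0]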

r/MachineLearning • comment
1 points • chalupapa

https://www.coursera.org/learn/deep-neural-network/lecture/XjuhD/bias-correction-in-exponentially-weighted-averages
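
The linked lecture covers bias correction for exponentially weighted averages: because the running average v_t = beta * v_{t-1} + (1 - beta) * x_t starts from v_0 = 0, the early estimates are biased toward zero, and dividing by (1 - beta**t) removes that bias. A minimal sketch, using beta = 0.9 and a made-up constant series:

beta = 0.9
xs = [10.0, 10.0, 10.0]          # made-up observations for illustration
v = 0.0                          # zero initialization causes the early bias
for t, x in enumerate(xs, start=1):
    v = beta * v + (1 - beta) * x
    v_corrected = v / (1 - beta ** t)   # bias-corrected estimate
    print(t, round(v, 3), round(v_corrected, 3))
# The uncorrected averages start near 1.0 and 1.9, while the corrected values equal 10.0 from the first step.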

r/learnmachinelearning • comment
1 points • grudev

If you've never taken a course, I suggest you try some that focus on concepts like overfitting/underfitting and their relationship with the data you feed your model and the hyperparameters you choose (a small sketch of that trade-off follows the links below).

As a reference, take a look at this book:

https://www.deeplearning.ai/machine-learning-yearning/

And these courses:

https://www.coursera.org/learn/deep-neural-network?specialization=deep-learning

https://www.coursera.org/learn/machine-learning-projects?specialization=deep-learning
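
As a rough illustration of the overfitting/underfitting trade-off mentioned in the comment above, the sketch below fits polynomials of increasing degree to a noisy toy dataset and compares training and validation error; the sine data, noise level, split, and degrees are all invented for the example and have nothing to do with the linked courses.

import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)  # noisy toy data

x_train, y_train = x[::2], y[::2]     # 15 training points
x_val, y_val = x[1::2], y[1::2]       # 15 held-out validation points

for degree in (1, 4, 14):             # low, moderate, and excessive model capacity
    p = Polynomial.fit(x_train, y_train, degree)
    train_mse = np.mean((p(x_train) - y_train) ** 2)
    val_mse = np.mean((p(x_val) - y_val) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, val MSE {val_mse:.3f}")

# Typically: degree 1 underfits (both errors high), degree 4 balances them,
# and degree 14 overfits (near-zero training error, much larger validation error).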