We are building the course content at rocket speed! Soon the full set of courses will be available.

Course 5: Machine Learning – Using Gradients

Course Content

Gradients and descent
In the context of neural networks, gradients are the partial derivatives of the loss function with respect to the model parameters (weights and biases). They give the direction and magnitude of the steepest ascent of the loss function, indicating how the loss changes with a small change in each parameter: a positive gradient means increasing that parameter increases the loss, while a negative gradient means it decreases the loss. Computing gradients is essential for training neural networks because they guide the optimization process toward minimizing the loss function.
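The idea above can be sketched in a few lines: repeatedly step each parameter opposite its gradient so the loss shrinks. This is a minimal illustration on a one-parameter quadratic loss; the function names, learning rate, and step count are illustrative choices, not part of the course material.

```python
# Minimal sketch: gradient descent on the quadratic loss L(w) = (w - 3)^2,
# whose derivative dL/dw = 2 * (w - 3). The minimum is at w = 3.

def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    # Partial derivative of the loss with respect to the parameter w.
    return 2.0 * (w - 3.0)

def gradient_descent(w, lr=0.1, steps=100):
    # Step opposite the gradient: a positive gradient pushes w down,
    # a negative gradient pushes w up, so the loss decreases either way.
    for _ in range(steps):
        w = w - lr * grad(w)
    return w

w_final = gradient_descent(w=0.0)  # converges toward the minimum at w = 3
```

Starting from w = 0, each step multiplies the distance to the minimum by (1 − 2·lr), so the iterate converges geometrically to 3.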

  • What are derivatives in machine learning?
  • Gradient descent (minimization problems)
  • Loss functions – regression & classification
  • Implement linear regression with gradient descent
  • Implement logistic regression with gradient descent
  • Types of gradient descent
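As a taste of the linear-regression topic listed above, here is a hedged sketch of fitting a line with batch gradient descent on a mean-squared-error loss. The data, learning rate, and iteration count are made up for illustration and are not from the course.

```python
# Hypothetical example: fit y = w*x + b by batch gradient descent.
# The data below is generated from the line y = 2x + 1.

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]

w, b = 0.0, 0.0   # parameters: weight and bias
lr = 0.05         # learning rate (illustrative choice)
n = len(xs)

for _ in range(2000):
    # Gradients of MSE = (1/n) * sum((w*x + b - y)^2) w.r.t. w and b.
    grad_w = sum(2.0 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2.0 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w  # step both parameters opposite their gradients
    b -= lr * grad_b
```

After training, w and b approach the generating values 2 and 1; the same loop structure carries over to logistic regression by swapping in the cross-entropy loss and its gradients.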

Student Ratings & Reviews

No Review Yet