Gradient descent is an iterative optimization algorithm that aims to minimize a function, ideally driving it to its global minimum (it is guaranteed to do so for convex functions, while for non-convex functions it may settle in a local minimum). It is a general-purpose optimization method used in many areas of mathematics. The animated figure above illustrates how gradient descent steps toward the minimum of the function iteration by iteration. Although the animation plays quickly, this...
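
To make the idea concrete, here is a minimal sketch of gradient descent on a one-dimensional quadratic. The objective f(x) = (x - 3)^2, the learning rate, and the number of steps are illustrative assumptions chosen for this example, not values taken from the figure above.

```python
# Minimal gradient descent sketch on f(x) = (x - 3)^2, whose minimum is at x = 3.
# The learning rate and step count are illustrative choices.

def f(x):
    return (x - 3) ** 2          # objective function to minimize

def grad_f(x):
    return 2 * (x - 3)           # analytic derivative of f

def gradient_descent(x0, learning_rate=0.1, steps=50):
    x = x0
    for _ in range(steps):
        x = x - learning_rate * grad_f(x)   # step opposite the gradient
    return x

if __name__ == "__main__":
    x_min = gradient_descent(x0=10.0)
    print(f"approximate minimizer: {x_min:.4f}, f(x_min) = {f(x_min):.6f}")
```

Each iteration moves the current estimate a small step in the direction of steepest descent; with a suitably small learning rate, the iterates converge toward the minimizer, which is exactly what the animation depicts.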