Stochastic Gradient Descent (SGD)
An iterative optimization method that updates parameters in the direction opposite the gradient of the objective function, where each update uses a noisy gradient estimate computed from a single randomly chosen sample (or a small mini-batch) rather than the full dataset, trading per-step accuracy for much cheaper updates.
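A minimal sketch of the update rule, assuming a one-parameter least-squares problem (fitting y = w*x); the function name, learning rate, and dataset are illustrative, not from the source:

```python
import random

def sgd_fit(data, lr=0.01, steps=1000, seed=0):
    """Fit y = w*x by SGD on squared error, one random sample per step."""
    rng = random.Random(seed)
    w = 0.0
    for _ in range(steps):
        x, y = rng.choice(data)     # stochastic: gradient from one sample
        grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= lr * grad              # step opposite the gradient
    return w

# Usage: recover w = 3 from noiseless samples of y = 3x.
data = [(x, 3.0 * x) for x in range(1, 6)]
w = sgd_fit(data)
```

Because each gradient comes from one sample, individual steps are noisy, but with a suitable learning rate the iterates still converge toward the minimizer.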