SGD 2

Stochastic Gradient Descent (SGD) is an iterative optimization algorithm used to find local minima of differentiable functions, commonly applied in machine learning to minimize cost functions. It updates model parameters after each training sample, which makes individual updates cheap and frequent; however, because each gradient is estimated from a single sample, the updates are noisy, the loss can oscillate, and overall convergence may require more iterations. The document also compares the advantages and disadvantages of SGD with those of mini-batch and batch gradient descent.
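To make the per-sample update rule concrete, below is a minimal NumPy sketch of SGD for linear regression with a squared-error cost. The function name, learning rate, and epoch count are illustrative assumptions, not taken from the document.

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, epochs=50, seed=0):
    """Fit weights w and bias b, updating after EACH training sample.

    Illustrative sketch: hyperparameters are assumptions, not from the source.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        # Shuffle so each epoch visits the samples in a different order.
        for i in rng.permutation(n):
            xi, yi = X[i], y[i]
            error = (xi @ w + b) - yi   # prediction error on one sample
            w -= lr * error * xi        # gradient of 0.5 * error**2 w.r.t. w
            b -= lr * error             # gradient of 0.5 * error**2 w.r.t. b
    return w, b

# Toy usage: recover y = 3x + 1 from noisy data.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 1 + 0.05 * rng.standard_normal(200)
w, b = sgd_linear_regression(X, y, lr=0.1, epochs=100)
print(w, b)  # approximately [3.0], 1.0
```

For contrast, batch gradient descent would average the gradient over all n samples before making a single update per epoch, while mini-batch gradient descent would do the same over small subsets, trading the noisiness of per-sample updates against the cost of each step.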