📕 subnode [[@KGBicheno/mini-batch_stochastic_gradient_descent_(sgd)]]
in 📚 node [[mini-batch_stochastic_gradient_descent_(sgd)]]
📓 garden/KGBicheno/Artificial Intelligence/Introduction to AI/Week 3 - Introduction/Definitions/Mini-Batch_Stochastic_Gradient_Descent_(Sgd).md by @KGBicheno
mini-batch stochastic gradient descent (SGD)
Go back to the [[AI Glossary]]
A gradient descent algorithm that uses mini-batches: rather than computing the gradient over the full training set, mini-batch SGD estimates it from a small, randomly sampled subset of the training examples at each step. By contrast, vanilla SGD uses a mini-batch of size 1.
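As a rough sketch of the idea (not part of the glossary entry): a minimal NumPy implementation that fits a one-variable linear model with mini-batch SGD. The synthetic data, learning rate, and `batch_size` of 32 are illustrative choices; setting `batch_size = 1` recovers vanilla SGD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 2 plus a little noise.
X = rng.normal(size=(1000, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(scale=0.1, size=1000)

w, b = 0.0, 0.0     # model parameters
lr = 0.1            # learning rate
batch_size = 32     # batch_size = 1 would be vanilla SGD

for epoch in range(20):
    perm = rng.permutation(len(X))            # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]  # one mini-batch
        xb, yb = X[idx, 0], y[idx]
        err = (w * xb + b) - yb
        # Gradient of mean squared error, estimated on the mini-batch only.
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)
        w -= lr * grad_w
        b -= lr * grad_b

print(f"learned w={w:.3f}, b={b:.3f}")  # should approach w=3, b=2
```

The key point the sketch illustrates: each parameter update uses a gradient averaged over only `batch_size` examples, which is cheaper per step than full-batch gradient descent but less noisy than the single-example updates of vanilla SGD.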
📖 stoas
- public document at doc.anagora.org/mini-batch_stochastic_gradient_descent_(sgd)
- video call at meet.jit.si/mini-batch_stochastic_gradient_descent_(sgd)