📕 subnode [[@KGBicheno/mini batch_stochastic_gradient_descent_(sgd)]] in 📚 node [[mini-batch_stochastic_gradient_descent_(sgd)]]

mini-batch stochastic gradient descent (SGD)

Go back to the [[AI Glossary]]

A gradient descent algorithm that uses mini-batches: at each step, the gradient is estimated from a small, randomly drawn subset of the training data rather than the full dataset, trading some gradient accuracy for much cheaper updates. Vanilla SGD uses a mini-batch of size 1.
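A minimal sketch of the idea in NumPy, fitting a one-variable linear model with mean-squared-error loss. The data, learning rate, and batch size are illustrative assumptions, not part of the glossary definition; setting `batch_size = 1` would recover vanilla SGD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from y = 3x + 1 plus noise (assumed for illustration).
X = rng.uniform(-1, 1, size=(200,))
y = 3.0 * X + 1.0 + rng.normal(0, 0.1, size=200)

w, b = 0.0, 0.0      # model parameters
lr = 0.1             # learning rate (assumed)
batch_size = 32      # mini-batch size; 1 would be vanilla SGD

for epoch in range(100):
    idx = rng.permutation(len(X))            # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X[batch], y[batch]
        err = w * xb + b - yb
        # Gradient of MSE, estimated on the mini-batch only
        grad_w = 2 * np.mean(err * xb)
        grad_b = 2 * np.mean(err)
        w -= lr * grad_w
        b -= lr * grad_b

print(w, b)  # should land near the true values 3.0 and 1.0
```

Each update uses only `batch_size` examples, so the gradient is a noisy but cheap estimate of the full-dataset gradient.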

📖 stoas
⥱ context