📕 subnode [[@KGBicheno/mini batch]] in 📚 node [[mini-batch]]

mini-batch

Go back to the [[AI Glossary]]

A small, randomly selected subset of the full set of training examples, run together in a single iteration of training or inference. The batch size of a mini-batch is usually between 10 and 1,000 examples. It is much more efficient to calculate the loss on a mini-batch than on the full training data.
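The idea can be sketched in a few lines of NumPy. This is a minimal illustration, not a reference implementation: the dataset, the linear model, the MSE loss, the batch size of 32, and the learning rate are all hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dataset: 10,000 examples with 5 features each.
X = rng.normal(size=(10_000, 5))
true_w = rng.normal(size=5)
y = X @ true_w + rng.normal(scale=0.1, size=10_000)

w = np.zeros(5)      # parameters of a simple linear model
batch_size = 32      # mini-batch size, within the typical 10-1,000 range
lr = 0.1             # learning rate (arbitrary for this sketch)

for step in range(500):
    # Randomly select a mini-batch of examples for this iteration.
    idx = rng.choice(len(X), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]

    # Gradient of the MSE loss computed on the mini-batch only,
    # instead of on all 10,000 examples.
    grad = 2 * Xb.T @ (Xb @ w - yb) / batch_size
    w -= lr * grad
```

Each gradient step touches only 32 examples rather than the full 10,000, which is the efficiency gain the definition describes; the randomness of the selection keeps the mini-batch gradient an unbiased estimate of the full-data gradient.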
