mini-batch
Go back to the [[AI Glossary]]
A small, randomly selected subset of the full training set processed together in a single iteration of training or inference. The size of a mini-batch typically ranges from 10 to 1,000 examples. Computing the loss (and gradient) on a mini-batch is far more efficient than computing it on the full training data.
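A minimal sketch of the idea in NumPy, assuming a toy dataset, a linear model with a mean-squared-error loss, and an arbitrary learning rate (none of these are part of the definition above); the point is just that each iteration samples a random mini-batch and computes the loss on it instead of on all examples.

```python
import numpy as np

# Hypothetical toy data: 10,000 examples with 20 features each.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 20))
y = rng.normal(size=10_000)
w = np.zeros(20)  # weights of a simple linear model y_hat = X @ w

def mse_loss(w, X, y):
    """Mean squared error of the linear model on the given examples."""
    return np.mean((X @ w - y) ** 2)

batch_size = 32  # typical mini-batch sizes fall between 10 and 1,000

# One training iteration: sample a random mini-batch and compute
# the loss and gradient on it rather than on all 10,000 examples.
idx = rng.choice(len(X), size=batch_size, replace=False)
X_batch, y_batch = X[idx], y[idx]

loss = mse_loss(w, X_batch, y_batch)
grad = 2 * X_batch.T @ (X_batch @ w - y_batch) / batch_size
w -= 0.01 * grad  # plain SGD step with a hypothetical learning rate

print(f"mini-batch loss: {loss:.4f}")
```

Repeating this loop over many random mini-batches is what is usually meant by mini-batch (stochastic) gradient descent.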