batch normalization
Go back to the [[AI Glossary]]
Normalizing the input or output of the activation functions in a hidden layer. Batch normalization can provide the following benefits:

- Make neural networks more stable by protecting against outlier weights.
- Enable higher learning rates.
- Reduce overfitting.
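
As a rough illustration, here is a minimal NumPy sketch of the training-time computation: activations are normalized per feature over the batch, then rescaled with learned parameters. The names `gamma`, `beta`, and `eps` follow the usual convention for the learned scale, learned shift, and numerical-stability constant; this is an assumed sketch, not a reference implementation.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a batch of activations, then scale and shift.

    x:     (batch_size, features) activations from a hidden layer
    gamma: learned scale, shape (features,)
    beta:  learned shift, shape (features,)
    """
    mean = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero-mean, unit-variance activations
    return gamma * x_hat + beta              # learned scale and shift restore capacity

# Example: a batch of 4 examples with 3 hidden units
activations = np.array([[1.0, 50.0, -2.0],
                        [2.0, 60.0, -1.0],
                        [3.0, 55.0,  0.0],
                        [4.0, 65.0,  1.0]])
out = batch_norm(activations, gamma=np.ones(3), beta=np.zeros(3))
```

At inference time, frameworks typically replace the batch statistics with running averages collected during training, so single examples can be normalized consistently.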