📕 subnode [[@KGBicheno/vanishing_gradient_problem]] in 📚 node [[vanishing_gradient_problem]]

vanishing gradient problem

Go back to the [[AI Glossary]]

#seq

The tendency for the gradients of the early hidden layers of some deep neural networks to become very small (near zero). Progressively smaller gradients cause progressively smaller updates to the weights of nodes in a deep neural network, leading to little or no learning. Models suffering from the vanishing gradient problem become difficult or impossible to train. Long Short-Term Memory cells address this issue.
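A minimal sketch of the effect, assuming a toy fully-connected network with sigmoid activations (layer count, width, and weight scale are illustrative choices, not from the source): backpropagating through many sigmoid layers multiplies the gradient by the sigmoid derivative (at most 0.25) at each layer, so the gradient reaching the earliest layer is far smaller than the gradient near the output.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_layers = 40  # deep enough for the shrinkage to compound visibly
dim = 8
# Modest weight scale; each layer's Jacobian then tends to shrink the gradient.
weights = [rng.normal(scale=0.5, size=(dim, dim)) for _ in range(n_layers)]

# Forward pass, storing activations for use in backprop.
a = rng.normal(size=dim)
acts = [a]
for W in weights:
    a = sigmoid(W @ a)
    acts.append(a)

# Backward pass: start from an upstream gradient of ones at the output.
grad = np.ones(dim)
grad_norms = []
for W, a in zip(reversed(weights), reversed(acts[1:])):
    # Chain rule through one sigmoid layer: sigmoid'(z) = a * (1 - a).
    grad = W.T @ (grad * a * (1.0 - a))
    grad_norms.append(np.linalg.norm(grad))

# grad_norms[0] is the layer nearest the output; grad_norms[-1] the earliest layer.
print(f"gradient norm near output:  {grad_norms[0]:.3e}")
print(f"gradient norm at 1st layer: {grad_norms[-1]:.3e}")
```

The earliest layer's gradient norm comes out many orders of magnitude below the output layer's, which is why its weights barely move during training.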

Compare to the [[exploding gradient problem]].

