📚 node [[convergence]]

Convergence


Go back to the [[AI Glossary]]

Informally, convergence refers to a state reached during training in which the training loss and validation loss change very little or not at all from iteration to iteration, after a certain number of iterations. In other words, a model reaches convergence when additional training on the current data will not improve it. In deep learning, loss values sometimes stay constant, or nearly so, for many iterations before finally descending, temporarily producing a false sense of convergence.

See also [[early stopping]].

See also Boyd and Vandenberghe, Convex Optimization.
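A minimal sketch of how convergence is often detected in practice, using a patience-based check in the spirit of early stopping. The function name, parameters, and loss curves below are illustrative assumptions, not part of any particular library:

```python
def has_converged(losses, patience=5, min_delta=1e-4):
    """Return True if the best loss in the last `patience` iterations
    improved by less than `min_delta` over the best loss before them."""
    if len(losses) <= patience:
        return False
    best_before = min(losses[:-patience])
    recent_best = min(losses[-patience:])
    return best_before - recent_best < min_delta

# A loss curve that plateaus: training has effectively converged.
plateau = [1.0, 0.6, 0.4, 0.35, 0.34, 0.34, 0.34, 0.34, 0.34, 0.34]
print(has_converged(plateau))  # True

# A curve that stalls and then descends again. Checked mid-stall with a
# small patience, the test fires early: the "false sense of convergence"
# described above.
stall = [1.0, 0.6, 0.5, 0.5, 0.5, 0.5, 0.3, 0.2]
print(has_converged(stall[:6], patience=3))  # True (fooled by the stall)
print(has_converged(stall, patience=3))      # False (still descending)
```

The `patience` window trades off stopping promptly against being fooled by temporary plateaus; the hypothetical `min_delta` threshold ignores improvements too small to matter.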

📖 stoas
⥱ context
⥅ related node [[2006 07 26 openid bounties and identity convergence]]
⥅ related node [[the agora is a platform for studying convergence]]