📚 node [[convergence]]
Convergence
- [[convergence]] feels great.
- It is a form of [[synthesis]].
- I kind of [[love]] it.
- I dream of a [[world]] of convergence in [[open protocols]], [[open source]], [[open ethics]].
- It is my aspiration to [[build bridges]].
- https://twitter.com/flancian/status/1370824180028047363
📓
garden/KGBicheno/Artificial Intelligence/Introduction to AI/Week 3 - Introduction/Definitions/Convergence.md by @KGBicheno
convergence
Go back to the [[AI Glossary]]
Informally, often refers to a state reached during training in which training loss and validation loss change very little or not at all with each iteration after a certain number of iterations. In other words, a model reaches convergence when additional training on the current data will not improve the model. In deep learning, loss values sometimes stay constant or nearly so for many iterations before finally descending, temporarily producing a false sense of convergence.
See also early stopping.
See also Boyd and Vandenberghe, Convex Optimization.
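The plateau behavior described above can be sketched as a simple convergence check of the kind used for early stopping. This is a minimal illustration, not any particular library's API; the function name, `patience`, and `min_delta` thresholds are all assumptions chosen for the example.

```python
# Minimal sketch of convergence detection, as used for early stopping.
# Training is treated as "converged" when the loss has changed by less than
# min_delta at each of the last `patience` steps. The names and thresholds
# here are illustrative assumptions, not a standard API.

def has_converged(losses, patience=5, min_delta=1e-3):
    """Return True when the last `patience` loss steps each changed
    by less than `min_delta`, i.e. the loss curve has plateaued."""
    if len(losses) <= patience:
        return False  # not enough history to judge
    recent = losses[-(patience + 1):]
    return all(abs(a - b) < min_delta for a, b in zip(recent, recent[1:]))

# An illustrative loss curve that flattens out into a plateau.
plateau = [1.0, 0.5, 0.30, 0.3001, 0.3000, 0.2999, 0.3000, 0.3001]
print(has_converged(plateau))          # plateau detected

# A curve still descending steadily is not flagged as converged.
print(has_converged([1.0, 0.8, 0.6, 0.4, 0.2, 0.1]))
```

As the definition warns, a plateau can be a false sense of convergence: in deep learning the loss may sit flat for many iterations and then descend again, which is why `patience` is usually set generously in practice.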
📖 stoas
- public document at doc.anagora.org/convergence
- video call at meet.jit.si/convergence
⥱ context
↑ pushing here
(none)
↓ pulling this
(none)
⥅ related node [[2006 07 26 openid bounties and identity convergence]]
⥅ related node [[the agora is a platform for studying convergence]]