📕 subnode [[@KGBicheno/cross entropy]]
in 📚 node [[cross-entropy]]
📓
garden/KGBicheno/Artificial Intelligence/Introduction to AI/Week 3 - Introduction/Definitions/Cross-Entropy.md by @KGBicheno
cross-entropy
Go back to the [[AI Glossary]]
A generalization of Log Loss to multi-class classification problems. Cross-entropy quantifies the difference between two probability distributions: intuitively, it measures the average number of bits (or nats) needed to encode events from the true distribution using a code optimized for the predicted distribution. See also [[perplexity]].
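The definition above can be sketched numerically. This is a minimal illustrative example (the function name and sample distributions are my own, not from the glossary): for a one-hot true label, cross-entropy reduces to the negative log of the probability the model assigns to the correct class, which is exactly log loss.

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log(q_i), in nats.

    p: true distribution (e.g. a one-hot label),
    q: predicted distribution over the same classes.
    Terms with p_i == 0 contribute nothing and are skipped.
    """
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

# One-hot true label over 3 classes; the model assigns 0.7 to the true class.
p = [1.0, 0.0, 0.0]
q = [0.7, 0.2, 0.1]
loss = cross_entropy(p, q)  # equals -log(0.7), the binary/multi-class log loss
```

A sharper prediction (e.g. `q = [0.99, 0.005, 0.005]`) gives a smaller loss, and a perfect one gives zero, which is why minimizing cross-entropy pushes the model toward the true distribution.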