garden/KGBicheno/Artificial Intelligence/Introduction to AI/Week 3 - Introduction/Definitions/Rectified_Linear_Unit_(Relu).md by @KGBicheno
Rectified Linear Unit (ReLU)
Go back to the [[AI Glossary]]
An activation function with the following rules:
- If the input is negative or zero, the output is 0.
- If the input is positive, the output is equal to the input.
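The two rules above reduce to max(0, x) applied element-wise. A minimal sketch in NumPy (the function name `relu` is our own choice, not from any particular library):

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: element-wise max(0, x)."""
    # np.maximum broadcasts, so x can be a scalar or an array
    return np.maximum(0, x)

# Negative and zero inputs map to 0; positive inputs pass through unchanged
print(relu(np.array([-2.0, 0.0, 3.5])))  # → [0.  0.  3.5]
```

Deep learning frameworks ship their own versions of this function (for example `torch.nn.functional.relu` in PyTorch or `tf.nn.relu` in TensorFlow), which behave the same way.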