📕 subnode [[@KGBicheno/rectified_linear_unit_(relu)]] in 📚 node [[rectified_linear_unit_(relu)]]

Rectified Linear Unit (ReLU)

Go back to the [[AI Glossary]]

An activation function defined as ReLU(x) = max(0, x), i.e.:

- If the input is negative or zero, the output is 0.
- If the input is positive, the output equals the input.
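The two rules above collapse to a single elementwise maximum. A minimal sketch (NumPy assumed purely for illustration):

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: elementwise max(0, x)."""
    return np.maximum(0, x)

print(relu(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]
```

Because the positive branch passes gradients through unchanged, ReLU avoids the vanishing-gradient saturation of sigmoid/tanh, which is a common reason it is the default hidden-layer activation.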
📖 stoas
⥱ context