Sigmoid

\[y = \frac{1}{1+e^{-x}}\]

Things to note:

- Squashes its input into the range (0, 1).
- Saturated neurons kill the gradient: for large |x| the local gradient is nearly zero (see the sketch below).
- Outputs are not zero-centered.
- exp() is comparatively expensive to compute.
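A minimal NumPy sketch of the formula above (the function name is illustrative, not from the original notes); the second print shows the local gradient sigmoid(x) * (1 - sigmoid(x)) collapsing toward zero where the curve saturates:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid: squashes input into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
y = sigmoid(x)
print(y)            # approaches 0 and 1 at the extremes
print(y * (1 - y))  # local gradient: nearly zero where the curve saturates
```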

Tanh vs. sigmoid

\[y = \tanh(x)\]

Things to note:

- Squashes its input into the range (-1, 1).
- Zero-centered, unlike sigmoid.
- Still saturates and kills gradients for large |x|.
- tanh is a scaled and shifted sigmoid: tanh(x) = 2 * sigmoid(2x) - 1 (verified in the sketch below).
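A short sketch comparing the two: the assert checks the standard identity stated above, and the two prints illustrate the zero-centering difference:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-3.0, 3.0, 7)

# Standard identity: tanh is a scaled, shifted sigmoid.
assert np.allclose(np.tanh(x), 2.0 * sigmoid(2.0 * x) - 1.0)

print(np.tanh(x))   # outputs in (-1, 1), centered on zero
print(sigmoid(x))   # outputs in (0, 1), all positive
```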

Linear activation

\[y = x\]

Things to note:

- A linear (identity) activation adds no expressive power: a stack of linear layers collapses into a single linear map, so the network can only represent linear functions (see the sketch after this list).
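A small sketch of why this is true, with arbitrarily chosen shapes and random weights:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # first "layer"
W2 = rng.standard_normal((2, 4))   # second "layer"
x = rng.standard_normal(3)

# With identity activations, two layers are just one linear map W2 @ W1.
assert np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x)
```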

ReLU

\[y = \max(0, x)\]

Things to note:

- Does not saturate in the positive region.
- Very cheap to compute.
- In practice converges faster than sigmoid or tanh.
- Output is not zero-centered.
- "Dead" ReLUs: a unit whose input stays negative outputs zero and receives zero gradient, so it may never recover (see the sketch below).
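A NumPy sketch of max(0, x) and its local gradient (the helper names are my own, not from the notes); the zero gradient over the non-positive region is what makes dead units possible:

```python
import numpy as np

def relu(x):
    """ReLU: elementwise max(0, x)."""
    return np.maximum(0.0, x)

def relu_grad(x):
    """Local gradient of ReLU: 1 where x > 0, 0 elsewhere."""
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # [0.  0.  0.  0.5 2. ]
print(relu_grad(x))  # zero for every non-positive input -- the "dead" region
```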

Leaky ReLU

\[y = \max(0.1x, x)\]

Things to note:

- Keeps the advantages of ReLU: no saturation in the positive region and cheap to compute.
- Units do not "die": the negative region has a small nonzero slope (0.1 here), so some gradient always flows (see the sketch below).
- The negative slope is a hyperparameter.
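The same sketch adapted for the leaky variant, with the 0.1 slope from the formula exposed as a parameter (the name leaky_relu is illustrative):

```python
import numpy as np

def leaky_relu(x, slope=0.1):
    """Leaky ReLU: elementwise max(slope * x, x), using the 0.1 slope above."""
    return np.maximum(slope * x, x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(leaky_relu(x))  # [-0.2  -0.05  0.    0.5   2.  ] -- negative inputs keep a small gradient
```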