Path: blob/master/2_activation_functions/2_activation_functions.ipynb
Kernel: Python 3
Implementation of activation functions in Python
Sigmoid
In [1]:
In [2]:
Out[2]:
1.0
In [3]:
Out[3]:
0.7310585786300049
In [4]:
Out[4]:
4.780892883885469e-25
In [5]:
Out[5]:
0.6224593312018546
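The code cells above were stripped in this export. A minimal sketch, assuming the standard definition sigmoid(x) = 1 / (1 + e^-x) implemented with `math.exp`; the specific inputs are assumptions inferred from the outputs shown:

```python
import math

def sigmoid(x):
    # Squashes any real input into (0, 1): 1 / (1 + e^-x)
    return 1 / (1 + math.exp(-x))

sigmoid(100)   # → 1.0 (saturates for large positive inputs)
sigmoid(1)     # → 0.7310585786300049
sigmoid(-56)   # → 4.780892883885469e-25 (saturates toward 0 for large negative inputs)
sigmoid(0.5)   # → 0.6224593312018546
```

Note that `math.exp(-x)` overflows for x below about -709, so a production implementation would branch on the sign of x or use a library routine such as `scipy.special.expit`.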
tanh
In [6]:
In [7]:
Out[7]:
-1.0
In [8]:
Out[8]:
1.0
In [9]:
Out[9]:
0.7615941559557649
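The tanh cells were likewise stripped. A sketch assuming the hyperbolic-tangent identity tanh(x) = (e^x - e^-x) / (e^x + e^-x); the inputs are assumptions chosen to reproduce the outputs above (`math.tanh` would give the same results):

```python
import math

def tanh(x):
    # Squashes any real input into (-1, 1); equivalent to math.tanh(x)
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

tanh(-56)  # → -1.0 (saturates for large negative inputs)
tanh(50)   # → 1.0 (saturates for large positive inputs)
tanh(1)    # → 0.7615941559557649
```

Unlike sigmoid, tanh is zero-centered, which is why it is often preferred for hidden layers.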
ReLU
In [10]:
In [15]:
Out[15]:
0
In [14]:
Out[14]:
8
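The ReLU cells were stripped as well. A sketch assuming the usual definition ReLU(x) = max(0, x); the negative input below is an assumption (any negative value yields 0), while the positive call matches the output 8 shown above:

```python
def relu(x):
    # Passes positive inputs through unchanged, clamps negatives to 0
    return max(0, x)

relu(-7)  # → 0
relu(8)   # → 8
```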
Leaky ReLU
In [16]:
In [17]:
Out[17]:
-10.0
In [18]:
Out[18]:
8
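The Leaky ReLU cells were stripped too. A sketch assuming the common form max(0.1·x, x); the slope 0.1 and the input -100 are assumptions inferred from the output -10.0 above:

```python
def leaky_relu(x):
    # Like ReLU, but lets a small fraction (slope 0.1 here) of
    # negative inputs through instead of clamping them to 0
    return max(0.1 * x, x)

leaky_relu(-100)  # → -10.0
leaky_relu(8)     # → 8
```

The small negative slope keeps gradients nonzero for negative inputs, avoiding the "dying ReLU" problem.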