codebasics
GitHub Repository: codebasics/deep-learning-keras-tf-tutorial
Path: blob/master/2_activation_functions/2_activation_functions.ipynb
Kernel: Python 3

Implementation of activation functions in Python

Sigmoid

import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))
sigmoid(100)
1.0
sigmoid(1)
0.7310585786300049
sigmoid(-56)
4.780892883885469e-25
sigmoid(0.5)
0.6224593312018546
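One caveat worth noting: for a large negative input (around x = -710 and below), math.exp(-x) overflows and this implementation raises OverflowError. A common sketch of a numerically stable variant, branching on the sign so math.exp never receives a large positive argument (stable_sigmoid is a name chosen here, not from the original notebook):

```python
import math

def stable_sigmoid(x):
    # for x >= 0, exp(-x) is at most 1, so no overflow
    if x >= 0:
        return 1 / (1 + math.exp(-x))
    # for x < 0, exp(x) is at most 1; rewrite sigmoid as exp(x) / (1 + exp(x))
    e = math.exp(x)
    return e / (1 + e)
```

Both branches compute the same function; only the arrangement of exponentials differs, so very negative inputs smoothly return 0.0 instead of raising an error.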

tanh

def tanh(x):
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))
tanh(-56)
-1.0
tanh(50)
1.0
tanh(1)
0.7615941559557649
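A useful way to see the relationship between the two functions above: tanh is just a sigmoid that has been stretched and shifted to output values in (-1, 1) instead of (0, 1), via the identity tanh(x) = 2·sigmoid(2x) - 1. A quick sketch verifying this against Python's built-in math.tanh (tanh_via_sigmoid is an illustrative name, not from the notebook):

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# tanh is a rescaled, recentered sigmoid: tanh(x) = 2*sigmoid(2x) - 1
def tanh_via_sigmoid(x):
    return 2 * sigmoid(2 * x) - 1
```

In practice you would simply call math.tanh(x); the identity mainly explains why the two activations share the same S-shape but differ in output range.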

ReLU

def relu(x):
    return max(0, x)
relu(-100)
0
relu(8)
8
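In a network, an activation like ReLU is applied elementwise to every output of a layer, not to a single scalar. A minimal sketch of that usage with plain Python lists (relu_layer is a hypothetical helper name for illustration):

```python
def relu(x):
    return max(0, x)

# apply the activation elementwise to a layer's raw outputs
def relu_layer(values):
    return [relu(v) for v in values]
```

Frameworks such as Keras do the same thing with vectorized tensor operations rather than a Python loop.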

Leaky ReLU

def leaky_relu(x):
    return max(0.1 * x, x)
leaky_relu(-100)
-10.0
leaky_relu(8)
8
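The 0.1 factor above is the slope given to negative inputs, and it is usually treated as a tunable parameter (often written alpha; 0.01 is another common choice). A sketch of the same function with the slope exposed as an argument, equivalent to the max-based version for any alpha in (0, 1):

```python
def leaky_relu(x, alpha=0.1):
    # alpha is the slope applied to negative inputs;
    # alpha=0 recovers plain ReLU
    return x if x >= 0 else alpha * x
```

Making the slope explicit also makes the contrast with plain ReLU clear: negative inputs keep a small nonzero gradient instead of being zeroed out.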