Kernel: Python 3
Logistic regression
In this notebook, we illustrate how to perform logistic regression on some small datasets. We compare binary logistic regression as implemented by sklearn with our own implementation, which uses a batch optimizer from scipy and gradients coded by hand. We also show how to compute the gradients automatically with the JAX autodiff package (see the JAX AD colab).
In [1]:
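This cell produces no output and plausibly holds the imports used throughout; the exact contents are not shown, so the following is a minimal sketch, not the original code.

import numpy as np
from scipy.special import expit               # numerically stable sigmoid
from scipy.optimize import minimize
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

import jax
import jax.numpy as jnp
from jax import grad, hessian
from jax.nn import sigmoid, log_sigmoid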
In [2]:
Out[2]:
jax version 0.2.12
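This output evidently comes from printing the installed JAX version, e.g. (the exact formatting is a guess):

print("jax version {}".format(jax.__version__))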
In [3]:
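This cell produces no output and presumably sets up the data. The four fitted coefficients and the 50-element probability vectors printed later are consistent with a binary problem built from the four-feature Iris dataset with a 50-example test split, but that reconstruction, and all variable names below, are assumptions.

# Hypothetical data setup: Iris, recoded as "virginica vs the rest".
iris = load_iris()
X = iris.data                                 # (150, 4) features
y = (iris.target == 2).astype(int)            # binary labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=50, random_state=0)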
In [4]:
Out[4]:
[-4.41378437 -9.11061763 6.53872233 12.68572678]
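Out[4] shows the weights fitted by sklearn. Since only four coefficients appear (one per feature), the model was presumably fit without an intercept and with negligible regularization; a sketch of such a fit:

# A very large C effectively disables the L2 penalty, giving (approximately) the MLE.
log_reg = LogisticRegression(C=1e10, fit_intercept=False, solver="lbfgs")
log_reg.fit(X_train, y_train)

w_mle_sklearn = log_reg.coef_.ravel()         # four weights, one per feature
print(w_mle_sklearn)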
In [5]:
Out[5]:
WARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)
[0.002 0. 1. 0.012 0.002 0. 0. 0.979 0.74 0. 0.706 0.
0. 0. 0. 0.001 1. 0. 0.009 1. 0. 0.65 0. 1.
0.094 0.998 1. 1. 0. 0. 0. 0. 0. 0. 0. 0.998
0. 0. 0. 0. 0.999 0. 0. 0. 0. 0. 0.281 0.909
0. 0.999]
[0.002 0. 1. 0.012 0.002 0. 0. 0.979 0.74 0. 0.706 0.
0. 0. 0. 0.001 1. 0. 0.009 1. 0. 0.65 0. 1.
0.094 0.998 1. 1. 0. 0. 0. 0. 0. 0. 0. 0.998
0. 0. 0. 0. 0.999 0. 0. 0. 0. 0. 0.281 0.909
0. 0.999]
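Out[5] prints two 50-element vectors of predicted test-set probabilities, one from sklearn's predict_proba and one computed by hand from the same weights; they agree to the printed precision. The absl warning is emitted the first time JAX touches a device, which suggests the hand-coded version was run in JAX. A sketch of the comparison, continuing the assumed setup:

# Probabilities of class 1 from sklearn ...
prob_sklearn = log_reg.predict_proba(X_test)[:, 1]

# ... and from the sigmoid of the logits, using the same weights.
prob_manual = sigmoid(jnp.dot(jnp.array(X_test), jnp.array(w_mle_sklearn)))

print(np.round(prob_sklearn, 3))
print(np.round(np.array(prob_manual), 3))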
In [6]:
Out[6]:
0.06907700925379459
0.06907699
nan
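Out[6] prints the same scalar loss three times: apparently the average negative log-likelihood computed with numpy in float64, with JAX in float32 (hence the small discrepancy in the last digits), and with a numerically naive formula that returns nan once the predicted probabilities saturate to exactly 0 or 1 (0 * log 0 is nan). The functions below are a sketch of such a comparison, not the original code:

def NLL_np(w, X, y):
    # Stable NLL in float64: work with log sigma(a) = -log(1 + exp(-a)).
    a = X @ w
    log_p1 = -np.logaddexp(0, -a)             # log sigma(a)
    log_p0 = -np.logaddexp(0, a)              # log (1 - sigma(a))
    return -np.mean(y * log_p1 + (1 - y) * log_p0)

def NLL_jax(w, X, y):
    # Same quantity in JAX, which defaults to float32.
    a = jnp.dot(X, w)
    return -jnp.mean(y * log_sigmoid(a) + (1 - y) * log_sigmoid(-a))

def NLL_naive(w, X, y):
    # Naive formula: once expit saturates, 0 * log(0) produces nan.
    p = expit(X @ w)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

print(NLL_np(w_mle_sklearn, X_test, y_test))
print(NLL_jax(jnp.array(w_mle_sklearn), jnp.array(X_test), jnp.array(y_test)))
print(NLL_naive(w_mle_sklearn, X_test, y_test))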
In [7]:
In [8]:
Out[8]:
[ 3.5801623e-08 7.0655005e-07 -9.9190243e-07 -1.4292980e-06]
[ 2.3841858e-08 6.9618227e-07 -1.0067224e-06 -1.4327467e-06]
[[0.80245787 0.36579472 0.6444712 0.2132109 ]
[0.36579472 0.1684845 0.29427886 0.09809215]
[0.64447117 0.29427886 0.5187084 0.17146072]
[0.21321094 0.09809215 0.17146073 0.05745751]]
[[0.80245805 0.36579484 0.6444712 0.21321094]
[0.36579484 0.1684845 0.29427883 0.09809214]
[0.6444711 0.29427883 0.51870865 0.17146075]
[0.21321093 0.09809214 0.17146075 0.05745751]]
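In Out[8], the first two rows are the gradient of the NLL at the sklearn weights, computed once from the closed-form expression X^T (mu - y) / N and once with jax.grad; both are essentially zero, as expected at the MLE. The two 4x4 matrices are the Hessian X^T diag(mu (1 - mu)) X / N, hand-coded and via jax.hessian. A sketch of these checks under the same assumptions:

def NLL_grad(w, X, y):
    # Hand-coded gradient: X^T (mu - y) / N, with mu = sigma(X w).
    mu = sigmoid(jnp.dot(X, w))
    return jnp.dot(X.T, mu - y) / X.shape[0]

def NLL_hess(w, X, y):
    # Hand-coded Hessian: X^T diag(mu (1 - mu)) X / N.
    mu = sigmoid(jnp.dot(X, w))
    return jnp.dot(X.T * (mu * (1 - mu)), X) / X.shape[0]

w = jnp.array(w_mle_sklearn)
Xtr, ytr = jnp.array(X_train), jnp.array(y_train)

print(NLL_grad(w, Xtr, ytr))                  # manual gradient
print(grad(NLL_jax)(w, Xtr, ytr))             # autodiff gradient
print(NLL_hess(w, Xtr, ytr))                  # manual Hessian
print(hessian(NLL_jax)(w, Xtr, ytr))          # autodiff Hessian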
In [9]:
Out[9]:
parameters from sklearn [-4.41378437 -9.11061763 6.53872233 12.68572678]
parameters from scipy-bfgs [-4.43822388 -9.04306242 6.52521732 12.7028332 ]
[0.002 0. 1. 0.012 0.002 0. 0. 0.979 0.732 0. 0.711 0.
0. 0. 0. 0.001 1. 0. 0.009 1. 0. 0.654 0. 1.
0.095 0.998 1. 1. 0. 0. 0. 0. 0. 0. 0. 0.998
0. 0. 0. 0. 0.999 0. 0. 0. 0. 0. 0.279 0.91
0. 0.999]
[0.002 0. 1. 0.012 0.002 0. 0. 0.979 0.74 0. 0.706 0.
0. 0. 0. 0.001 1. 0. 0.009 1. 0. 0.65 0. 1.
0.094 0.998 1. 1. 0. 0. 0. 0. 0. 0. 0. 0.998
0. 0. 0. 0. 0.999 0. 0. 0. 0. 0. 0.281 0.909
0. 0.999]
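The final step fits the same model with a batch optimizer from scipy, passing the hand-coded NLL and gradient to BFGS, and then compares the resulting weights and test-set probabilities with sklearn's; they agree closely. A sketch of that step, reusing NLL_np from above:

def NLL_grad_np(w, X, y):
    # Hand-coded gradient in numpy, for scipy's optimizer.
    mu = expit(X @ w)
    return X.T @ (mu - y) / X.shape[0]

w_init = np.zeros(X_train.shape[1])
res = minimize(NLL_np, w_init, args=(X_train, y_train),
               jac=NLL_grad_np, method="BFGS")
w_mle_scipy = res.x

print("parameters from sklearn", w_mle_sklearn)
print("parameters from scipy-bfgs", w_mle_scipy)

# Nearly identical predictive probabilities on the test set.
print(np.round(expit(X_test @ w_mle_scipy), 3))
print(np.round(expit(X_test @ w_mle_sklearn), 3))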