Kernel: Python 3
Pyro is a probabilistic programming system built on top of PyTorch. It supports posterior inference based on MCMC and stochastic variational inference; discrete latent variables can be marginalized out exactly using dynamic programming.
In [1]:
Out[1]:
Collecting pyro-ppl
Downloading https://files.pythonhosted.org/packages/aa/7a/fbab572fd385154a0c07b0fa138683aa52e14603bb83d37b198e5f9269b1/pyro_ppl-1.6.0-py3-none-any.whl (634kB)
|████████████████████████████████| 634kB 5.4MB/s
Requirement already satisfied: torch>=1.8.0 in /usr/local/lib/python3.7/dist-packages (from pyro-ppl) (1.8.1+cu101)
Collecting pyro-api>=0.1.1
Downloading https://files.pythonhosted.org/packages/fc/81/957ae78e6398460a7230b0eb9b8f1cb954c5e913e868e48d89324c68cec7/pyro_api-0.1.2-py3-none-any.whl
Requirement already satisfied: numpy>=1.7 in /usr/local/lib/python3.7/dist-packages (from pyro-ppl) (1.19.5)
Requirement already satisfied: tqdm>=4.36 in /usr/local/lib/python3.7/dist-packages (from pyro-ppl) (4.41.1)
Requirement already satisfied: opt-einsum>=2.3.2 in /usr/local/lib/python3.7/dist-packages (from pyro-ppl) (3.3.0)
Requirement already satisfied: typing-extensions in /usr/local/lib/python3.7/dist-packages (from torch>=1.8.0->pyro-ppl) (3.7.4.3)
Installing collected packages: pyro-api, pyro-ppl
Successfully installed pyro-api-0.1.2 pyro-ppl-1.6.0
In [47]:
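A minimal sketch of the setup assumed by the examples below (the specific imports and seed are assumptions):

import torch
import pyro
import pyro.distributions as dist

pyro.set_rng_seed(0)  # assumed seed, for reproducibility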
Example: inferring mean of a 1d Gaussian
We use the simple example from the Pyro intro. The goal is to infer the weight $\theta$ of an object from noisy measurements $y_n$. We assume the following model:
$$\theta \sim \mathcal{N}(\mu_0, \sigma_0^2), \qquad y_n \mid \theta \sim \mathcal{N}(\theta, \sigma^2)$$
where $\mu_0 = 8.5$ is the initial guess, $\sigma_0 = 1$, and $\sigma = 0.75$ is the measurement noise.
In [110]:
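A sketch of the model in Pyro, using the values from the Pyro intro (guess 8.5, prior scale 1.0, measurement noise 0.75, single observation 9.5); the variable names are assumptions:

import torch
import pyro
import pyro.distributions as dist

mu0, sigma0 = 8.5, 1.0     # prior on the weight theta
sigma = 0.75               # measurement noise
y_obs = torch.tensor(9.5)  # the single noisy measurement

def model(y=None):
    # theta ~ N(mu0, sigma0^2); y | theta ~ N(theta, sigma^2)
    theta = pyro.sample("theta", dist.Normal(mu0, sigma0))
    return pyro.sample("obs", dist.Normal(theta, sigma), obs=y)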
Exact inference
By Bayes' rule for Gaussians, the exact posterior given a single observation $y = 9.5$ is
$$p(\theta \mid y) = \mathcal{N}(\theta \mid \mu_N, \sigma_N^2), \qquad \sigma_N^2 = \Big(\frac{1}{\sigma_0^2} + \frac{1}{\sigma^2}\Big)^{-1}, \qquad \mu_N = \sigma_N^2 \Big(\frac{\mu_0}{\sigma_0^2} + \frac{y}{\sigma^2}\Big).$$
In [93]:
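The closed-form update can be computed directly; a sketch, continuing from the model sketch above (the printed values match the output below):

post_var = 1.0 / (1.0 / sigma0**2 + 1.0 / sigma**2)
post_mean = post_var * (mu0 / sigma0**2 + 9.5 / sigma**2)
print(round(post_mean, 2))
print(round(post_var ** 0.5, 2))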
Out[93]:
9.14
0.6
Ancestral sampling
In [68]:
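A sketch of ancestral sampling from the joint distribution: each printed pair is interpreted here as one draw of (theta, y), first from the prior and then from the likelihood (this reading of the output is an assumption):

for _ in range(5):
    theta = dist.Normal(mu0, sigma0).sample()   # theta ~ prior
    y = dist.Normal(theta, sigma).sample()      # y ~ likelihood given theta
    print([theta, y])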
Out[68]:
[tensor(9.1529), tensor(8.7116)]
[tensor(8.7306), tensor(9.3978)]
[tensor(9.0740), tensor(9.4240)]
[tensor(7.3040), tensor(7.8569)]
[tensor(7.8939), tensor(8.0257)]
MCMC
In [100]:
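A sketch of an MCMC run with a NUTS kernel; num_samples=1000 and warmup_steps=50 are assumptions chosen to match the 1050-iteration progress bar and the sample shape reported below:

from pyro.infer import MCMC, NUTS

nuts_kernel = NUTS(model)
mcmc = MCMC(nuts_kernel, num_samples=1000, warmup_steps=50)
mcmc.run(y_obs)   # condition on the observed measurement
print(type(mcmc))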
Out[100]:
Sample: 100%|██████████| 1050/1050 [00:03, 326.67it/s, step size=1.30e+00, acc. prob=0.880]
<class 'pyro.infer.mcmc.api.MCMC'>
In [103]:
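Continuing from the MCMC sketch above: the posterior draws are returned as a dict keyed by site name.

samples = mcmc.get_samples()
print(type(samples))
print(samples.keys())
print(samples["theta"].shape)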
Out[103]:
<class 'dict'>
dict_keys(['theta'])
torch.Size([1000])
In [104]:
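Convergence diagnostics (effective sample size, R-hat, divergences, acceptance rate) are available from the same MCMC object:

mcmc.diagnostics()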
Out[104]:
{'acceptance rate': {'chain 0': 0.924},
'divergences': {'chain 0': []},
'theta': OrderedDict([('n_eff', tensor(500.2368)),
('r_hat', tensor(1.0050))])}
In [105]:
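The posterior mean and standard deviation can be computed from the draws; a sketch:

theta_samples = samples["theta"].numpy()
print(theta_samples.mean())
print(theta_samples.std())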
Out[105]:
9.152181
0.625822
Variational inference
For the guide (approximate posterior), we use a Gaussian (pyro.distributions.Normal) with a learnable mean and scale.
In [60]:
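A sketch of the guide and the SVI training loop; the parameter names, initial values, learning rate, and number of steps are all assumptions:

from torch.distributions import constraints
from pyro.infer import SVI, Trace_ELBO
from pyro.optim import Adam

def guide(y=None):
    # q(theta) = Normal(q_loc, q_scale) with learnable parameters
    loc = pyro.param("q_loc", torch.tensor(8.5))
    scale = pyro.param("q_scale", torch.tensor(1.0),
                       constraint=constraints.positive)
    pyro.sample("theta", dist.Normal(loc, scale))

pyro.clear_param_store()
svi = SVI(model, guide, Adam({"lr": 0.01}), loss=Trace_ELBO())
for step in range(2000):
    svi.step(y_obs)

print([pyro.param("q_loc").item(), pyro.param("q_scale").item()])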
Out[60]:
/usr/local/lib/python3.7/dist-packages/ipykernel_launcher.py:5: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
"""
[9.205821990966797, 0.6176195740699768]
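Example: inferring bias of a coin
The second example infers the probability of heads, theta, from 10 coin tosses with 6 heads and 4 tails, using a Bernoulli likelihood and (judging from the exact posterior reported below) a Beta(10, 10) prior. A sketch of a cell creating this dataset:

import torch

data = torch.tensor([1., 1., 1., 1., 1., 1., 0., 0., 0., 0.])
print(list(data))
print([x.item() for x in data])
print([data.sum().item(), (len(data) - data.sum()).item()])  # [heads, tails]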
In [125]:
In [124]:
Out[124]:
[tensor(1.), tensor(1.), tensor(1.), tensor(1.), tensor(1.), tensor(1.), tensor(0.), tensor(0.), tensor(0.), tensor(0.)]
[1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]
[6.0, 4.0]
In [116]:
Out[116]:
[tensor(1.), tensor(1.), tensor(1.), tensor(1.), tensor(1.), tensor(1.), tensor(0.), tensor(0.), tensor(0.), tensor(0.)]
[1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]
Exact inference
The posterior is given by
$$p(\theta \mid \mathcal{D}) = \mathrm{Beta}(\theta \mid \alpha_0 + N_1, \beta_0 + N_0)$$
where $N_1 = 6$ and $N_0 = 4$ are the observed counts of heads and tails, and $(\alpha_0, \beta_0) = (10, 10)$ is the prior; this gives $\mathrm{Beta}(\theta \mid 16, 14)$.
In [91]:
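The conjugate update and the posterior moments can be computed directly; a sketch using scipy (the printed values match the output below; the Beta(10, 10) prior is an inference from those values):

from scipy.stats import beta as beta_dist

alpha0, beta0 = 10.0, 10.0   # assumed prior
N1, N0 = 6.0, 4.0            # observed heads and tails
alpha_post, beta_post = alpha0 + N1, beta0 + N0
print(f"exact posterior: alpha={alpha_post:.3f}, beta={beta_post:.3f}")
post = beta_dist(alpha_post, beta_post)
print([post.mean(), post.std()])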
Out[91]:
exact posterior: alpha=16.000, beta=14.000
[0.5333333333333333, 0.08960286733763294]
MCMC
In [114]:
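A sketch of the Beta-Bernoulli model and a NUTS run, continuing from the imports and data above; the sampler settings again match the 1050-iteration progress bar:

def model_coin(data):
    # theta ~ Beta(10, 10); each toss is Bernoulli(theta)
    theta = pyro.sample("theta", dist.Beta(10.0, 10.0))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Bernoulli(theta), obs=data)

mcmc = MCMC(NUTS(model_coin), num_samples=1000, warmup_steps=50)
mcmc.run(data)
print(mcmc.diagnostics())
print(mcmc.get_samples()["theta"].shape)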
Out[114]:
Sample: 100%|██████████| 1050/1050 [00:09, 111.12it/s, step size=1.50e+00, acc. prob=0.803]
{'theta': OrderedDict([('n_eff', tensor(443.8569)), ('r_hat', tensor(0.9992))]), 'divergences': {'chain 0': []}, 'acceptance rate': {'chain 0': 0.864}}
torch.Size([1000])
In [115]:
Out[115]:
0.5330437
0.09079484
In [126]:
Out[126]:
Sample: 100%|██████████| 1050/1050 [00:08, 117.55it/s, step size=9.57e-01, acc. prob=0.919]
{'theta': OrderedDict([('n_eff', tensor(269.4737)), ('r_hat', tensor(0.9990))]), 'divergences': {'chain 0': []}, 'acceptance rate': {'chain 0': 0.951}}
torch.Size([1000])
In [127]:
Out[127]:
0.48617417
0.112258926
Variational inference
In [82]:
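A sketch of SVI with a Beta guide, q(theta) = Beta(alpha_q, beta_q), reusing the imports and model from the cells above; parameter names, initial values, and optimizer settings are assumptions:

from torch.distributions import constraints

def guide_coin(data):
    alpha_q = pyro.param("alpha_q", torch.tensor(15.0),
                         constraint=constraints.positive)
    beta_q = pyro.param("beta_q", torch.tensor(15.0),
                        constraint=constraints.positive)
    pyro.sample("theta", dist.Beta(alpha_q, beta_q))

pyro.clear_param_store()
svi = SVI(model_coin, guide_coin, Adam({"lr": 0.001}), loss=Trace_ELBO())
for step in range(2000):
    svi.step(data)

alpha_q = pyro.param("alpha_q").item()
beta_q = pyro.param("beta_q").item()
print(f"variational posterior: alpha={alpha_q:.3f}, beta={beta_q:.3f}")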
In [89]:
Out[89]:
In [92]:
Out[92]:
variational posterior: alpha=15.414, beta=14.094
[0.5223745147578196, 0.09043264875842827]
In [ ]: