
Purpose

Just testing my intuition w.r.t. degrees of freedom in the Student's t-distribution.

  • Cauchy: df = 1.

  • Normal: df → infinity (in practice, the inferred value should just be very large).

This should be reflected in the inferred ν when fitting with PyMC3. The quick check below first confirms the two limits for the distributions themselves.
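As a sanity check (an aside not in the original notebook), scipy.stats confirms both limits directly:

import numpy as np
from scipy import stats

x = np.linspace(-5, 5, 11)

# A Student's t with df=1 is exactly the standard Cauchy...
np.testing.assert_allclose(stats.t(df=1).pdf(x), stats.cauchy().pdf(x))

# ...and as df grows it converges to the standard normal.
np.testing.assert_allclose(stats.t(df=1e6).pdf(x), stats.norm().pdf(x), atol=1e-6)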

import pymc3 as pm
import numpy as np
import matplotlib.pyplot as plt
import arviz as az

%load_ext autoreload
%autoreload 2
%matplotlib inline
%config InlineBackend.figure_format = 'retina'
The autoreload extension is already loaded. To reload it, use: %reload_ext autoreload
# Draw 20,000 samples from a standard normal and a standard Cauchy.
normal = np.random.normal(size=20000)
cauchy = np.random.standard_cauchy(size=20000)
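A quick look at the extremes (an added aside) shows why these two samples should pull ν in opposite directions: the Cauchy draws contain wild outliers that only a very heavy-tailed t can accommodate.

# The largest |value| in 20,000 standard normal draws is typically around 4;
# for the Cauchy it is often in the hundreds or thousands (no finite moments).
print(np.abs(normal).max())
print(np.abs(cauchy).max())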
with pm.Model() as normal_model:
    # Weakly informative priors on location, scale, and degrees of freedom.
    mu = pm.Normal("mu", mu=0, sd=100)
    sd = pm.HalfNormal("sd", sd=100)
    nu = pm.Exponential("nu", lam=0.5)
    like = pm.StudentT("like", mu=mu, sd=sd, nu=nu, observed=normal)
    trace = pm.sample(2000, tune=2000)
Auto-assigning NUTS sampler...
Initializing NUTS using jitter+adapt_diag...
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [nu, sd, mu]
Sampling 4 chains, 0 divergences: 100%|██████████| 16000/16000 [00:20<00:00, 790.93draws/s]
axes = az.plot_trace(trace)
[Trace plot: posteriors for mu, sd, and nu from the normal-data model]

The inferred ν is large for the normal data, which makes sense: a Student's t with many degrees of freedom is nearly indistinguishable from a normal.
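To put a number on "large" (a hypothetical follow-up; this cell and its output are not part of the original notebook), the posterior for ν can be summarized directly:

# Posterior summary for nu; expect a large mean with normal data.
print(az.summary(trace, var_names=["nu"]))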

with pm.Model() as cauchy_model:
    # Same model, now fit to the Cauchy draws (note lam=1 here rather than 0.5).
    mu = pm.Normal("mu", mu=0, sd=100)
    sd = pm.HalfNormal("sd", sd=100)
    nu = pm.Exponential("nu", lam=1)
    like = pm.StudentT("like", mu=mu, sd=sd, nu=nu, observed=cauchy)
    trace = pm.sample(2000, tune=2000)
Auto-assigning NUTS sampler...
Initializing NUTS using jitter+adapt_diag...
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [nu, sd, mu]
Sampling 4 chains, 0 divergences: 100%|██████████| 16000/16000 [00:22<00:00, 709.93draws/s]
axes = az.plot_trace(trace)
[Trace plot: posteriors for mu, sd, and nu from the Cauchy-data model]

The inferred ν sits at essentially 1 for the Cauchy-distributed data, matching the fact that a Student's t with ν = 1 is exactly a Cauchy. Yes 😃.
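As with the normal case, this can be checked numerically (an added aside; output not from the original run):

# Posterior draws of nu should concentrate near 1 for Cauchy data.
nu_samples = trace["nu"]
print(nu_samples.mean(), np.percentile(nu_samples, [2.5, 97.5]))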