# Social Media Posts: Bayesian Inference with MCMC
# Generated from: bayesian_inference_mcmc.ipynb

================================================================================
## SHORT-FORM POSTS
================================================================================

### Twitter/X (< 280 chars)
--------------------------------------------------------------------------------
What if you could update your beliefs mathematically?

Bayesian inference does exactly that. Built an MCMC sampler from scratch to estimate hidden parameters from noisy data.

P(θ|D) ∝ P(D|θ)·P(θ)

#Python #BayesianStats #MCMC #DataScience

--------------------------------------------------------------------------------

### Bluesky (< 300 chars)
--------------------------------------------------------------------------------
Implemented Metropolis-Hastings MCMC from scratch to demonstrate Bayesian inference.

Given 100 noisy observations, the sampler recovered the true parameters (μ=5, σ=2) with tight 95% credible intervals.

The posterior tells the full story of parameter uncertainty.

#Statistics #Python #MachineLearning

--------------------------------------------------------------------------------

### Threads (< 500 chars)
--------------------------------------------------------------------------------
Ever wondered how scientists update their beliefs with new data?

That's Bayesian inference in a nutshell.

I built a Markov Chain Monte Carlo sampler from scratch. Fed it 100 noisy measurements. It figured out the hidden parameters behind the data.

The magic: you don't just get point estimates. You get full probability distributions showing exactly how certain (or uncertain) you should be.

The acceptance rate hit 23% - right around the classic ~23.4% tuning target for random-walk samplers.

Full interactive notebook in my bio.

--------------------------------------------------------------------------------

### Mastodon (< 500 chars)
--------------------------------------------------------------------------------
New computational notebook: Bayesian Inference with MCMC

Implemented Metropolis-Hastings from scratch to infer the parameters of a normal distribution.

Key results:
• 50,000 iterations, 10,000 burn-in
• Acceptance rate: 23% (near the classic ~23.4% target)
• ESS shows good mixing
• Posterior predictive checks validate model fit

The posterior captures full uncertainty:
μ: mean ≈ 5.0, 95% credible interval contains the true value
σ: mean ≈ 2.0, 95% credible interval contains the true value

Bayes' theorem: P(θ|D) ∝ P(D|θ)·P(θ)

#Statistics #Bayesian #MCMC #Python #Science

--------------------------------------------------------------------------------

================================================================================
## LONG-FORM POSTS
================================================================================

### Reddit (r/learnpython or r/statistics)
--------------------------------------------------------------------------------
**Title:** I built an MCMC sampler from scratch to learn Bayesian inference - here's what clicked

**Body:**

Hey everyone! I've been learning Bayesian statistics and finally built something that made it all click: a Metropolis-Hastings MCMC sampler from scratch in Python (numpy/scipy only, no PyMC or Stan).

**The Problem:**
You have 100 noisy measurements. You know they came from a normal distribution, but you don't know the mean (μ) or standard deviation (σ). How do you figure them out AND quantify your uncertainty?

**The Bayesian Approach:**
Instead of point estimates, Bayes gives you probability distributions:

P(θ|D) ∝ P(D|θ) · P(θ)

- P(θ|D) = posterior (what we want - our updated beliefs)
- P(D|θ) = likelihood (how probable our data is given the parameters)
- P(θ) = prior (what we believed before seeing the data)

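In code, that right-hand side is just a function you can evaluate. Here's a minimal sketch (working in log space to avoid numerical underflow; the priors are illustrative placeholders, not necessarily the ones in the notebook):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = rng.normal(loc=5.0, scale=2.0, size=100)  # 100 noisy draws, true μ=5, σ=2

def log_posterior(mu, sigma, data):
    """Unnormalized log-posterior: log P(D|θ) + log P(θ)."""
    if sigma <= 0:  # σ must be positive
        return -np.inf
    log_likelihood = np.sum(stats.norm.logpdf(data, loc=mu, scale=sigma))
    # Illustrative weakly-informative priors (swap in your own):
    log_prior = (stats.norm.logpdf(mu, loc=0, scale=10)
                 + stats.halfnorm.logpdf(sigma, scale=10))
    return log_likelihood + log_prior
```
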
**The Challenge:**
Computing the posterior exactly requires a normalizing integral that's often impossible to solve. That's where MCMC comes in - it only ever needs *ratios* of posterior densities, so the nasty integral cancels out and you can sample from the posterior without ever computing it.

**How Metropolis-Hastings Works (ELI5):**
1. Start somewhere in parameter space
2. Propose a random step
3. If the new spot has higher probability, accept it
4. If lower, accept it only sometimes - with probability equal to the ratio of new to current
5. Repeat 50,000 times
6. The places you visit most often ARE the posterior distribution

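Those six steps translate almost line-for-line into code. A bare-bones sketch (step size, seed, and starting point are placeholders you'd tune; `log_posterior` is the function above):

```python
def metropolis_hastings(log_post, data, n_iter=50_000, step=0.3, seed=0):
    """Random-walk Metropolis for (μ, σ). Returns samples and acceptance rate."""
    rng = np.random.default_rng(seed)
    theta = np.array([0.0, 1.0])             # 1. start somewhere
    current_lp = log_post(*theta, data)
    samples = np.empty((n_iter, 2))
    n_accept = 0
    for i in range(n_iter):
        proposal = theta + rng.normal(scale=step, size=2)  # 2. random step
        proposal_lp = log_post(*proposal, data)
        # 3 & 4: always accept uphill moves; accept downhill ones with
        # probability equal to the density ratio (compared in log space)
        if np.log(rng.random()) < proposal_lp - current_lp:
            theta, current_lp = proposal, proposal_lp
            n_accept += 1
        samples[i] = theta                    # 5. repeat...
    return samples, n_accept / n_iter         # 6. visit counts trace the posterior

samples, acc_rate = metropolis_hastings(log_posterior, data)
posterior = samples[10_000:]                  # drop burn-in
```
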
**Results:**
- True values: μ=5.0, σ=2.0
- Posterior recovered both with tight 95% credible intervals
- Acceptance rate hit 23% (close to the classic ~23.4% target for random-walk Metropolis)
- Trace plots show good mixing after burn-in

**What I Learned:**
- Burn-in matters - the first 10,000 samples are garbage while the chain finds its footing
- Proposal step size is crucial - too small = slow exploration, too big = lots of rejections
- Effective Sample Size (ESS) tells you how many independent samples you really have, since consecutive MCMC draws are correlated
- Posterior predictive checks are essential - sample from your posterior, generate fake data, compare to real data

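If you want to compute ESS yourself rather than reach for a library, a rough-and-ready version built from the chain's autocorrelation looks like this (simplified - proper estimators like Geyer's initial-sequence method are more careful about the truncation):

```python
def effective_sample_size(chain):
    """Crude ESS: N / (1 + 2·Σ autocorrelations), truncated at the first negative lag."""
    chain = np.asarray(chain, dtype=float)
    n = len(chain)
    centered = chain - chain.mean()
    acf = np.correlate(centered, centered, mode="full")[n - 1:]  # lags 0..n-1
    acf = acf / acf[0]                 # normalize so the lag-0 term is 1
    tau = 1.0                          # integrated autocorrelation time
    for rho in acf[1:]:
        if rho < 0:                    # simple cutoff: stop at first negative lag
            break
        tau += 2.0 * rho
    return n / tau

print(effective_sample_size(posterior[:, 0]))  # ESS for μ
print(effective_sample_size(posterior[:, 1]))  # ESS for σ
```
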
Check out the full interactive notebook with code and visualizations:
https://cocalc.com/github/Ok-landscape/computational-pipeline/blob/main/notebooks/published/bayesian_inference_mcmc.ipynb

Happy to answer questions about the implementation!

--------------------------------------------------------------------------------

### Facebook (< 500 chars)
--------------------------------------------------------------------------------
Ever wonder how scientists deal with uncertainty?

I built a computer program that learns from data the Bayesian way. Instead of saying "the answer is exactly 5," it says "the answer is probably between 4.6 and 5.4, and here's how confident I am."

Fed it 100 noisy measurements. It figured out the hidden truth AND told me exactly how sure it was.

This is how modern AI handles uncertainty too.

Interactive notebook: https://cocalc.com/github/Ok-landscape/computational-pipeline/blob/main/notebooks/published/bayesian_inference_mcmc.ipynb

--------------------------------------------------------------------------------

### LinkedIn (< 1000 chars)
--------------------------------------------------------------------------------
Bayesian Inference & MCMC: A Hands-On Implementation

Just completed a deep dive into Bayesian statistics by implementing a Metropolis-Hastings MCMC sampler from scratch.

The Project:
Built a computational notebook demonstrating how to infer unknown parameters from noisy data using only numpy, scipy, and matplotlib - no black-box probabilistic programming libraries.

Technical Highlights:
• Implemented the full Metropolis-Hastings algorithm with random-walk proposals
• Achieved a ~23% acceptance rate, in line with the classic ~23.4% target for random-walk Metropolis
• Developed convergence diagnostics: trace plots, autocorrelation analysis, ESS calculation
• Validated results with posterior predictive checks

Key Insight:
The power of Bayesian inference lies in uncertainty quantification. Rather than point estimates, we get full posterior distributions - invaluable for decision-making under uncertainty.

This methodology underpins modern applications in A/B testing, clinical trials, risk assessment, and probabilistic machine learning.

View the interactive notebook with full code and visualizations:
https://cocalc.com/github/Ok-landscape/computational-pipeline/blob/main/notebooks/published/bayesian_inference_mcmc.ipynb

#BayesianStatistics #DataScience #MachineLearning #Python #MCMC #Statistics

--------------------------------------------------------------------------------

### Instagram (< 500 chars, assumes plot.png as image)
--------------------------------------------------------------------------------
Bayesian inference visualized

These plots show a computer learning from data.

Top: Watch the algorithm explore parameter space over 50,000 steps

Middle: The final answer isn't a single number - it's a probability distribution showing all plausible values

Bottom: Every point is a sample from the "posterior" - our updated beliefs after seeing data

The red stars mark the true hidden values. The algorithm found them.

This is how modern AI quantifies uncertainty.

.
.
.
#bayesian #statistics #datascience #python #mcmc #machinelearning #datavisualization #coding #probability #science

--------------------------------------------------------------------------------