Path: blob/master/notebooks/book2/04/gibbs_demo_potts.ipynb
Kernel: Python 3.7.13 ('py3713')
Gibbs sampling for a Potts model on a 2D lattice
Ming Liang Ang.
The math behind the model
The Potts model
To efficiently compute the logits for all the different states of our Potts model, we use a convolution. The idea is to first represent each site's state as a one-hot vector and then apply a convolution over the one-hot channels to compute the logits.
An example
Each entry of the resulting matrix counts how many neighbours of the corresponding site share its value.
For more than two states, we represent the state matrix as a 3D tensor, which you can picture as the state matrix with each element replaced by a one-hot vector.
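As a concrete sketch of this idea (not the notebook's own code), the following works through a toy 3x3 lattice; the state values, coupling `J`, and variable names are assumptions:

```python
import jax.numpy as jnp
from jax import nn
from jax.scipy.signal import convolve2d

K = 3                                    # number of Potts states (assumed)
state = jnp.array([[0, 1, 1],
                   [2, 1, 0],
                   [0, 0, 2]])           # toy 3x3 lattice
one_hot = nn.one_hot(state, K)           # shape (3, 3, K)

kernel = jnp.array([[0., 1., 0.],
                    [1., 0., 1.],
                    [0., 1., 0.]])       # 4-nearest-neighbour stencil

# counts[i, j, k] = number of neighbours of site (i, j) currently in state k
counts = jnp.stack(
    [convolve2d(one_hot[..., k], kernel, mode="same") for k in range(K)],
    axis=-1)

J = 1.0                                  # coupling strength (assumed)
logits = J * counts                      # per-site unnormalised log-probabilities
```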
Import libraries
In [1]:
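The contents of this cell are not shown; a plausible minimal set of imports, given the JAX code and tqdm progress bars in the outputs below, would be:

```python
import jax
import jax.numpy as jnp
from jax import nn
from jax.scipy.signal import convolve2d
import matplotlib.pyplot as plt
from tqdm import tqdm                    # source of the progress bars shown below
```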
RNG key
In [2]:
Out[2]:
WARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)
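A minimal sketch of this cell, assuming a fixed seed: JAX has no global random state, so an explicit PRNG key is created and split before each use.

```python
key = jax.random.PRNGKey(0)              # seed value is an assumption
key, subkey = jax.random.split(key)      # split off a fresh subkey for each random op
```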
The number of states and the size of the 2D grid
In [3]:
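The cell body is not shown. The sampled array in Out[15] contains values 0 through 9, so there are ten states; the grid size below is an assumption.

```python
K = 10             # number of Potts states (values 0-9 appear in the output)
ix, iy = 128, 128  # height and width of the 2D grid (assumed)
```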
The convolutional kernel for computing the energy of the Markov blanket of each node
In [4]:
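A sketch of the kernel, assuming a 4-nearest-neighbour Markov blanket: convolving each one-hot channel with it counts, for every site, the neighbours that are in that state.

```python
kernel = jnp.array([[0., 1., 0.],
                    [1., 0., 1.],
                    [0., 1., 0.]])       # 4-nearest-neighbour stencil, centre excluded
```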
Creating the checkerboard
In [5]:
In [6]:
In [7]:
In [12]:
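A minimal sketch of a checkerboard mask (names assumed). Sites of one colour are conditionally independent given the other colour, so each colour can be updated in parallel during a Gibbs half-sweep.

```python
def checkerboard(shape):
    """Boolean mask that is True on one colour of the checkerboard."""
    return jnp.indices(shape).sum(axis=0) % 2 == 0

mask = checkerboard((ix, iy))
```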
Running the test
In [13]:
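The notebook's test is not shown; a small sanity check of the neighbour-count convolution, along the lines described above, might look like this (assumed, not the original test):

```python
grid = jnp.zeros((4, 4), dtype=jnp.int32)   # constant grid, every site in state 0
counts = convolve2d(nn.one_hot(grid, K)[..., 0], kernel, mode="same")
# interior sites have 4 same-state neighbours, edge sites 3, corner sites 2
assert counts[1, 1] == 4 and counts[0, 1] == 3 and counts[0, 0] == 2
```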
Running the model
In [14]:
In [15]:
Out[15]:
100%|██████████| 2/2 [00:00<00:00, 7.96it/s]
DeviceArray([[9, 3, 4, ..., 5, 9, 9],
[3, 2, 3, ..., 3, 8, 5],
[4, 0, 3, ..., 3, 7, 6],
...,
[5, 5, 5, ..., 8, 2, 5],
[5, 5, 7, ..., 3, 2, 3],
[4, 9, 5, ..., 6, 7, 3]], dtype=int32)
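A hedged sketch of one checkerboard Gibbs sweep (function and variable names are illustrative, not the notebook's own): each site's conditional distribution is proportional to exp(J x neighbour count per state), so one colour is resampled from a per-site categorical while the other colour is held fixed, and then the colours swap.

```python
def gibbs_sweep(key, state, mask, J):
    """One full sweep: resample the True-coloured sites, then the False-coloured ones."""
    def half_sweep(key, state, colour):
        one_hot = nn.one_hot(state, K)
        # counts[i, j, k] = number of neighbours of (i, j) in state k
        counts = jnp.stack(
            [convolve2d(one_hot[..., k], kernel, mode="same") for k in range(K)],
            axis=-1)
        proposal = jax.random.categorical(key, J * counts, axis=-1)
        return jnp.where(colour, proposal, state)

    key1, key2 = jax.random.split(key)
    state = half_sweep(key1, state, mask)
    state = half_sweep(key2, state, ~mask)
    return state
```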
In [17]:
Out[17]:
100%|██████████| 8000/8000 [01:15<00:00, 105.39it/s]
100%|██████████| 8000/8000 [01:15<00:00, 106.53it/s]
100%|██████████| 8000/8000 [01:14<00:00, 107.59it/s]
100%|██████████| 3/3 [03:45<00:00, 75.14s/it]
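The progress bars above show three runs of 8000 sweeps inside an outer loop of length 3, consistent with sampling at several coupling strengths and plotting the final grids. A hedged sketch of such a loop, reusing the names from the sketches above (the J values are assumptions, chosen around the q = 10 critical coupling log(1 + sqrt(10)) ~ 1.43):

```python
for J in tqdm([1.40, 1.43, 1.46]):                    # assumed coupling strengths
    key = jax.random.PRNGKey(0)
    state = jax.random.randint(key, (ix, iy), 0, K)   # random initial grid
    for _ in tqdm(range(8000)):
        key, subkey = jax.random.split(key)
        state = gibbs_sweep(subkey, state, mask, J)
    plt.figure()
    plt.imshow(state)
    plt.title(f"J = {J}")
plt.show()
```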