Kernel: R (R-Project)

RStan in CoCalc (Kernel: "R-Project")

https://mc-stan.org/users/interfaces/rstan

This notebook works through https://github.com/stan-dev/rstan/wiki/RStan-Getting-Started with slight changes for CoCalc.

Be aware that the initial compilation takes about 2.5 GB of RAM!

# switch to PNG images, because the default of SVG produces very large files
options(jupyter.plot_mimetypes = c('image/png'))
pkgbuild::has_build_tools(debug = TRUE)
Trying to compile a simple C file
Running /usr/lib/R/bin/R CMD SHLIB foo.c
gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -fpic -g -O2 -fdebug-prefix-map=/build/r-base-AitvI6/r-base-3.4.4=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -g -c foo.c -o foo.o
g++ -shared -L/usr/lib/R/lib -Wl,-Bsymbolic-functions -Wl,-z,relro -o foo.so foo.o -L/usr/lib/R/lib -lR
TRUE
packageDescription("rstan")$Version
'2.18.2'
require(rstan)
Loading required package: rstan
Loading required package: ggplot2
Loading required package: StanHeaders
rstan (Version 2.18.2, GitRev: 2e1f913d3ca3)
For execution on a local, multicore CPU with excess RAM we recommend calling
options(mc.cores = parallel::detectCores()).
To avoid recompilation of unchanged Stan programs, we recommend calling
rstan_options(auto_write = TRUE)
# make sure to only use one core!
options(mc.cores = 1)
rstan_options(auto_write = TRUE)
schools8 <- "
data {
  int<lower=0> J;           // number of schools
  real y[J];                // estimated treatment effects
  real<lower=0> sigma[J];   // standard error of effect estimates
}
parameters {
  real mu;                  // population treatment effect
  real<lower=0> tau;        // standard deviation in treatment effects
  vector[J] eta;            // unscaled deviation from mu by school
}
transformed parameters {
  vector[J] theta = mu + tau * eta;   // school treatment effects
}
model {
  target += normal_lpdf(eta | 0, 1);       // prior log-density
  target += normal_lpdf(y | theta, sigma); // log-likelihood
}"
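The model uses a non-centered parameterization: instead of sampling each school effect theta directly, it samples standard-normal eta and sets theta = mu + tau * eta, which is easier for NUTS to explore when tau is small. A quick base-R sketch (an addition to the notebook; the values of mu and tau are hypothetical) illustrating that the two formulations describe the same distribution:

```r
# Non-centered parameterization: theta = mu + tau * eta with eta ~ N(0, 1)
# is distributionally identical to theta ~ N(mu, tau).
set.seed(42)
mu  <- 8   # hypothetical population effect
tau <- 6   # hypothetical between-school sd
eta <- rnorm(1e5)        # unscaled deviations
theta <- mu + tau * eta  # implied school effects
# the sample mean and sd of theta should be close to mu and tau
mean(theta)
sd(theta)
```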
schools_dat <- list(
  J = 8,
  y = c(28, 8, -3, 7, -1, 1, 18, 12),
  sigma = c(15, 10, 16, 11, 9, 11, 10, 18)
)
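As a quick sanity check on the data (an addition, not part of the original walkthrough), the classical complete-pooling estimate is the inverse-variance weighted mean of the eight effects; it lands close to the posterior mean of mu that the fit reports:

```r
# Complete-pooling estimate: weight each school by its precision 1/sigma^2.
schools_dat <- list(
  J = 8,
  y = c(28, 8, -3, 7, -1, 1, 18, 12),
  sigma = c(15, 10, 16, 11, 9, 11, 10, 18)
)
w <- 1 / schools_dat$sigma^2            # precision weights
pooled <- sum(w * schools_dat$y) / sum(w)
round(pooled, 1)                        # 7.7
```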
fit <- stan(model_code = schools8, data = schools_dat)
SAMPLING FOR MODEL '77851864eaef144f4a34884224755b9c' NOW (CHAIN 1).
Chain 1: Gradient evaluation took 1.3e-05 seconds
Chain 1: 1000 transitions using 10 leapfrog steps per transition would take 0.13 seconds.
Chain 1: Adjust your expectations accordingly!
Chain 1: Iteration:    1 / 2000 [  0%]  (Warmup)
Chain 1: Iteration:  200 / 2000 [ 10%]  (Warmup)
Chain 1: Iteration:  400 / 2000 [ 20%]  (Warmup)
Chain 1: Iteration:  600 / 2000 [ 30%]  (Warmup)
Chain 1: Iteration:  800 / 2000 [ 40%]  (Warmup)
Chain 1: Iteration: 1000 / 2000 [ 50%]  (Warmup)
Chain 1: Iteration: 1001 / 2000 [ 50%]  (Sampling)
Chain 1: Iteration: 1200 / 2000 [ 60%]  (Sampling)
Chain 1: Iteration: 1400 / 2000 [ 70%]  (Sampling)
Chain 1: Iteration: 1600 / 2000 [ 80%]  (Sampling)
Chain 1: Iteration: 1800 / 2000 [ 90%]  (Sampling)
Chain 1: Iteration: 2000 / 2000 [100%]  (Sampling)
Chain 1: Elapsed Time: 0.039216 seconds (Warm-up)
Chain 1:               0.038589 seconds (Sampling)
Chain 1:               0.077805 seconds (Total)

Chains 2–4 print the same progress messages; their elapsed totals were 0.085651 s, 0.087567 s, and 0.097583 s respectively.
Warning message:
“There were 3 divergent transitions after warmup. Increasing adapt_delta above 0.8 may help. See http://mc-stan.org/misc/warnings.html#divergent-transitions-after-warmup”
Warning message:
“Examine the pairs() plot to diagnose sampling problems”
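As the warning suggests, the few post-warmup divergences can usually be removed by raising adapt_delta above its 0.8 default through stan()'s control argument, at the cost of smaller steps and somewhat slower sampling. A sketch (the stan() call is commented out so the model is not recompiled here):

```r
# Raising adapt_delta targets a higher acceptance rate during adaptation,
# which shrinks the step size and typically eliminates divergences.
ctrl <- list(adapt_delta = 0.95)
# fit2 <- stan(model_code = schools8, data = schools_dat, control = ctrl)
ctrl$adapt_delta
```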
print(fit)
Inference for Stan model: 77851864eaef144f4a34884224755b9c.
4 chains, each with iter=2000; warmup=1000; thin=1;
post-warmup draws per chain=1000, total post-warmup draws=4000.

           mean se_mean   sd   2.5%    25%    50%    75%  97.5% n_eff Rhat
mu         7.86    0.12 5.11  -2.45   4.61   7.90  11.15  17.70  1857    1
tau        6.54    0.13 5.38   0.29   2.44   5.31   9.20  20.25  1599    1
eta[1]     0.40    0.01 0.95  -1.51  -0.24   0.43   1.05   2.18  4156    1
eta[2]     0.01    0.01 0.86  -1.69  -0.57   0.00   0.59   1.77  4138    1
eta[3]    -0.21    0.01 0.92  -2.05  -0.82  -0.21   0.40   1.62  5337    1
eta[4]    -0.01    0.01 0.90  -1.78  -0.59   0.00   0.56   1.78  4212    1
eta[5]    -0.35    0.01 0.88  -2.07  -0.94  -0.35   0.23   1.41  4105    1
eta[6]    -0.20    0.01 0.87  -1.84  -0.78  -0.21   0.37   1.58  4185    1
eta[7]     0.33    0.01 0.87  -1.46  -0.22   0.34   0.92   2.00  4145    1
eta[8]     0.06    0.01 0.92  -1.79  -0.56   0.07   0.69   1.81  4672    1
theta[1]  11.53    0.15 8.29  -2.12   6.08  10.50  15.77  31.26  3107    1
theta[2]   7.91    0.09 6.24  -4.38   4.03   7.78  11.76  20.43  4724    1
theta[3]   6.14    0.12 7.77 -11.13   2.17   6.57  10.92  20.15  3935    1
theta[4]   7.68    0.09 6.57  -5.30   3.71   7.67  11.84  20.92  5068    1
theta[5]   5.18    0.10 6.48  -8.82   1.20   5.63   9.52  16.46  3825    1
theta[6]   6.16    0.10 6.54  -7.89   2.27   6.53  10.52  18.08  4642    1
theta[7]  10.59    0.11 6.74  -1.56   6.09   9.93  14.54  25.98  3714    1
theta[8]   8.37    0.13 7.94  -7.34   3.71   8.28  12.72  25.05  3846    1
lp__     -39.49    0.08 2.61 -45.37 -41.08 -39.26 -37.61 -35.10  1208    1

Samples were drawn using NUTS(diag_e) at Thu Jan 10 17:15:29 2019.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at
convergence, Rhat=1).
plot(fit)
'pars' not specified. Showing first 10 parameters by default. ci_level: 0.8 (80% intervals) outer_level: 0.95 (95% intervals)
[Plot: posterior intervals for the first 10 parameters]
pairs(fit, pars = c("mu", "tau", "lp__"))
[Plot: pairs() of mu, tau, and lp__]
### return an array of three dimensions: iterations, chains, parameters
a <- extract(fit, permuted = FALSE)
a[1:10]
  1. 5.39618350392046
  2. 0.686046105579853
  3. 12.2207867201285
  4. -0.178791454000923
  5. 11.0161502498223
  6. 5.34968907655748
  7. 6.93197826928302
  8. 5.68811738463849
  9. 10.7213844859233
  10. 1.68269235068929
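With permuted = FALSE, extract() returns an iterations × chains × parameters array, and R stores arrays column-major, so linear indexing like a[1:10] walks down the first dimension first: the ten numbers above are the first ten post-warmup draws of mu from chain 1. A small base-R sketch (toy data, not the actual fit) showing the indexing behaviour:

```r
# R arrays are column-major, so a[1:k] on an iterations x chains x parameters
# array yields the first k iterations of chain 1 for the first parameter.
toy <- array(seq_len(5 * 4 * 3), dim = c(5, 4, 3),
             dimnames = list(NULL, paste0("chain:", 1:4), c("mu", "tau", "lp__")))
toy[1:5]                           # all of chain 1's "mu" column
all(toy[1:5] == toy[, 1, "mu"])    # TRUE
```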
### use S3 functions on stanfit objects
a2 <- as.array(fit)
m <- as.matrix(fit)
d <- as.data.frame(fit)
d[1:10]
mu          tau         eta[1]      eta[2]       eta[3]       eta[4]       eta[5]       eta[6]       eta[7]      eta[8]
5.3961835 2.3685829 -0.5567546 0.56348542 2.01522095 0.169342623 0.50181571 -0.208145098 -1.1217359 1.91946027
0.6860461 4.1367774 -1.0802441 0.42963362 1.99876081 0.887993281 1.20276288 -0.240802263 -0.9450431 2.28220115
12.2207867 0.5989557 -1.3517758 -1.50313373 -1.26931261 0.090716307 -0.95829548 0.395428980 -0.1213342 -1.29870404
-0.1787915 1.2749352 -0.3583841 -0.59696618 0.99716760 -0.890219718 -0.92933774 -1.007676229 -0.7057188 0.57632729
11.0161502 1.7843548 -0.2232293 0.04109137 0.45007966 -0.438513460 -1.30454790 -1.048199309 -0.4806426 -0.64408486
5.3496891 4.6905826 -0.9736077 0.54301954 0.86772042 0.500115637 -0.45801685 -1.452564628 2.3787168 0.17005574
6.9319783 5.3027711 1.3647102 0.08618644 -2.15494141 -1.745484748 0.68330836 0.570792391 -0.3023055 0.43175128
5.6881174 1.6164703 -1.8529150 0.21515663 0.91532076 0.189292082 -0.14087814 -0.067037906 -0.1208038 0.62964552
10.7213845 8.9996004 -1.9365577 0.13085916 1.77016072 -0.003299785 -0.86071803 0.473711690 0.2658253 1.45396565
1.6826924 0.1691010 1.1240211 0.39374889 0.42215091 -0.482868850 -1.04063432 0.049669334 -1.4084845 0.45945727
14.7804757 3.8119272 -0.6679941 -0.05020627 -0.13868226 1.116695069 0.01027008 -0.685884853 0.6550268 0.23060747
10.1087903 2.4634371 -2.5762966 -0.68724688 -1.86960612 -0.279329749 -1.26457829 -2.805158102 0.1750718 -0.09148728
24.0811459 1.8356451 -2.9145325 1.11611710 -0.94643350 0.319774308 1.34581207 0.165335433 -0.2260273 1.09373800
-7.4851807 6.6397290 3.3438356 -1.06011093 0.87519429 0.473639239 -1.70560951 -0.529120226 0.4812363 -1.04239608
-0.3013699 13.9068516 2.3604325 0.85464726 -0.22416280 0.651395582 -0.30136386 0.343451168 2.0920786 0.78442915
11.4391144 11.6837877 -1.3756533 -0.70986594 -0.54579781 -0.644570508 -0.48813750 0.009779451 -0.1877553 0.52216926
13.4169475 14.3643945 1.8847251 -0.18508920 0.17401990 0.182930297 -1.17503408 -1.308482204 0.7070368 -0.13166273
8.7048968 6.7834231 0.4628655 -0.08120005 -0.21238590 0.162767090 -0.95412743 -0.427018160 -1.0112350 -1.44880478
6.6540698 4.9487404 1.3854939 0.96121890 -0.26556847 0.195825372 0.49529749 0.328893927 1.0500572 1.55579236
-2.5387590 16.4575783 0.3717619 0.50999642 0.96570505 -0.157000076 0.47046243 -0.050642907 1.3414407 1.46431963
-2.5387590 16.4575783 0.3717619 0.50999642 0.96570505 -0.157000076 0.47046243 -0.050642907 1.3414407 1.46431963
16.3980514 13.5989202 1.4541320 -0.39967342 -1.20405390 0.098571305 -1.23094236 -0.824934046 -0.1621034 -1.10090304
14.4847117 13.7989773 0.2092602 0.12059297 -1.49481657 -0.689796320 -0.37982431 -0.958498537 -0.3457315 -0.13843692
12.3469263 2.0297331 0.4319855 -1.19744527 0.24561412 -0.072166641 -1.15542862 0.548836712 0.6966082 0.01814399
12.4287394 0.9902857 1.1037543 -1.27847109 -0.43530026 0.494160859 0.26845793 0.157840373 0.6658382 0.60015112
10.6641490 2.5483680 -0.7839886 -0.77121420 -1.35230236 0.549695398 -0.04613958 -1.733963274 0.2421487 0.28274715
10.6199698 0.2835570 -0.2917623 0.19324122 -0.46244387 0.782276174 0.25997656 -0.539997086 0.4743405 -0.29992826
8.8833887 0.5482013 -0.1314144 -0.48854279 -0.34184416 -0.112882179 1.03627536 0.698999798 1.2983051 -0.47556639
3.3082875 1.5641125 0.4397713 -0.21005412 0.99517117 -1.022978212 -0.55439391 1.317178312 -0.1755756 -0.26944904
7.6743617 7.6847889 0.6286669 -0.77196167 0.06247897 -1.503346589 -0.66778958 0.600880524 1.4895017 0.61987915
9.886547 8.7464343 1.8607923 -0.155244942 -1.92769611 -1.1736117 -1.05415111 0.86365921 -0.78806516 0.36300414
11.364010 4.7500489 -1.1666333 0.007529395 -0.61787671 -1.4561706 -1.60222744 -0.10814093 -0.85543032 0.83673123
12.922030 2.4673147 -0.8138930 -0.638404194 -1.16309929 -1.3853016 -2.07147697 -0.11207489 -0.99742529 0.98280938
9.004027 3.4505590 -0.5446533 0.185183857 0.19112569 -2.2194860 -0.16581317 1.00831837 -0.60580365 1.42806526
8.532313 3.6696136 -1.2224762 -0.485543666 -1.25071786 -1.1871044 0.06247177 0.94131544 0.10498538 0.77952432
9.308727 4.3011331 0.7358349 -0.431700486 0.32954572 -0.2472215 -1.04841285 -0.64676165 -1.22708457 -0.33212684
15.159718 2.7690557 1.0476665 -0.264712384 -0.50345320 -1.4397583 -1.53447871 -0.82564645 1.45288789 0.07194064
14.807279 5.5387126 0.2781911 -0.232648400 -1.20349057 -1.2788765 -2.20007530 -0.74980024 1.75414821 -0.21797300
14.354266 1.4427400 0.4608208 -0.698071530 0.05452116 0.4823497 0.13507947 -0.36418955 -2.02677862 -0.07404370
8.731571 1.8346451 0.3800160 -0.581235586 -0.32387452 1.1167866 0.58360983 -1.26223659 -1.85069180 -0.23949578
8.141408 13.5718821 0.9911613 0.479031309 -0.13500755 0.3051145 -1.19999842 0.33299510 2.28823572 -0.68887381
5.223351 9.4409972 1.1392715 0.638647869 -0.76410606 0.1237237 -0.90769248 -0.13822845 1.41619995 -0.74254328
4.175857 9.7919858 1.6459312 -0.225571376 1.23100186 0.7472079 0.46678288 -0.61453897 -0.04342235 0.19795917
13.516229 2.4123927 -0.5653916 -0.506821106 -0.86345996 -1.4461655 -0.37638387 0.25572666 0.27263606 0.12147005
8.195783 0.4985395 -0.1596514 -0.227068158 -2.20298982 -1.5556816 -0.50401496 -0.82348689 -1.69214290 1.67993505
5.724590 0.5074447 0.1463413 0.777902171 -0.88910865 -0.5522545 -0.03084518 0.25136057 -0.54789818 2.38617182
7.189569 12.4709405 -0.2319947 0.997506062 0.11218081 -1.4477460 -0.84611043 -0.04976682 0.60869760 -0.30506043
11.821388 5.0489694 1.5068324 -1.589650827 -0.85868506 1.7814902 -0.22197141 -0.67997223 -0.01054010 0.72218861
6.991630 6.1293814 -0.1149275 -0.620351195 -0.78686414 -1.5704500 -0.77792137 0.99757089 0.92422567 -0.90226504
8.394466 1.9009057 0.2836753 1.113406309 0.63902720 1.5701933 0.35866455 -1.35776173 -0.36234836 -0.04246596
8.227097 13.8285292 -0.1128386 -0.864252583 -1.04510557 -1.2310453 -0.32135086 -0.18709069 0.45175475 -1.08077260
4.756219 4.8510982 1.0809501 1.075581321 0.47707824 1.5348927 -0.59170947 -0.44832337 0.60559299 1.37185410
9.125994 4.6916668 0.7902582 -0.360020171 -1.13269192 -0.9154454 -0.19690435 -0.42695352 1.23209184 0.68186416
10.401728 8.5582942 0.9907257 0.316102757 -1.12876402 0.4033975 -0.64321177 -0.50959635 -0.44911528 -0.39420160
11.755744 19.4218130 -0.2288889 -0.470286217 -0.53298429 0.2498006 -1.07497742 -0.05123380 0.52598584 0.75638694
8.386704 24.7611511 1.2541720 0.265327917 0.03457989 -0.4722916 0.19660066 -0.45530531 0.35719173 -0.63764690
5.201969 5.9500500 0.7144015 -0.497214555 -0.56814769 0.3120196 -1.06307262 0.06060043 0.18038785 -1.69056192
11.117786 6.2049786 0.5332541 0.610885496 -0.25031408 0.0660660 0.11059952 -0.89447537 0.68927347 1.31855236
7.675931 17.1808587 0.5752803 -0.043668107 -0.20804228 -0.3081718 -0.81270383 0.24382377 -0.15226967 -1.74742574
9.331831 7.7080905 0.9305285 -0.810825690 -1.19654748 0.2710335 -0.11800252 -0.94605939 0.96818966 0.87161380