I’ve been learning how to use JAGS (“Just Another Gibbs Sampler”) for Bayesian hierarchical modeling, inspired by the great teaching of John Kruschke, Peter Congdon, Andrew Gelman, and many others. So I set out to solve a problem with it.
The user Henrik on Wikipedia is currently caretaker (“gardener”) of a statistics database of Wikipedia accesses. I sought to find out how many times Wikipedia had been queried for the keyword “soccer”, obtained from that database the time series depicted in Figure A, below, and pulled it into R, where it looks like Figure B.
The counts are daily accesses, and there is clearly a weekly pattern, although 7th December 2013 shows a clear spike. The problem I wanted to solve was to classify the days within each week as High or Low, that is, to associate each day’s count with one of two clusters. Labeling days with such a pattern may be useful in different ways. Identifying weekends and holidays, for example, might indicate a cultural preference which, in combination with other evidence, could be used to constrain the location of the behavior.
In addition, these counts are positive and discrete-valued, so Gaussian approximations are probably not a good idea. Poisson approximations probably aren’t a good idea either, since the data are overdispersed, having a mean of 626.8 and a variance of 10423; a Poisson random variable has mean and variance equal. It is quite typical for counts to violate the Poisson assumption in this overdispersed way, although it is also possible for them to be underdispersed.
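For anyone reproducing the dispersion check, here is a minimal sketch in R; the vector name counts is my placeholder for the series of daily access counts, not an object from the listing further below.

# Overdispersion check on the daily access counts; 'counts' is a
# hypothetical numeric vector holding the series plotted in Figure B.
m <- mean(counts)   # about 626.8 here
v <- var(counts)    # about 10423 here
v / m               # approximately 1 for a Poisson; much larger here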
Instead of a Poisson, I chose to use a Negative Binomial (“NB”), and took advice from John Kruschke on parameterization. Not only is a probability of incidence a parameter, but so is the “number of successes” (or “failures”) in the usual conceptual setup for the NB. In fact, there is a parallel between the NB and the Binomial distribution, except that in the former the number of trials is itself, in effect, a random variable. So there are two parameters when modeling with the NB, and that, in short, is why it can be used to model overdispersion.
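To see why the two-parameter NB can absorb the extra variance, here is a minimal simulation sketch using R’s (size, prob) parameterization; the particular values of size and prob are illustrative only, not estimates from the data.

# A quick illustration (not fitted values): under R's parameterization,
# a Negative Binomial has mean = size*(1-prob)/prob and
# variance = size*(1-prob)/prob^2, so the variance always exceeds the mean.
size <- 600
prob <- 0.5
x <- rnbinom(100000, size=size, prob=prob)
c(theoretical.mean=size*(1-prob)/prob,
  theoretical.var =size*(1-prob)/prob^2,
  sample.mean=mean(x), sample.var=var(x))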
The hierarchical model I used seems straightforward, apart from experimenting with hyperprior constants. The full code is provided below, but I highlight the BUGS model here:
#------------------------------------------------------------------------------
# THE MODEL.
# var n.d[8], y[7,8], p.d[7, 8], s.d[7,8], b.hi, b.lo;
modelstring<- "
model{
  # b.lo ~ dbeta(1,8)
  # b.hi ~ dbeta(8,1)
  b.lo ~ dbeta(1,12)
  b.hi ~ dbeta(12,1)
  for ( i in 1 : 8 ){
    sqrt.lambda.w[i]<- sqrt(lambda.w[i])
    inv.sqrt.lambda.w[i]<- 1/sqrt.lambda.w[i]
    n.d[i] ~ dgamma( sqrt.lambda.w[i], inv.sqrt.lambda.w[i] )
    for ( j in 1 : 7 ){
      s.d[j, i] ~ dbern(0.714285)
      p.d[j, i] <- s.d[j, i]*b.hi + (1 - s.d[j, i])*b.lo
      # JAGS definition of the Negative Binomial uses a 1-p
      # compared to common convention, e.g.,
      # http://en.wikipedia.org/wiki/Negative_binomial#Definition
      y[j, i] ~ dnegbin( (1 - p.d[j, i]), n.d[i] )
    }
  }
  #inits# .RNG.name, .RNG.seed
  #data# y, lambda.w
  #parameters to 'checkpoint.data'#
}
" # close quote for modelstring
#------------------------------------------------------------------------------
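As a quick sanity check on that 1 - p convention, one can simulate the likelihood piece in plain R. This sketch assumes R’s rnbinom(n, size, prob) shares the success-probability convention of JAGS’s dnegbin(p, r); the values of n.week and p.day are illustrative stand-ins for n.d[i] and p.d[j, i], not posterior estimates.

# Illustrative prior-predictive check of the Negative Binomial piece.
n.week <- 600     # stand-in for n.d[i]
p.day  <- 0.54    # stand-in for p.d[j, i] on a "High" day
y.sim  <- rnbinom(10000, size=n.week, prob=1 - p.day)  # note the 1 - p, as in the model
c(mean=mean(y.sim), var=var(y.sim))                    # variance well above the mean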
Days are modelled as days within weeks, so y[j, i] denotes the count of accesses on day j of week i. The same kind of indexing applies to the random variables s.d and p.d. The random variable n.d, describing the “number of successes” of the NB for a given week, is indexed just by [i]. Its hyperprior is a broad Gamma density whose parameters are derived from lambda.w[i], the average number of accesses per day during that week.
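For concreteness, here is a small sketch of what that Gamma hyperprior implies, using JAGS’s dgamma(shape, rate) convention and a week whose average is roughly the overall mean of about 627 accesses per day; the value of lambda is illustrative.

# Implied prior moments for n.d[i] when shape = sqrt(lambda) and rate = 1/sqrt(lambda).
lambda <- 627                        # illustrative, close to the overall daily mean
shape  <- sqrt(lambda)
rate   <- 1/sqrt(lambda)
c(prior.mean = shape/rate,           # equals lambda
  prior.sd   = sqrt(shape/rate^2))   # equals lambda^(3/4), about 125 here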
The basic assessment of High versus Low is made in terms of the “success probability” for the day, p.d[j, i]. Its prior is a mixture of two Beta densities, a Beta(12,1) and a Beta(1,12), with the mixing done by a Bernoulli hyperprior, s.d[j, i], whose hyperparameter is just 5/7, expressing the expectation that five days of the week will be “higher” in activity than the other two. This could be wrong; it might be, for instance, that five days out of seven are lower than the other two.
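A minimal sketch of one prior draw from that mixture, mirroring the model above; the draws are purely illustrative.

# One prior draw for a week's worth of p.d[j, i]: shared Beta draws for the
# Low and High levels, and a Bernoulli(5/7) switch for each day.
set.seed(1)
b.lo <- rbeta(1, 1, 12)
b.hi <- rbeta(1, 12, 1)
s    <- rbinom(7, size=1, prob=5/7)   # 1 = "High" day, 0 = "Low" day
p    <- s*b.hi + (1 - s)*b.lo
round(rbind(s=s, p=p), 3)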
JAGS is run using the runjags package. I ran 5 chains, each with 50000 burn-in steps, a sample of 50000, thinning of 10, and “adapt” set to 1000. (It’s all in the code, but summarise was TRUE, confidence was 0.95, psrf.target was 1.05, normalise.mcmc was TRUE, check.stochastic was TRUE, check.conv was TRUE, and the method was “parallel”, running on a Win7 machine with 4 cores.)
The results for p.d[.,3] are shown below, in Figure C, intended to be typical and representative of the results from the other weeks:
If we adopt the rule that, to make a fair call of a Low or High day, the median of p.d[j, i], surrounded by its 90% ROPE, must exclude 0.5, the classification obtained from this analysis is depicted in Figure D, below:
Question marks indicate days where the ROPE includes 0.5. Because whole weeks are needed for analysis, only the first 56 days of those shown in Figures A and B are marked.
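In code, the rule can be expressed as a small helper. This is a sketch under the assumption that the 5% and 95% quantiles of p.d[j, i], which bound the 90% interval used as the ROPE here, have already been extracted; the function name classify.day is mine, not from the listing below.

# Hypothetical helper implementing the classification rule from the text.
classify.day <- function(q05, q95, threshold=0.5)
{
  if (q95 < threshold) "L"       # 90% interval strictly below 0.5: Low day
  else if (q05 > threshold) "H"  # 90% interval strictly above 0.5: High day
  else "?"                       # interval straddles 0.5: no call
}
classify.day(0.437, 0.498)  # "L", roughly p.d[1,1] below
classify.day(0.511, 0.569)  # "H"
classify.day(0.447, 0.559)  # "?"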
What’s neat about the Bayesian approach to the problem is that questionable classifications are easily seen in the statistics of the parameters, here the estimates of p.d[j, i]. One unusual feature of Figure D is the classification of the days in Week 7, where one dominating day captures all of the High mass and the remaining days are relegated to Lows. There might be an alternative model which “borrows strength” (to use Tukey’s term for Bayesian models) from the other weeks and does a better job on Week 7, but to the degree each week is judged on its own, the classification performance here is defensible, in my opinion.
The listing of the R code used to do this is shown below:
# Modified empirical_bayesian@ieee.org, Jan Galkowski, 19th December 2013.
# Last changed on 24th December 2013.

graphics.off()
rm(list=ls(all=TRUE))

# Randomize the seed
wonkyRandom<- sample.int(2000000, size=sample.int(200000, size=1), replace=TRUE)
rm(list=c("wonkyRandom"))

fileNameRoot="DaysOfWeekHierarchialModelJags" # for constructing output filenames

constructFilenameFrom<- function(root, suffix=".pdf")
{
  stamp<- Sys.time()
  stamp<- gsub(x=stamp, pattern="(-|:)", replacement="", fixed=FALSE)
  stamp<- gsub(x=stamp, pattern=" ", replacement="-", fixed=TRUE)
  return(sprintf("%s-%s%s", root, stamp, suffix))
}

#source("openGraphSaveGraph.R")
source("plotPost.R")

pause.with.message<- function (message)
{
  # Pause with a message between plots
  cat(message)
  cat("\n")
  cat("Paused. Press <Enter> to continue...")
  readline()
  invisible()
}

require(rjags)   # Kruschke, J. K. (2011). Doing Bayesian Data Analysis:
                 # A Tutorial with R and BUGS. Academic Press / Elsevier.
require(runjags)
require(coda)

#------------------------------------------------------------------------------
# THE MODEL.
# var n.d[8], y[7,8], p.d[7, 8], s.d[7,8], b.hi, b.lo;
modelstring<- "
model{
  # b.lo ~ dbeta(1,8)
  # b.hi ~ dbeta(8,1)
  b.lo ~ dbeta(1,12)
  b.hi ~ dbeta(12,1)
  for ( i in 1 : 8 ){
    sqrt.lambda.w[i]<- sqrt(lambda.w[i])
    inv.sqrt.lambda.w[i]<- 1/sqrt.lambda.w[i]
    n.d[i] ~ dgamma( sqrt.lambda.w[i], inv.sqrt.lambda.w[i] )
    for ( j in 1 : 7 ){
      s.d[j, i] ~ dbern(0.714285)
      p.d[j, i] <- s.d[j, i]*b.hi + (1 - s.d[j, i])*b.lo
      # JAGS definition of the Negative Binomial uses a 1-p
      # compared to common convention, e.g.,
      # http://en.wikipedia.org/wiki/Negative_binomial#Definition
      y[j, i] ~ dnegbin( (1 - p.d[j, i]), n.d[i] )
    }
  }
  #inits# .RNG.name, .RNG.seed
  #data# y, lambda.w
  #parameters to 'checkpoint.data'#
}
" # close quote for modelstring
#------------------------------------------------------------------------------
# THE DATA.

# Load the soccer data:
load( "soccer60.Rdata" ) # loads dataMat

# Specify data, as a list.
orderedByWeeks<- matrix(soccer56[,"count"], nrow=8, ncol=7, byrow=TRUE)
# Data must be column major order
dataList = list(y=t(orderedByWeeks), lambda.w=as.vector(rowMeans(orderedByWeeks)))

#------------------------------------------------------------------------------
# INITIALIZE THE CHAINS.

.RNG.name<- function(chain) { return("base::Mersenne-Twister") }
.RNG.seed<- function(chain) { return( sample.int(2000000, size=1) ) }

# Run the model and produce plots
results<- run.jags(model=modelstring,
                   # monitor=c("deviance", "pd", "pd.i", "popt", "dic", "n.d", "p.d"), #Only accessible when method="interruptible"
                   monitor=c("n.d", "p.d"),
                   silent.jags=FALSE,
                   datalist=dataList,
                   n.chains=5,
                   inits=NA,
                   burnin=50000,
                   sample=50000,
                   adapt=1000,
                   thin=10,
                   summarise=TRUE,
                   confidence=0.95,
                   plots=TRUE,
                   psrf.target=1.05,
                   normalise.mcmc=TRUE,
                   check.stochastic=TRUE,
                   check.conv=TRUE,
                   keep.jags.files=TRUE,
                   jags.refresh=0.1,
                   batch.jags=FALSE,
                   method="parallel")

#------------------------------------------------------------------------------
# EXAMINE THE RESULTS

print(results)

cat("##################\n")
cat("# Summary #\n")
cat("##################\n")
print(summary(object=as.mcmc(results), quantiles=c(0.025, 0.05, 0.075, 0.50, 0.925, 0.95, 0.975)))
cat("##################\n")
cat("# End of Summary #\n")
cat("##################\n")

plot(x=results, type="all", file=constructFilenameFrom(root=fileNameRoot, suffix=".pdf"))
#------------------------------------------------------------------------------
The console trace of one run of this code is included below:
R version 3.0.2 (2013-09-25) -- "Frisbee Sailing"
Copyright (C) 2013 The R Foundation for Statistical Computing
Platform: x86_64-w64-mingw32/x64 (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

Welcome at Tue Dec 24 11:40:03 2013
> source("DaysOfWeekHierarchicalModel.R")
Loading required package: rjags
Loading required package: coda
Loading required package: lattice
Linked to JAGS 3.4.0
Loaded modules: basemod,bugs
Loading required package: runjags
Calling the simulation using the parallel method...
Following the progress of chain 1 (the program will wait for all chains to finish before continuing):
Welcome to JAGS 3.4.0 on Tue Dec 24 11:48:02 2013
JAGS is free software and comes with ABSOLUTELY NO WARRANTY
Loading module: basemod: ok
Loading module: bugs: ok
. . Reading data file data.txt
. Compiling model graph
   Resolving undeclared variables
   Allocating nodes
   Graph Size: 429
. Reading parameter file inits1.txt
. Reading parameter file inits5.txt
. Initializing model
. Adapting 1000
-------------------------------------------------| 1000
++++++++++++++++++++++++++++++++++++++++++++++++++ 100%
Adaptation successful
. Updating 50000
-------------------------------------------------| 50000
************************************************** 100%
. . . Updating 500000
-------------------------------------------------| 500000
************************************************** 100%
. . . . Updating 0
. Deleting model
.
All chains have finished
Simulation complete. Reading coda files...
Coda files loaded successfully
JAGS files were saved to the 'runjagsfiles' folder in your current working directory
Calculating the Gelman-Rubin statistic for 64 variables....
Note: Unable to calculate the multivariate psrf due to an error calculating the Gelman-Rubin statistic
Convergence may have failed for this run for 6 parameters after 50000 iterations (Unable to calculate the multi-variate psrf)
Finished running the simulation

JAGS model summary statistics from 250000 samples (thin = 1 in 10; chains = 5; burnin = 50000):

          Lower95  Median Upper95    Mean       SD      MCerr MC%ofSD SSeff  AC.100   psrf
n.d[1]     529.86   616.4  709.22  617.71   45.783    0.47138       1  9433 0.45813 1.0027
n.d[2]     478.47  569.11  668.95  571.05   49.019    0.63974     1.3  5871 0.37981 1.1101
n.d[3]     502.98  589.63  681.54  590.82   45.701    0.46034       1  9856 0.41634 1.0091
n.d[4]     487.27  570.57  655.72  571.74    42.87    0.43702       1  9623 0.44581  1.001
n.d[5]     505.71  592.03  681.83  593.22   44.956    0.45389       1  9810 0.43916 1.0003
n.d[6]     452.37  531.34   614.1  532.34   41.242    0.40668       1 10284 0.42138 1.0006
n.d[7]     657.97  768.13  883.12   769.7   57.504    0.59813       1  9243  0.4564 1.0104
n.d[8]      489.4  570.56  656.14  571.77   42.602    0.43424       1  9625  0.4477 1.0007
p.d[1,1]  0.43032 0.46684 0.50293  0.4672 0.018753  0.0001948       1  9267 0.45751 1.0125
p.d[2,1]  0.50457 0.53977 0.57472 0.53986 0.017851  0.0001898     1.1  8846 0.48904 1.0011
p.d[3,1]  0.50457 0.53977 0.57472 0.53986 0.017849 0.00018979     1.1  8845 0.48901 1.0011
p.d[4,1]  0.50457 0.53977 0.57471 0.53986 0.017848 0.00018981     1.1  8842 0.48922 1.0012
p.d[5,1]  0.50457 0.53977 0.57471 0.53986 0.017848 0.00018981     1.1  8842 0.48922 1.0012
p.d[6,1]  0.50436 0.53973 0.57494 0.53975 0.018028 0.00018939     1.1  9061 0.47857 1.0008
p.d[7,1]  0.43142 0.46677 0.50303 0.46696 0.018252 0.00019463     1.1  8794 0.47975 1.0147
p.d[1,2]  0.43641 0.48409 0.56488 0.49536 0.038537 0.00071116     1.8  2936  0.1842 1.2396
p.d[2,2]  0.50122 0.53927 0.57745 0.53855 0.019934 0.00019061       1 10937 0.40468 1.0078
p.d[3,2]  0.50457 0.53977 0.57472 0.53986 0.017851 0.00018976     1.1  8849 0.48903 1.0011
p.d[4,2]  0.50459 0.53977 0.57473 0.53986 0.017849 0.00018979     1.1  8844 0.48915 1.0011
p.d[5,2]  0.45108 0.53292 0.57409 0.52439 0.033467  0.0005695     1.7  3454 0.23366 1.2476
p.d[6,2]  0.44068 0.51156 0.56908 0.50628  0.03912 0.00099994     2.6  1531 0.15895 1.4034
p.d[7,2]  0.43142 0.46677 0.50303 0.46696 0.018252 0.00019463     1.1  8794 0.47975 1.0147
p.d[1,3]  0.42841 0.46725 0.50644 0.46835 0.020755 0.00019726       1 11070  0.3836 1.0082
p.d[2,3]  0.43657 0.48512 0.56587 0.49624  0.03897 0.00049826     1.3  6117 0.12938 1.1239
p.d[3,3]  0.50465 0.53976  0.5749 0.53984 0.017886 0.00018875     1.1  8979 0.48645  1.001
p.d[4,3]  0.50457 0.53977 0.57471 0.53986 0.017848 0.00018981     1.1  8842 0.48922 1.0012
p.d[5,3]  0.50465 0.53976  0.5749 0.53984  0.01789 0.00018964     1.1  8899 0.48635  1.001
p.d[6,3]  0.46413 0.53683 0.57765 0.53251 0.027295 0.00025643     0.9 11331 0.26262 1.0724
p.d[7,3]  0.43142 0.46677 0.50303 0.46696 0.018252 0.00019463     1.1  8794 0.47975 1.0146
p.d[1,4]   0.4219 0.46818 0.52648 0.47114  0.02491 0.00019689     0.8 16007 0.29348 1.0119
p.d[2,4]  0.50347 0.53952 0.57613 0.53925  0.01881  0.0001879       1 10020 0.43617 1.0011
p.d[3,4]  0.50459 0.53977 0.57473 0.53986 0.017849 0.00018979     1.1  8845 0.48916 1.0011
p.d[4,4]  0.50457 0.53977 0.57471 0.53986 0.017848 0.00018981     1.1  8842 0.48922 1.0012
p.d[5,4]  0.50459 0.53964 0.57586 0.53957 0.018286 0.00018865       1  9396  0.4628 1.0006
p.d[6,4]  0.50436 0.53972 0.57494 0.53974 0.018023 0.00018948     1.1  9048 0.47751 1.0008
p.d[7,4]  0.43142 0.46677 0.50303 0.46696 0.018252 0.00019463     1.1  8794 0.47975 1.0147
p.d[1,5]  0.43131 0.46678 0.50306   0.467 0.018327 0.00019457     1.1  8872 0.47612 1.0142
p.d[2,5]  0.50457 0.53975 0.57489 0.53982  0.01792 0.00018897     1.1  8992 0.48459  1.001
p.d[3,5]  0.50459 0.53977 0.57473 0.53986 0.017849 0.00018978     1.1  8845 0.48912 1.0011
p.d[4,5]  0.50456 0.53977 0.57472 0.53986 0.017855 0.00018977     1.1  8852 0.48873 1.0011
p.d[5,5]  0.50457 0.53977 0.57471 0.53986 0.017848 0.00018981     1.1  8842 0.48922 1.0012
p.d[6,5]  0.43186  0.4731 0.55807 0.48205 0.034284 0.00026596     0.8 16616 0.17883 1.0439
p.d[7,5]  0.43142 0.46677 0.50303 0.46696 0.018252 0.00019463     1.1  8794 0.47975 1.0147
p.d[1,6]  0.42191   0.468 0.51924 0.47052  0.02396 0.00019399     0.8 15255 0.31821 1.0132
p.d[2,6]  0.50459 0.53977 0.57473 0.53986 0.017849  0.0001898     1.1  8843 0.48915 1.0011
p.d[3,6]  0.50438 0.53973 0.57487 0.53977  0.01799 0.00018952     1.1  9011 0.47982 1.0009
p.d[4,6]  0.50457 0.53975 0.57494  0.5398 0.017947 0.00018983     1.1  8938 0.48325  1.001
p.d[5,6]  0.43277 0.47409 0.56006 0.48393 0.035406 0.00029105     0.8 14799 0.16868 1.0495
p.d[6,6]  0.43122 0.46679 0.50308 0.46702 0.018374 0.00019456     1.1  8919 0.47426 1.0139
p.d[7,6]  0.43142 0.46678 0.50305 0.46697 0.018261 0.00019464     1.1  8803 0.47931 1.0146
p.d[1,7]  0.43142 0.46678 0.50303 0.46697 0.018254 0.00019462     1.1  8797 0.47958 1.0146
p.d[2,7]  0.43131 0.46678 0.50297 0.46698 0.018275 0.00019456     1.1  8823 0.47871 1.0145
p.d[3,7]  0.43131 0.46678 0.50297 0.46698 0.018275 0.00019455     1.1  8824 0.47873 1.0145
p.d[4,7]    0.431  0.4668 0.50303 0.46706 0.018456 0.00019467     1.1  8988 0.47039 1.0138
p.d[5,7]    0.431  0.4668   0.503 0.46705 0.018447 0.00019498     1.1  8951 0.47164 1.0138
p.d[6,7]  0.43131 0.46678 0.50297 0.46698 0.018276 0.00019457     1.1  8823 0.47872 1.0145
p.d[7,7]  0.50457 0.53977 0.57471 0.53986 0.017848 0.00018981     1.1  8842 0.48922 1.0012
p.d[1,8]  0.43131 0.46678 0.50298 0.46698 0.018286 0.00019461     1.1  8829 0.47793 1.0145
p.d[2,8]  0.50438 0.53977 0.57455 0.53985 0.017859 0.00018974     1.1  8860  0.4883 1.0011
p.d[3,8]  0.50438 0.53976 0.57457 0.53985 0.017867 0.00018978     1.1  8863 0.48791 1.0011
p.d[4,8]  0.50457 0.53977 0.57471 0.53986 0.017848 0.00018979     1.1  8844 0.48916 1.0011
p.d[5,8]  0.50457 0.53977 0.57471 0.53986 0.017848 0.00018981     1.1  8842 0.48922 1.0012
p.d[6,8]   0.4756 0.53793 0.58097 0.53506 0.024671 0.00020341     0.8 14710 0.28986 1.0336
p.d[7,8]  0.43142 0.46677 0.50303 0.46696 0.018252 0.00019463     1.1  8794 0.47972 1.0146

Total time taken: 17.5 minutes
##################
# Summary #
##################
Warning in as.mcmc.runjags(results) :
  Combining the 5 mcmc chains together

Iterations = 50001:2549991
Thinning interval = 10
Number of chains = 1
Sample size per chain = 250000
1. Empirical mean and standard deviation for each variable,
   plus standard error of the mean:

                 Mean           SD      Naive SE Time-series SE
n.d[1]   617.71205050 45.783069687 9.1566139e-02  0.47137944844
n.d[2]   571.05484255 49.018858049 9.8037716e-02  0.63973589884
n.d[3]   590.81855770 45.701017228 9.1402034e-02  0.46033694818
n.d[4]   571.74399475 42.869710137 8.5739420e-02  0.43701670594
n.d[5]   593.22395088 44.956246869 8.9912494e-02  0.45389175784
n.d[6]   532.34160933 41.242221919 8.2484444e-02  0.40668315052
n.d[7]   769.70144074 57.504119810 1.1500824e-01  0.59812850936
n.d[8]   571.77470993 42.601959069 8.5203918e-02  0.43424289632
p.d[1,1]   0.46720282  0.018752820 3.7505640e-05  0.00019480002
p.d[2,1]   0.53985976  0.017851148 3.5702296e-05  0.00018979738
p.d[3,1]   0.53985998  0.017849459 3.5698918e-05  0.00018979412
p.d[4,1]   0.53986189  0.017847924 3.5695849e-05  0.00018981070
p.d[5,1]   0.53986189  0.017847924 3.5695849e-05  0.00018981070
p.d[6,1]   0.53975409  0.018027913 3.6055827e-05  0.00018939030
p.d[7,1]   0.46696401  0.018251535 3.6503070e-05  0.00019462926
p.d[1,2]   0.49536080  0.038537401 7.7074802e-05  0.00071116354
p.d[2,2]   0.53854560  0.019933653 3.9867306e-05  0.00019060875
p.d[3,2]   0.53986051  0.017850835 3.5701671e-05  0.00018976340
p.d[4,2]   0.53986131  0.017848635 3.5697270e-05  0.00018978876
p.d[5,2]   0.52439428  0.033467374 6.6934747e-05  0.00056949527
p.d[6,2]   0.50627751  0.039119995 7.8239991e-05  0.00099993643
p.d[7,2]   0.46696401  0.018251535 3.6503070e-05  0.00019462926
p.d[1,3]   0.46835351  0.020754989 4.1509978e-05  0.00019726090
p.d[2,3]   0.49624322  0.038970368 7.7940737e-05  0.00049826091
p.d[3,3]   0.53983713  0.017885848 3.5771696e-05  0.00018875376
p.d[4,3]   0.53986189  0.017847924 3.5695849e-05  0.00018981070
p.d[5,3]   0.53983517  0.017889859 3.5779718e-05  0.00018964174
p.d[6,3]   0.53250748  0.027295372 5.4590744e-05  0.00025642579
p.d[7,3]   0.46696430  0.018252175 3.6504351e-05  0.00019462979
p.d[1,4]   0.47114186  0.024910435 4.9820870e-05  0.00019689151
p.d[2,4]   0.53925122  0.018809628 3.7619257e-05  0.00018790469
p.d[3,4]   0.53986054  0.017848683 3.5697365e-05  0.00018978575
p.d[4,4]   0.53986189  0.017847924 3.5695849e-05  0.00018981070
p.d[5,4]   0.53957319  0.018285885 3.6571770e-05  0.00018864635
p.d[6,4]   0.53974435  0.018023232 3.6046465e-05  0.00018947515
p.d[7,4]   0.46696401  0.018251535 3.6503070e-05  0.00019462926
p.d[1,5]   0.46700043  0.018326679 3.6653358e-05  0.00019457182
p.d[2,5]   0.53981686  0.017919721 3.5839442e-05  0.00018897144
p.d[3,5]   0.53986096  0.017849064 3.5698129e-05  0.00018978341
p.d[4,5]   0.53985749  0.017855048 3.5710096e-05  0.00018977445
p.d[5,5]   0.53986189  0.017847924 3.5695849e-05  0.00018981070
p.d[6,5]   0.48205422  0.034283504 6.8567007e-05  0.00026596072
p.d[7,5]   0.46696401  0.018251535 3.6503070e-05  0.00019462926
p.d[1,6]   0.47052258  0.023960118 4.7920236e-05  0.00019399198
p.d[2,6]   0.53986135  0.017848641 3.5697283e-05  0.00018980214
p.d[3,6]   0.53977289  0.017990188 3.5980376e-05  0.00018952144
p.d[4,6]   0.53980435  0.017946829 3.5893658e-05  0.00018982800
p.d[5,6]   0.48392706  0.035406214 7.0812428e-05  0.00029105141
p.d[6,6]   0.46702246  0.018374339 3.6748678e-05  0.00019455522
p.d[7,6]   0.46696963  0.018261490 3.6522980e-05  0.00019463567
p.d[1,7]   0.46696576  0.018254026 3.6508052e-05  0.00019461678
p.d[2,7]   0.46697640  0.018274987 3.6549973e-05  0.00019455917
p.d[3,7]   0.46697667  0.018275249 3.6550497e-05  0.00019454953
p.d[4,7]   0.46705741  0.018455718 3.6911436e-05  0.00019467081
p.d[5,7]   0.46705129  0.018447161 3.6894321e-05  0.00019497738
p.d[6,7]   0.46697721  0.018276100 3.6552200e-05  0.00019456643
p.d[7,7]   0.53986189  0.017847924 3.5695849e-05  0.00018981070
p.d[1,8]   0.46698011  0.018286198 3.6572395e-05  0.00019461284
p.d[2,8]   0.53985380  0.017859298 3.5718596e-05  0.00018973543
p.d[3,8]   0.53984823  0.017866598 3.5733195e-05  0.00018978437
p.d[4,8]   0.53986138  0.017848315 3.5696630e-05  0.00018979064
p.d[5,8]   0.53986189  0.017847924 3.5695849e-05  0.00018981070
p.d[6,8]   0.53506013  0.024670711 4.9341423e-05  0.00020340853
p.d[7,8]   0.46696430  0.018252081 3.6504161e-05  0.00019462976

2. Quantiles for each variable:

                 2.5%           5%         7.5%         50%        92.5%          95%        97.5%
n.d[1]   531.41590000 544.60195000 553.39700000 616.4030000 685.03715000 695.23725000 711.11002500
n.d[2]   480.66900000 493.86600000 502.56992500 569.1145000 643.71707500 654.98310000 671.66112500
n.d[3]   505.06297500 517.90285000 526.52092500 589.6265000 657.76122500 668.10905000 684.04105000
n.d[4]   490.62097500 503.24400000 511.42585000 570.5700000 634.60400000 644.16515000 659.35600000
n.d[5]   508.42085000 521.18200000 529.65685000 592.0350000 659.13807500 669.06815000 684.94607500
n.d[6]   454.19685000 466.30600000 474.11185000 531.3365000 592.71907500 601.66505000 616.16505000
n.d[7]   661.58297500 678.12000000 688.88600000 768.1255000 854.12000000 867.20200000 887.44022500
n.d[8]   491.64300000 503.76895000 511.75770000 570.5625000 634.25037500 643.84530000 658.74712500
p.d[1,1]   0.43171900   0.43726695   0.44087400   0.4668420   0.49376515   0.49793005   0.50440102 # L (annotation added, based upon 90% ROPE)
p.d[2,1]   0.50507198   0.51066695   0.51423900   0.5397675   0.56567407   0.56949305   0.57530010 # H
p.d[3,1]   0.50507198   0.51066695   0.51423900   0.5397670   0.56567407   0.56949400   0.57530403 # H
p.d[4,1]   0.50507700   0.51067095   0.51424393   0.5397680   0.56567500   0.56949405   0.57530505 # H
p.d[5,1]   0.50507700   0.51067095   0.51424393   0.5397680   0.56567500   0.56949405   0.57530505 # H
p.d[6,1]   0.50467098   0.51042200   0.51406100   0.5397270   0.56565300   0.56947100   0.57528105 # H
p.d[7,1]   0.43169000   0.43725400   0.44085300   0.4667750   0.49337507   0.49735610   0.50333502 # L
p.d[1,2]   0.43763497   0.44366300   0.44778292   0.4840870   0.55426208   0.55911900   0.56622700 # ?
p.d[2,2]   0.49743380   0.50703895   0.51178693   0.5392690   0.56539407   0.56921905   0.57505005 # H
p.d[3,2]   0.50507200   0.51066700   0.51423993   0.5397680   0.56567500   0.56949405   0.57530505 # H
p.d[4,2]   0.50507498   0.51066895   0.51424200   0.5397680   0.56567500   0.56949405   0.57530505 # H
p.d[5,2]   0.44966797   0.45768795   0.46363085   0.5329200   0.56273908   0.56678210   0.57279805 # ?
p.d[6,2]   0.44086697   0.44724395   0.45163792   0.5115645   0.55827000   0.56270805   0.56929400 # ?
p.d[7,2]   0.43169000   0.43725400   0.44085300   0.4667750   0.49337507   0.49735610   0.50333502 # L
p.d[1,3]   0.43188898   0.43742890   0.44106700   0.4672490   0.49598600   0.50111505   0.51220807 # ?
p.d[2,3]   0.43727995   0.44340700   0.44758900   0.4851160   0.55483400   0.55965005   0.56671503 # ?
p.d[3,3]   0.50497697   0.51061900   0.51420500   0.5397590   0.56567007   0.56949105   0.57529803 # H
p.d[4,3]   0.50507700   0.51067095   0.51424393   0.5397680   0.56567500   0.56949405   0.57530505 # H
p.d[5,3]   0.50496200   0.51061295   0.51419900   0.5397590   0.56567007   0.56949205   0.57529902 # H
p.d[6,3]   0.45938490   0.47087995   0.48200077   0.5368350   0.56427400   0.56812500   0.57405400 # ?
p.d[7,3]   0.43169000   0.43725400   0.44085300   0.4667750   0.49337600   0.49735815   0.50333602 # L
p.d[1,4]   0.43219398   0.43779300   0.44145193   0.4681800   0.50446515   0.52106805   0.54334600 # ?
p.d[2,4]   0.50223098   0.50916800   0.51319000   0.5395200   0.56555000   0.56935905   0.57518002 # H
p.d[3,4]   0.50507200   0.51066700   0.51423900   0.5397670   0.56567500   0.56949400   0.57530403 # H
p.d[4,4]   0.50507700   0.51067095   0.51424393   0.5397680   0.56567500   0.56949405   0.57530505 # H
p.d[5,4]   0.50387897   0.51002200   0.51375192   0.5396430   0.56560200   0.56942900   0.57522407 # H
p.d[6,4]   0.50465598   0.51040895   0.51404600   0.5397200   0.56564607   0.56945910   0.57526508 # H
p.d[7,4]   0.43169000   0.43725400   0.44085300   0.4667750   0.49337507   0.49735610   0.50333502 # L
p.d[1,5]   0.43169295   0.43725600   0.44085700   0.4667840   0.49343608   0.49744300   0.50350002 # L
p.d[2,5]   0.50489700   0.51057200   0.51417385   0.5397510   0.56566607   0.56949005   0.57529803 # H
p.d[3,5]   0.50507395   0.51066795   0.51424185   0.5397680   0.56567407   0.56949400   0.57530505 # H
p.d[4,5]   0.50505597   0.51066000   0.51423693   0.5397670   0.56567407   0.56949405   0.57530505 # H
p.d[5,5]   0.50507700   0.51067095   0.51424393   0.5397680   0.56567500   0.56949405   0.57530505 # H
p.d[6,5]   0.43387095   0.43965395   0.44347900   0.4731010   0.54562530   0.55207000   0.56047902 # ?
p.d[7,5]   0.43169000   0.43725400   0.44085300   0.4667750   0.49337507   0.49735610   0.50333502 # L
p.d[1,6]   0.43213400   0.43773195   0.44137300   0.4679960   0.50205500   0.51410300   0.53880637 # ?
p.d[2,6]   0.50507498   0.51066895   0.51424200   0.5397680   0.56567500   0.56949405   0.57530505 # H
p.d[3,6]   0.50473997   0.51047295   0.51409593   0.5397340   0.56565400   0.56947105   0.57528302 # H
p.d[4,6]   0.50485800   0.51054200   0.51415093   0.5397480   0.56566500   0.56948115   0.57529500 # H
p.d[5,6]   0.43418593   0.43993600   0.44380493   0.4740860   0.54759200   0.55362900   0.56169805 # ?
p.d[6,6]   0.43169498   0.43725795   0.44085793   0.4667890   0.49346908   0.49750900   0.50359002 # L
p.d[7,6]   0.43169000   0.43725495   0.44085393   0.4667770   0.49338308   0.49737400   0.50336422 # L
p.d[1,7]   0.43169000   0.43725400   0.44085393   0.4667760   0.49337800   0.49736310   0.50334002 # L
p.d[2,7]   0.43169000   0.43725400   0.44085393   0.4667785   0.49339607   0.49738805   0.50339505 # L
p.d[3,7]   0.43169000   0.43725400   0.44085393   0.4667790   0.49339707   0.49738900   0.50339700 # L
p.d[4,7]   0.43169295   0.43725600   0.44085693   0.4667970   0.49352900   0.49760405   0.50377202 # L
p.d[5,7]   0.43169097   0.43725600   0.44085600   0.4667950   0.49351608   0.49758705   0.50375402 # L
p.d[6,7]   0.43169000   0.43725400   0.44085393   0.4667790   0.49339907   0.49738905   0.50339802 # L
p.d[7,7]   0.50507700   0.51067095   0.51424393   0.5397680   0.56567500   0.56949405   0.57530505 # H
p.d[1,8]   0.43169097   0.43725500   0.44085493   0.4667785   0.49340107   0.49739305   0.50340800 # L
p.d[2,8]   0.50504300   0.51065295   0.51423000   0.5397650   0.56567100   0.56949205   0.57529902 # H
p.d[3,8]   0.50501800   0.51064000   0.51422000   0.5397620   0.56567100   0.56949205   0.57529902 # H
p.d[4,8]   0.50507695   0.51066995   0.51424200   0.5397680   0.56567500   0.56949405   0.57530505 # H
p.d[5,8]   0.50507700   0.51067095   0.51424393   0.5397680   0.56567500   0.56949405   0.57530505 # H
p.d[6,8]   0.46607197   0.48264175   0.49953970   0.5379310   0.56479108   0.56863805   0.57455000 # ?
p.d[7,8]   0.43169000   0.43725400   0.44085300   0.4667750   0.49337600   0.49735815   0.50333602 # L

##################
# End of Summary #
##################
Producing 129 plots for 64 variables to file
Note that I have added interpretive annotations to the right of the p.d[j, i] lines (probably off-screen in the default view of the blog) to record the 90% ROPE assignments. Again, the rule is: if the ROPE contains 0.50, the day is a “?”; if the ROPE is strictly below 0.50, it is an “L”; and if it is strictly above 0.50, it is an “H”.
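To apply that rule programmatically rather than by eye, something like the following sketch would do, assuming results is the run.jags object from the listing above and classify.day is the hypothetical helper sketched earlier.

# Pull the quantiles used above and classify every p.d[j, i] by the ROPE rule.
q <- summary(as.mcmc(results),
             quantiles=c(0.025, 0.05, 0.075, 0.50, 0.925, 0.95, 0.975))$quantiles
pd.rows <- grep("^p\\.d\\[", rownames(q))               # keep only the p.d[j, i] rows
calls <- mapply(classify.day, q[pd.rows, "5%"], q[pd.rows, "95%"])
names(calls) <- rownames(q)[pd.rows]
calls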