whiskerplot.Rd
Displays whisker plots for the specified parameters on a single plot, with a point at the posterior mean and whiskers extending to the specified quantiles of the posterior distribution.
whiskerplot(x, parameters, quantiles=c(0.025,0.975), zeroline=TRUE, ...)
x: A jagsUI object.
parameters: A vector of names (as characters) of parameters to include in the plot. Parameter names must match parameters included in the model. Calling a non-scalar parameter without subsetting (e.g., alpha) will plot all values of alpha.
quantiles: A vector with two values specifying the lower and upper quantiles.
zeroline: If TRUE, a horizontal line at zero is drawn on the plot.
...: Additional arguments passed to plot.default.
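Conceptually, the display is just each parameter's posterior mean plotted as a point, with a segment spanning the chosen quantiles. A minimal base-R sketch of that idea on simulated draws (an illustration only, not the package's internal code):

#Sketch of a whisker plot built by hand from posterior draws
#(draws are simulated here; with a fitted model they would come from the posterior)
set.seed(123)
draws <- cbind(a = rnorm(1000, 2, 0.5), b = rnorm(1000, -1, 0.3))

means <- colMeans(draws)                                      #points at posterior means
lims  <- apply(draws, 2, quantile, probs = c(0.025, 0.975))   #whisker endpoints

plot(seq_along(means), means, pch = 19, xaxt = "n",
     xlab = "Parameter", ylab = "Value", ylim = range(lims))
axis(1, at = seq_along(means), labels = colnames(draws))
segments(seq_along(means), lims[1, ], seq_along(means), lims[2, ])  #whiskers
abline(h = 0, lty = 2)                                        #equivalent of zeroline=TRUE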
#Analyze Longley economic data in JAGS
#Number employed as a function of GNP
#See ?jags for a more detailed example
#Get data
data(longley)
gnp <- longley$GNP
employed <- longley$Employed
n <- length(employed)
data <- list(gnp=gnp,employed=employed,n=n)
#Identify filepath of model file
modfile <- tempfile()
writeLines("
model{
  #Likelihood
  for (i in 1:n){
    employed[i] ~ dnorm(mu[i], tau)
    mu[i] <- alpha + beta*gnp[i]
  }
  #Priors
  alpha ~ dnorm(0, 0.00001)
  beta ~ dnorm(0, 0.00001)
  sigma ~ dunif(0,1000)
  tau <- pow(sigma,-2)
}
", con=modfile)
#Set parameters to monitor
params <- c('alpha','beta','sigma','mu')
#Run analysis
out <- jags(data = data,
            inits = NULL,
            parameters.to.save = params,
            model.file = modfile,
            n.chains = 3,
            n.adapt = 100,
            n.iter = 1000,
            n.burnin = 500,
            n.thin = 2)
#>
#> Processing function input.......
#>
#> Done.
#>
#> Compiling model graph
#>    Resolving undeclared variables
#>    Allocating nodes
#> Graph information:
#>    Observed stochastic nodes: 16
#>    Unobserved stochastic nodes: 3
#>    Total graph size: 74
#>
#> Initializing model
#>
#> Adaptive phase, 100 iterations x 3 chains
#> If no progress bar appears JAGS has decided not to adapt
#>
#>
  |++++++++++++++++++++++++++++++++++++++++++++++++++| 100%
#>
#> Burn-in phase, 500 iterations x 3 chains
#>
#>
  |**************************************************| 100%
#>
#> Sampling from joint posterior, 500 iterations x 3 chains
#>
#>
  |**************************************************| 100%
#>
#> Calculating statistics.......
#>
#> Done.
#Examine output summary
out
#> JAGS output for model '/tmp/Rtmp848V9S/filefdd3798293a3', generated by jagsUI.
#> Estimates based on 3 chains of 1000 iterations,
#> adaptation = 100 iterations (sufficient),
#> burn-in = 500 iterations and thin rate = 2,
#> yielding 750 total samples from the joint posterior.
#> MCMC ran for 0.001 minutes at time 2024-09-13 14:59:24.610012.
#>
#>            mean    sd   2.5%    50%  97.5% overlap0 f  Rhat n.eff
#> alpha    51.829 0.730 50.441 51.845 53.379    FALSE 1 0.999   750
#> beta      0.035 0.002  0.031  0.035  0.038    FALSE 1 1.000   750
#> sigma     0.717 0.144  0.505  0.700  1.043    FALSE 1 1.000   750
#> mu[1]    59.972 0.335 59.311 59.973 60.658    FALSE 1 0.999   750
#> mu[2]    60.846 0.298 60.248 60.848 61.462    FALSE 1 0.999   750
#> mu[3]    60.798 0.300 60.197 60.799 61.418    FALSE 1 0.999   750
#> mu[4]    61.721 0.265 61.190 61.723 62.278    FALSE 1 0.999   750
#> mu[5]    63.263 0.217 62.827 63.262 63.689    FALSE 1 0.999   750
#> mu[6]    63.890 0.204 63.494 63.888 64.288    FALSE 1 0.999   750
#> mu[7]    64.529 0.196 64.143 64.525 64.918    FALSE 1 1.000   750
#> mu[8]    64.450 0.197 64.062 64.446 64.840    FALSE 1 1.000   750
#> mu[9]    65.644 0.195 65.263 65.642 66.044    FALSE 1 1.000   750
#> mu[10]   66.399 0.204 65.981 66.399 66.813    FALSE 1 1.001   750
#> mu[11]   67.218 0.222 66.785 67.218 67.656    FALSE 1 1.001   750
#> mu[12]   67.280 0.224 66.844 67.281 67.722    FALSE 1 1.001   750
#> mu[13]   68.607 0.266 68.088 68.605 69.142    FALSE 1 1.001   750
#> mu[14]   69.298 0.292 68.735 69.298 69.878    FALSE 1 1.001   750
#> mu[15]   69.839 0.314 69.233 69.841 70.464    FALSE 1 1.001   750
#> mu[16]   71.116 0.370 70.377 71.108 71.840    FALSE 1 1.001   750
#> deviance 33.296 2.879 30.173 32.466 40.443    FALSE 1 1.000   750
#>
#> Successful convergence based on Rhat values (all < 1.1).
#> Rhat is the potential scale reduction factor (at convergence, Rhat=1).
#> For each parameter, n.eff is a crude measure of effective sample size.
#>
#> overlap0 checks if 0 falls in the parameter's 95% credible interval.
#> f is the proportion of the posterior with the same sign as the mean;
#> i.e., our confidence that the parameter is positive or negative.
#>
#> DIC info: (pD = var(deviance)/2)
#> pD = 4.1 and DIC = 37.446
#> DIC is an estimate of expected predictive error (lower is better).
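#Check the f and overlap0 columns directly from the posterior draws stored
#in the jagsUI object's sims.list (illustrates the definitions printed above;
#this is not the package's internal calculation)
beta_draws <- out$sims.list$beta
mean(beta_draws > 0)                        #f: share of draws with the same sign as the mean
ci <- quantile(beta_draws, c(0.025, 0.975)) #95% credible interval
ci[1] < 0 & ci[2] > 0                       #overlap0: TRUE if the interval contains zero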
#Generate whisker plots
#Plot alpha
whiskerplot(out,parameters=c('alpha'))
#Plot all values of mu
whiskerplot(out,parameters='mu')
#Plot a subset of mu
whiskerplot(out,parameters=c('mu[1]','mu[7]'))
#Plot mu and alpha together
whiskerplot(out,parameters=c('mu','alpha'))
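#Plot 80% intervals instead of the default 95%, without the zero line
#(quantile choice here is arbitrary, for illustration; further graphical
#settings could be passed through '...' to plot.default)
whiskerplot(out, parameters=c('beta','sigma'), quantiles=c(0.1,0.9), zeroline=FALSE)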