Now that we have formed our prior beliefs, we’ll summarize the posterior by computing the density on a fine grid of points. The functions mycontour and simcontour in the LearnBayes package are helpful here.
After loading LearnBayes, the logarithm of the logistic posterior is computed by the function logisticpost. This function takes two arguments: the vector theta of regression coefficients and a data matrix of the form [x n s], where x is the vector of covariates, n is the vector of sample sizes, and s is the vector of successes. When we use the conditional means prior (described in the previous post), the prior information can be written in that same form, so we simply augment the data with the prior “data” and pass the combined matrix to logisticpost.
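For orientation, here is a hedged sketch of the setup. The model underlying logisticpost ties the success probability at covariate value x to the coefficients through the inverse logit; the helper invlogit below is defined only for illustration and is not part of LearnBayes.

```r
library(LearnBayes)  # provides logisticpost, mycontour, and simcontour

# Success probability at covariate x under the logistic model
# logit p(x) = beta0 + beta1 * x.
# (invlogit is an illustrative helper, not a LearnBayes function.)
invlogit <- function(beta0, beta1, x) {
  eta <- beta0 + beta1 * x
  exp(eta) / (1 + exp(eta))
}
```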
Here is the matrix prior that uses the parameters described in the previous post.
prior = rbind(c(30, 8.25 + 1.19, 8.25),   # x = 30: beta(8.25, 1.19), i.e. 8.25 "successes" in 9.44 "trials"
              c(50, 4.95 + 4.95, 4.95))   # x = 50: beta(4.95, 4.95), i.e. 4.95 "successes" in 9.90 "trials"
We paste the data and prior together in the matrix data.prior.
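A minimal sketch of this step, assuming the observed data already sit in a matrix named data with the same [x n s] column layout (the name data is an assumption; the original data matrix is not shown in this post):

```r
# Both matrices share the [x, n, s] layout, so the prior rows act as
# extra "observations" when the combined matrix is passed to logisticpost.
data.prior <- rbind(data, prior)
```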
After some trial and error, we find that the rectangle
(-2, 12, -0.3, 0.05) brackets the posterior. We draw a contour plot of the posterior using the mycontour function — the arguments are the function defining the log posterior, the vector of limits (xlo, xhi, ylo, yhi) and the data.prior matrix.
mycontour(logisticpost, c(-2, 12, -0.3, 0.05), data.prior, xlab=expression(beta[0]), ylab=expression(beta[1]))
Now that we have bracketed the posterior, we use the simcontour function to take a simulated sample from the posterior.
S=simcontour(logisticpost, c(-2, 12, -0.3, 0.05), data.prior, 1000)
I overlay the simulated points on the contour plot to check that they fall where the posterior has most of its mass, which suggests this is a reasonable sample.
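A sketch of the overlay, assuming S is the list returned by simcontour (its components x and y hold the simulated draws of the two coefficients):

```r
# Redraw the contour plot, then add the simulated draws on top.
mycontour(logisticpost, c(-2, 12, -0.3, 0.05), data.prior,
          xlab = expression(beta[0]), ylab = expression(beta[1]))
points(S$x, S$y)
```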
Let’s illustrate doing some inference. Suppose I’m interested in p(40), the probability of success at 40 yards. This is easy to do using the simulated sample: I compute a value of p(40) from each simulated draw of the regression coefficients.
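A sketch of that computation, assuming S$x and S$y hold the draws of the intercept and slope and that the model is logit p(x) = beta0 + beta1 * x:

```r
# p(40) for each simulated (beta0, beta1) pair
lin <- S$x + 40 * S$y
p40 <- exp(lin) / (1 + exp(lin))
hist(p40)                      # posterior of the success probability at 40 yards
quantile(p40, c(0.05, 0.95))   # a 90% posterior interval summary
```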
I’m pretty confident that the success rate from this distance is between 0.6 and 0.8.