## Learning from the extremes – part 2

In the last post, I described a problem with selected data. You observe the speeds of 10 cars but record only the minimum speed of 52 and the maximum speed of 84. We want to learn about the mean and standard deviation of the underlying normal distribution.

We’ll work with the parameterization (μ, log σ), which will give us a better normal approximation. A standard noninformative prior is uniform on (μ, log σ).
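For concreteness, the sampling density we need is a standard order-statistics result: the joint density of the minimum $x_{(1)}$ and maximum $x_{(n)}$ of a normal$(\mu, \sigma)$ sample of size $n$ is

$$
f(x_{(1)}, x_{(n)} \mid \mu, \sigma) \;=\; n(n-1)\,\phi(x_{(1)})\,\phi(x_{(n)})\,\bigl[\Phi(x_{(n)}) - \Phi(x_{(1)})\bigr]^{n-2},
$$

where $\phi$ and $\Phi$ denote the normal$(\mu, \sigma)$ density and cdf. Under the flat prior on (μ, log σ), the log posterior is the log of this expression, up to an additive constant that does not depend on the parameters.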

1. First I write a short function minmaxpost that computes the logarithm of the posterior density. The arguments to this function are theta = (μ, log σ) and data, a list with components n, min, and max. I’d recommend using the R functions pnorm and dnorm in computing the density, since it saves typing errors.

```r
minmaxpost <- function(theta, data) {
  mu <- theta[1]
  sigma <- exp(theta[2])
  dnorm(data$min, mu, sigma, log = TRUE) +
    dnorm(data$max, mu, sigma, log = TRUE) +
    (data$n - 2) * log(pnorm(data$max, mu, sigma) - pnorm(data$min, mu, sigma))
}
```
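As a quick sanity check (a sketch, repeating the function and data so the snippet stands alone), the log posterior should be much higher near the eventual mode than at an implausible value of μ:

```r
# minmaxpost as defined above, repeated so this snippet is self-contained
minmaxpost <- function(theta, data) {
  mu <- theta[1]
  sigma <- exp(theta[2])
  dnorm(data$min, mu, sigma, log = TRUE) +
    dnorm(data$max, mu, sigma, log = TRUE) +
    (data$n - 2) * log(pnorm(data$max, mu, sigma) - pnorm(data$min, mu, sigma))
}

data <- list(n = 10, min = 52, max = 84)
minmaxpost(c(68, 2.3), data)   # near the mode: relatively high
minmaxpost(c(50, 2.3), data)   # implausible mu: much lower
```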

2. Then I use the function laplace in the LearnBayes package to summarize this posterior. The arguments to laplace are the name of the log posterior function, an initial estimate at theta, and the data that is used in the log posterior function.

```r
library(LearnBayes)
data <- list(n = 10, min = 52, max = 84)
fit <- laplace(minmaxpost, c(70, 2), data)
```
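If you don’t have LearnBayes handy, the same computation can be sketched with base R’s optim: minimize the negative log posterior, and invert the Hessian at the minimum to estimate the variance-covariance matrix. (This repeats minmaxpost so the snippet stands alone; laplace itself works along these lines, though its exact internals may differ.)

```r
# Self-contained sketch of a Laplace fit using base R's optim
minmaxpost <- function(theta, data) {
  mu <- theta[1]
  sigma <- exp(theta[2])
  dnorm(data$min, mu, sigma, log = TRUE) +
    dnorm(data$max, mu, sigma, log = TRUE) +
    (data$n - 2) * log(pnorm(data$max, mu, sigma) - pnorm(data$min, mu, sigma))
}
data <- list(n = 10, min = 52, max = 84)

nlp <- function(theta, data) -minmaxpost(theta, data)   # negative log posterior
fit2 <- optim(c(70, 2), nlp, data = data, method = "BFGS", hessian = TRUE)

fit2$par              # approximate posterior mode of (mu, log sigma)
solve(fit2$hessian)   # approximate variance-covariance matrix
```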

3. The output of laplace includes mode, the posterior mode, and var, the corresponding estimate of the variance-covariance matrix.

```r
fit
$mode
[1] 67.999960  2.298369

$var
              [,1]          [,2]
[1,]  1.920690e+01 -1.900688e-06
[2,] -1.900688e-06  6.031533e-02
```
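One immediate use of this output is to simulate from the bivariate normal approximation. Here is a base-R sketch (no extra packages) that hardcodes the mode and variance values printed above and draws via a Cholesky factor of the covariance matrix:

```r
# Sketch: simulate (mu, log sigma) from the normal approximation, base R only.
# Mode and variance values are taken from the laplace() output above.
mode <- c(67.999960, 2.298369)
V <- matrix(c(1.920690e+01, -1.900688e-06,
              -1.900688e-06, 6.031533e-02), 2, 2)

set.seed(1)
R <- chol(V)                              # V = t(R) %*% R
z <- matrix(rnorm(2 * 5000), ncol = 2)    # standard normal draws
theta <- sweep(z %*% R, 2, mode, "+")     # shift to the posterior mode

mu <- theta[, 1]
sigma <- exp(theta[, 2])                  # back-transform log sigma
quantile(mu, c(0.025, 0.975))             # approximate interval for mu
quantile(sigma, c(0.025, 0.975))          # approximate interval for sigma
```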

4. I demonstrate below that we obtain a pretty good approximation in this situation. I use the mycontour function to display contours of the exact posterior and overlay the matching normal approximation using a second application of mycontour.

```r
mycontour(minmaxpost, c(45, 95, 1.5, 4), data,
          xlab = expression(mu), ylab = expression(paste("log ", sigma)))
mycontour(lbinorm, c(45, 95, 1.5, 4), list(m = fit$mode, v = fit$var),
          add = TRUE, col = "red")
```