Normal approximation to posterior

To illustrate a normal approximation to a posterior, let’s return to the fire call example described in the September 6 post.  Here we had Poisson sampling with mean \lambda and a normal prior on \lambda.

1.  First we write a short function lpost that computes the logarithm of the posterior.  The expressions for the log likelihood and the log prior are shown below.  On the original scale the posterior is proportional to the likelihood TIMES the prior, so on the log scale the log posterior is the log likelihood PLUS the log prior.
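Up to additive constants, the expressions coded in lpost below are

\log L(\lambda) = -6\lambda + 8 \log \lambda   (Poisson likelihood; the coefficients correspond to a total count of 8 over an exposure of 6)

\log g(\lambda) = -(\lambda - 3)^2 / 2   (N(3, 1) prior)

and the log posterior is \log L(\lambda) + \log g(\lambda), up to a constant.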

lpost = function(lambda){
  # log likelihood: Poisson model with mean lambda (up to an additive constant)
  loglike = -6*lambda + 8*log(lambda)
  # log of the N(3, 1) prior density for lambda
  logprior = dnorm(lambda, 3, 1, log=TRUE)
  # log posterior = log likelihood + log prior
  loglike + logprior
}

2.  I plot the normalized version of the posterior below.  I first write a short function post that computes the unnormalized posterior, use the integrate function to find the normalizing constant by integrating the density from 0 to 10, and then use the curve function to display the normalized posterior.

post = function(lambda) exp(lpost(lambda))   # unnormalized posterior density
C = integrate(post, 0, 10)                   # normalizing constant
curve(exp(lpost(x))/C$value, 0, 5)           # plot of the normalized posterior
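One can also use integrate to check the normalization and to compute exact summaries of the posterior, such as its mean:

integrate(function(lambda) post(lambda)/C$value, 0, 10)$value          # should be essentially 1
integrate(function(lambda) lambda*post(lambda)/C$value, 0, 10)$value   # exact posterior mean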

3.  There is a useful function laplace in the LearnBayes package that conveniently finds the matching normal approximation to the posterior.  The function inputs are (1) the function defining the log posterior and (2) a guess at the posterior mode.  Typically the log posterior also depends on data and prior parameters, and those would be supplied as an additional input to laplace (here we are not using that extra input).

library(LearnBayes)
fit = laplace(lpost, 1)   # arguments: log posterior function, guess at the mode
fit
$mode
[1] 1.7
$var
          [,1]
[1,] 0.2653809

The important output here is (1) the mode of the posterior and (2) the corresponding approximation to the posterior variance.  From this output, we see that the posterior of \lambda is approximately normal with mean 1.7 and standard deviation \sqrt{0.265} \approx 0.52.
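Roughly speaking, this kind of normal (Laplace) approximation can be reproduced with base R alone: maximize the log posterior numerically and use the inverse of the negative Hessian at the mode as the approximate variance.  A minimal sketch using optim (not the actual code inside laplace) is

nlpost = function(lambda) -lpost(lambda)   # negative log posterior
fit2 = optim(1, nlpost, method="L-BFGS-B", lower=0.001, hessian=TRUE)
fit2$par              # approximate posterior mode
solve(fit2$hessian)   # approximate posterior variance

which should give values close to the mode and variance reported by laplace.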

4.  To check the accuracy of this approximation, we use the curve function to overlay the matching normal density on the graph of the exact posterior.  (I have also added a legend using the legend function.)  Note that the normal approximation is pretty accurate in this particular case.

# overlay the approximating normal density (in red) on the exact posterior
curve(dnorm(x, fit$mode, sqrt(fit$var)), add=TRUE, col="red")
legend("topright", legend=c("Exact", "Normal Approximation"),
       lty = 1, col=c("black", "red"))
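The fitted approximation is also handy for quick probability calculations, which can be compared with exact values obtained by numerical integration.  For example:

qnorm(c(0.05, 0.95), fit$mode, sqrt(fit$var))   # approximate 90% interval for lambda
integrate(post, 2, 10)$value / C$value          # exact P(lambda > 2)
1 - pnorm(2, fit$mode, sqrt(fit$var))           # approximate P(lambda > 2)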

 

Categories: Bayesian computation