Bayesian Thinking

Learning about an exponential rate and location


To illustrate MCMC fitting for a two-parameter problem, suppose one observes a sample $y_1, \ldots, y_n$ from an exponential density with location $\mu$ and rate $\beta$:

$$f(y \mid \mu, \beta) = \beta \exp(-\beta (y - \mu)), \quad y \ge \mu, \; \beta > 0.$$

If we assume a uniform prior on $\theta$, then the posterior density is given by

$$g(\theta \mid y) \propto \beta^n \exp\left(-\beta \sum_{i=1}^n (y_i - \mu)\right), \quad \mu \le \min_i y_i, \; \beta > 0,$$

where $\theta = (\mu, \beta)$.

Suppose we observe the following data:

[1]  5.1  5.3  7.1  6.5  3.2  5.2  6.4  9.2 11.4  3.1  3.0  9.4  3.6  5.7  5.0
[16]  3.6  3.1 11.5  3.5  5.5

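To follow along, the data shown above can be entered into an R session as follows (the variable name y matches its use in the function calls below):

```r
# the 20 observed values listed above
y = c(5.1, 5.3, 7.1, 6.5, 3.2, 5.2, 6.4, 9.2, 11.4, 3.1,
      3.0, 9.4, 3.6, 5.7, 5.0, 3.6, 3.1, 11.5, 3.5, 5.5)
min(y)   # the posterior for mu is supported on mu <= min(y)
```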

1.  First we write a function logpost that computes the log posterior of $\theta = (\mu, \beta)$.  Since $\beta > 0$, we have to be careful when negative values are input for $\beta$.  In the ifelse function, we assign a very large negative value to the log posterior when $\beta \le 0$.

logpost = function(theta, y)
  ifelse(theta[2] > 0, sum(dexp(y - theta[1], theta[2], log = TRUE)), -9e200)
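As a quick sanity check, we can evaluate the log posterior at a legal and an illegal value of $\theta$ (this sketch repeats the data and logpost definitions so it runs on its own):

```r
# data and log posterior from above
y = c(5.1, 5.3, 7.1, 6.5, 3.2, 5.2, 6.4, 9.2, 11.4, 3.1,
      3.0, 9.4, 3.6, 5.7, 5.0, 3.6, 3.1, 11.5, 3.5, 5.5)
logpost = function(theta, y)
  ifelse(theta[2] > 0, sum(dexp(y - theta[1], theta[2], log = TRUE)), -9e200)

logpost(c(2, .3), y)     # finite: a legal value of theta
logpost(c(2, -.3), y)    # -9e200: beta <= 0 is effectively ruled out
```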

2.  Using the mycontour function in the LearnBayes package, we construct a contour plot of the posterior; we add a vertical line at min(y) to show the boundary of the support for $\mu$.

mycontour(logpost, c(1.5, 3.5, .1, .8), y, xlab="MU", ylab="BETA", col="red")
abline(v=min(y))
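For readers without LearnBayes installed, a similar picture can be drawn with base R.  Here is a sketch (again repeating the data and logpost definitions); the log-scale levels below correspond to 10%, 1%, and 0.1% of the mode height, which I believe matches mycontour's convention:

```r
# data and log posterior from above
y = c(5.1, 5.3, 7.1, 6.5, 3.2, 5.2, 6.4, 9.2, 11.4, 3.1,
      3.0, 9.4, 3.6, 5.7, 5.0, 3.6, 3.1, 11.5, 3.5, 5.5)
logpost = function(theta, y)
  ifelse(theta[2] > 0, sum(dexp(y - theta[1], theta[2], log = TRUE)), -9e200)

mu   = seq(1.5, 3.5, length.out = 100)
beta = seq(.1, .8, length.out = 100)
z = outer(mu, beta, Vectorize(function(m, b) logpost(c(m, b), y)))
z[!is.finite(z)] = -1e10   # clamp -Inf (mu > min(y)) so contour() can interpolate
contour(mu, beta, z, levels = max(z) - c(2.3, 4.6, 6.9),
        xlab = "MU", ylab = "BETA", col = "red", drawlabels = FALSE)
abline(v = min(y))
```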

3.  We'll try using a random walk Metropolis algorithm, where a candidate value is found by

$$\theta^* = \theta^t + \mathrm{scale} \times Z,$$

where $Z$ is normal with mean 0 and variance-covariance matrix $V$.  Here it is not quite obvious how to choose $V$ and scale, since a normal approximation to the posterior won't work here.  Looking at the graph, it seems that most of the probability for $\mu$ is in (2, 3) and most of the probability for $\beta$ is in (0.1, 0.6).  If we use the rough rule of thumb "4 sd = range of data", I get estimates of 0.25 and 0.125 for the two standard deviations, and these can be squared to get estimates of the variances.  I obtain the estimate $V$ = diag(.06, .015).  After some trial and error, it seems that the value scale = 1 works ok; the acceptance rate is 25%.
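The rule-of-thumb arithmetic is simple to verify, taking the ranges (2, 3) and (0.1, 0.6) read off the contour plot:

```r
sd_mu   = (3 - 2) / 4       # "4 sd = range" for mu in (2, 3)
sd_beta = (0.6 - 0.1) / 4   # "4 sd = range" for beta in (0.1, 0.6)
c(sd_mu^2, sd_beta^2)       # variances 0.0625 and 0.015625, roughly diag(.06, .015)
```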
Here are the arguments for the function rwmetrop; I show the simulated draws on top of the contour plot.  It looks like a reasonable sample from the posterior.
fit = rwmetrop(logpost, list(var=diag(c(.06, .015)), scale=1), c(2, .3), 10000, y)
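For readers without the LearnBayes package, here is a minimal hand-rolled version of the random walk Metropolis recipe described above.  This is a sketch, not the actual rwmetrop implementation, and rwmetrop_sketch is a made-up name; the data and logpost definitions are repeated so the block runs on its own:

```r
set.seed(123)

# data and log posterior from above
y = c(5.1, 5.3, 7.1, 6.5, 3.2, 5.2, 6.4, 9.2, 11.4, 3.1,
      3.0, 9.4, 3.6, 5.7, 5.0, 3.6, 3.1, 11.5, 3.5, 5.5)
logpost = function(theta, y)
  ifelse(theta[2] > 0, sum(dexp(y - theta[1], theta[2], log = TRUE)), -9e200)

rwmetrop_sketch = function(logf, V, scale, start, m, y) {
  p = length(start)
  R = chol(scale^2 * V)        # so that z %*% R has covariance scale^2 * V
  draws = matrix(0, m, p)
  cur = start
  lcur = logf(cur, y)
  accept = 0
  for (i in 1:m) {
    cand = cur + drop(rnorm(p) %*% R)    # candidate = current + scale * Z
    lcand = logf(cand, y)
    if (log(runif(1)) < lcand - lcur) {  # Metropolis acceptance step
      cur = cand; lcur = lcand; accept = accept + 1
    }
    draws[i, ] = cur                     # record current state (repeats on rejection)
  }
  list(par = draws, accept = accept / m)
}

fit = rwmetrop_sketch(logpost, diag(c(.06, .015)), 1, c(2, .3), 10000, y)
fit$accept    # acceptance rate, roughly 25%
```

Every simulated value of $\mu$ stays at or below min(y), since any candidate with $\mu > \min(y)$ has log posterior $-\infty$ and is always rejected.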