Bayesian Thinking

Learning about exponential parameters — part III


The two-parameter exponential sampling problem can be used to illustrate Gibbs sampling.  The joint posterior density for $(\mu, \beta)$ has the form

$g(\mu, \beta \mid y) \propto \beta^n \exp\{-\beta(s - n\mu)\}, \qquad \mu \le \min(y), \ \beta > 0,$

where $n$ is the sample size and $s = \sum y_i$.
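As a quick check on this form, one can code the log posterior directly.  The helper below is only a sketch (the name log.post is mine, not from the post):

log.post=function(mu,beta,y)
{
n=length(y)
s=sum(y)
if (mu > min(y) | beta <= 0) return(-Inf)  # outside the support
n*log(beta)-beta*(s-n*mu)  # log density up to an additive constant
}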

Note that each of the two conditional posteriors has a simple form.

1.  If we fix a value of $\mu$, the conditional posterior of $\beta$ has the gamma form with shape $n+1$ and rate $s - n\mu$.

2.  Turning things around, if we fix a value of $\beta$, the conditional posterior of $\mu$ has the form

$g(\mu \mid \beta, y) \propto \exp\{n\beta\mu\}, \qquad \mu \le \min(y).$

This is the mirror image of an exponential random variable with rate $n\beta$ and location $\min(y)$.  By use of the inversion method, one can simulate a value of $\mu$.
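To make the inversion step explicit (the post leaves this derivation implicit): for $m \le \min(y)$, the conditional cdf is

$P(\mu \le m \mid \beta, y) = \exp\{n\beta(m - \min(y))\},$

so setting this equal to a uniform draw $u$ and solving for $m$ gives $\mu = \min(y) + \log(u)/(n\beta)$, which is exactly the update used in the code below.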

Here is a short function to implement Gibbs sampling for this example.  The inputs are the data vector y and the number of Gibbs sampling cycles iter.  One cycle of Gibbs sampling is accomplished by the two lines

beta=rgamma(1,shape=n+1,rate=s-n*mu)  # simulates from [beta | mu, y]
mu=log(runif(1))/(beta*n)+min(y)  # simulates from [mu | beta, y]

gibbs.exp=function(y,iter)
{
n=length(y)
s=sum(y)
theta=array(0,c(iter,2))  # storage for the (mu, beta) draws
mu=min(y)  # start the chain at mu's upper bound
for (j in 1:iter)
{
beta=rgamma(1,shape=n+1,rate=s-n*mu)  # draw from [beta | mu, y]
mu=log(runif(1))/(beta*n)+min(y)  # draw from [mu | beta, y] by inversion
theta[j,]=c(mu,beta)
}
theta
}

I ran this for 10,000 iterations and the MCMC chain appeared to mix well.
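For concreteness, a run along these lines might look as follows; the data here are simulated, since the original data are not shown (the values mu = 2 and beta = 0.5 and the seed are arbitrary):

set.seed(1234)
y=rexp(20,rate=0.5)+2  # toy sample from a two-parameter exponential
fit=gibbs.exp(y,10000)  # 10,000 cycles of Gibbs sampling
apply(fit,2,quantile,probs=c(0.05,0.5,0.95))  # posterior summaries for (mu, beta)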