
Learning about a correlation — part III

In Chapter 5, we discuss the SIR (sampling importance resampling) method of simulating from a posterior density g(\theta).  As in rejection sampling, we find a proposal density p(\theta) that is easy to simulate from and whose support covers that of the posterior density of interest.  Here’s the algorithm:

1.  We simulate a sample \{\theta_j\}  from the proposal density p(\theta).

2.  We compute weights \{w(\theta_j) = g(\theta_j)/p(\theta_j)\}.

3.  We resample from the \{\theta_j\} with replacement with weights proportional to \{w(\theta_j)\}.

The resampled values are approximately distributed according to the density of interest g(\theta).
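The three steps above can be sketched in Python (a generic illustration of the algorithm, not the LearnBayes code; the function names and the toy target/proposal pair are hypothetical):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def sir(log_g, log_p, draw_p, n):
    """Sampling importance resampling: n approximate draws from g."""
    theta = draw_p(n)                    # 1. simulate from the proposal p
    log_w = log_g(theta) - log_p(theta)  # 2. weights w = g/p (log scale
    w = np.exp(log_w - log_w.max())      #    for numerical stability)
    probs = w / w.sum()
    # 3. resample with replacement, probabilities proportional to w
    return rng.choice(theta, size=n, replace=True, p=probs)

# Toy illustration: target g = N(0, 1), proposal p = t with 4 df
draws = sir(
    log_g=lambda th: stats.norm.logpdf(th),
    log_p=lambda th: stats.t.logpdf(th, df=4),
    draw_p=lambda n: rng.standard_t(4, size=n),
    n=10_000,
)
```

The weights are computed on the log scale and shifted by their maximum before exponentiating, since the posterior is typically known only up to a constant and the raw ratios can under- or overflow.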

The function sir in the LearnBayes package implements this algorithm when the proposal density is a t density with arbitrary location, variance, and degrees of freedom.

For this example, we showed that a Normal(0.50, 0.009) density was a reasonable approximation to the posterior, so we use a t proposal density with location 0.50, variance 0.009, and 4 degrees of freedom.  We decide to simulate 10,000 values.

The sir function’s arguments are similar to those of rejectsampling: a function that defines the log posterior, the parameters of the t proposal, the number of simulated draws, and the data used by the log posterior function.
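A rough Python analogue of that computation, using the Normal(0.50, 0.009) approximation from above as a stand-in for the actual log posterior (treating 0.009 as the variance of the t proposal, rather than its squared scale, is an assumption):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical stand-in for the correlation's log posterior:
# the Normal(0.50, 0.009) approximation mentioned in the text.
def log_post(theta):
    return stats.norm.logpdf(theta, loc=0.50, scale=np.sqrt(0.009))

# t proposal: location 0.50, variance 0.009, 4 degrees of freedom.
# A t with df degrees of freedom has variance scale^2 * df / (df - 2),
# so scale = sqrt(var * (df - 2) / df).
df, loc, var = 4, 0.50, 0.009
scale = np.sqrt(var * (df - 2) / df)

n = 10_000
theta = loc + scale * rng.standard_t(df, size=n)                  # step 1
log_w = log_post(theta) - stats.t.logpdf(theta, df, loc, scale)   # step 2
w = np.exp(log_w - log_w.max())
draws = rng.choice(theta, size=n, replace=True, p=w / w.sum())    # step 3
```

As in the text, the result is a vector of 10,000 draws whose density should track the Normal(0.50, 0.009) target.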



The output is a vector of 10,000 values.  A density plot of the draws resembles the display in the previous posting.



Categories: Bayesian computation