Here is another illustration of Bayes factors. We are worried about the possibility of outliers, so we assume the observations are distributed according to a t distribution with location mu, scale sigma, and degrees of freedom df. We have some prior beliefs about the location and scale: we assume mu and sigma are independent, each with its own proper prior density.
We observe the following data:
We define a model that fixes the degrees of freedom parameter df at a specific value. We can then use Bayes factors to decide on appropriate values of df.
I define a function tlikepost that computes the log posterior of the location and scale parameters. Here the argument stuff is a list with two components: the data vector y and the degrees of freedom value df. The log sampling density of the observations is computed by
loglike=sum(dt((y-mu)/sigma, df, log=TRUE)-log(sigma))
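Written out in full, tlikepost might look like the sketch below. The parameterization theta = (mu, log sigma) and the placeholder prior are assumptions on my part, since the text's specific prior choices are not shown here; substitute your own log prior terms (and remember the Jacobian if your prior is placed on sigma rather than log sigma).

```r
# Sketch of tlikepost; theta = (mu, log sigma) is an assumed parameterization.
tlikepost <- function(theta, stuff) {
  y <- stuff$y             # data vector
  df <- stuff$df           # fixed degrees of freedom
  mu <- theta[1]           # location
  sigma <- exp(theta[2])   # scale; the log parameterization keeps it positive
  # t log likelihood: the scale-family density is dt((y - mu)/sigma, df)/sigma
  loglike <- sum(dt((y - mu) / sigma, df, log = TRUE) - log(sigma))
  logprior <- 0  # placeholder: add the log prior for (mu, sigma) here
  loglike + logprior
}
```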
By using the function laplace and the "int" component of its output, we can approximate the log of the marginal density of the data, log f(y). To illustrate, suppose we wish to compute log f(y) for a model that assumes 30 degrees of freedom.
So for the model with df = 30, we have log f(y) = -89.23. To compare this model with, say, a model with a different value of df, we repeat the calculation with the new df, obtain its log marginal density, and then compare the two models by a Bayes factor: the exponential of the difference of the two log marginal densities.
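The full computation might be sketched as follows. The data, the starting value, the competing value df = 4, and the tlikepost definition (repeated here so the snippet is self-contained, with a flat-prior placeholder) are all illustrative assumptions, not taken from the text:

```r
library(LearnBayes)  # provides laplace()

# Assumed version of tlikepost (see the text); flat prior used as a placeholder.
tlikepost <- function(theta, stuff) {
  y <- stuff$y; df <- stuff$df
  mu <- theta[1]; sigma <- exp(theta[2])
  sum(dt((y - mu) / sigma, df, log = TRUE) - log(sigma))
}

y <- rnorm(20, 10, 2)  # placeholder data; substitute the observed data vector

start <- c(median(y), log(sd(y)))  # rough starting value for laplace
fit30 <- laplace(tlikepost, start, list(y = y, df = 30))
fit4  <- laplace(tlikepost, start, list(y = y, df = 4))

fit30$int                  # approximate log f(y) under the df = 30 model
fit4$int                   # approximate log f(y) under the df = 4 model
exp(fit30$int - fit4$int)  # Bayes factor in support of df = 30 over df = 4
```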
Here are some things to try.
1. With this data set, try comparing different values for the degrees of freedom parameter. What you should find is that large df values are supported — this indicates that a normal sampling model is appropriate.
2. Now change the data set by introducing an outlier and repeat the analysis, comparing different df models. What you should find is that a t model with a small degrees of freedom value is supported by this new data set.
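As a starting point for both exercises, one could evaluate the approximate log marginal density over a grid of df values, before and after contaminating the data with an outlier. Everything below (the outlier value, the grid, and the helper logmarg) is illustrative, and it assumes the data vector y and the tlikepost function described in the text are already in the workspace:

```r
library(LearnBayes)

# Hypothetical helper: approximate log f(y) for a given df via laplace()
logmarg <- function(y, df)
  laplace(tlikepost, c(median(y), log(sd(y))), list(y = y, df = df))$int

y.out <- c(y, 40)            # the same data with one gross outlier appended
dfs <- c(2, 4, 10, 30, 100)  # candidate degrees of freedom to compare

sapply(dfs, function(df) logmarg(y, df))      # clean data: large df favored
sapply(dfs, function(df) logmarg(y.out, df))  # outlier data: small df favored
```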