DiscoverStatLearn 2010 - Workshop on "Challenging problems in Statistical Learning"

3.3 Importance sampling methods for Bayesian discrimination between embedded models (Jean-Michel Marin)

Update: 2014-12-04
Description

We survey several approaches to the approximation of Bayes factors used in Bayesian model choice and propose a new one. Our focus here is on methods based on importance sampling strategies, rather than variable-dimension techniques like reversible jump MCMC, including: crude Monte Carlo, MLE-based importance sampling, bridge and harmonic mean sampling, Chib's method based on the exploitation of a functional equality, as well as a revisited Savage-Dickey approximation. We demonstrate in this survey how all these methods can be efficiently implemented for testing the significance of a predictive variable in a probit model. Finally, we compare their performances on a real dataset. This is joint work with Christian P. Robert.
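As an illustration of the kind of computation the talk covers, the sketch below estimates a Bayes factor for the significance of a predictive variable in a probit model, using MLE-based importance sampling (one of the methods listed in the abstract). Everything here is an assumption for exposition: the simulated data, the N(0, 10 I) prior, the Gaussian proposal centred at the MLE with the BFGS inverse-Hessian covariance, and the sample sizes are all illustrative choices, not the authors' exact setup.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)

# Hypothetical toy data: probit regression y ~ Phi(X beta).
n = 200
x = rng.normal(size=n)
X_full = np.column_stack([np.ones(n), x])   # intercept + covariate
X_null = X_full[:, :1]                      # intercept only (embedded model)
beta_true = np.array([0.3, 0.8])
y = (rng.random(n) < stats.norm.cdf(X_full @ beta_true)).astype(float)

def log_marginal(X, M=5000, prior_var=10.0):
    """MLE-based importance sampling estimate of the log marginal likelihood."""
    p = X.shape[1]

    def log_lik(beta):
        eta = X @ beta
        return np.sum(y * stats.norm.logcdf(eta)
                      + (1 - y) * stats.norm.logcdf(-eta))

    def log_prior(beta):
        # Illustrative N(0, prior_var * I) prior on the coefficients.
        return stats.multivariate_normal.logpdf(
            beta, np.zeros(p), prior_var * np.eye(p))

    # Gaussian proposal centred at the MLE, with covariance taken from
    # the BFGS inverse-Hessian approximation.
    res = optimize.minimize(lambda b: -log_lik(b), np.zeros(p))
    mle, cov = res.x, res.hess_inv

    draws = rng.multivariate_normal(mle, cov, size=M)
    log_w = (np.array([log_lik(b) + log_prior(b) for b in draws])
             - stats.multivariate_normal.logpdf(draws, mle, cov))
    # log-sum-exp of the importance weights, for numerical stability.
    return np.logaddexp.reduce(log_w) - np.log(M)

# Bayes factor for including the covariate: log B_10 = log m_full - log m_null.
log_B10 = log_marginal(X_full) - log_marginal(X_null)
```

With the simulated slope of 0.8 and n = 200, the estimated log Bayes factor should come out clearly positive, favouring the model that includes the covariate; the proposal being matched to the posterior mode is what keeps the importance weights well behaved here.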

UP1 Service TICE