Mixture of g-priors for Generalized Linear Models
by Merlise Clyde
A new revision of the paper with Yingbo Li on Mixtures of g-priors in Generalized Linear Models is now available:
Li, Yingbo and Clyde, Merlise A. (2016). Mixtures of g-priors in Generalized Linear Models.
Abstract
Mixtures of Zellner’s g-priors have been studied extensively in
linear models and have been shown to have numerous desirable
properties for Bayesian variable selection and model averaging.
Several extensions of g-priors to Generalized Linear Models (GLMs)
have been proposed in the literature; however, the choice of prior
distribution of g and resulting properties for inference have
received considerably less attention. In this paper, we unify
mixtures of g-priors in GLMs by assigning the truncated Compound
Confluent Hypergeometric (tCCH) distribution to
1/(1+g), which encompasses as special cases several mixtures of
g-priors in the literature, such as the hyper-g, Beta-prime,
truncated Gamma, incomplete inverse-Gamma, benchmark, robust,
hyper-g/n, and intrinsic priors. Through an integrated Laplace
approximation, the posterior distribution of 1/(1+g) is in turn a
tCCH distribution, and approximate marginal likelihoods are thus
available analytically, leading to “Compound Hypergeometric
Information Criteria” for model selection. We discuss the local
geometric properties of the g-prior in GLMs and show how the
desiderata for model selection proposed by Bayarri et al., such as
asymptotic model selection consistency, intrinsic consistency, and
measurement invariance, may be used to justify the prior and specific
choices of the hyperparameters. We illustrate inference using
these priors and contrast them to other approaches via
simulation and real data examples. The methodology
is implemented in the R package BAS and freely available on CRAN.
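
For readers who want the object being referred to, Zellner's g-prior in the normal linear model (the setting of the abstract's opening sentence) has the familiar form sketched below; a mixture of g-priors simply places a prior on g rather than fixing it. The notation (model index M, design matrix X_M) is supplied here for context and is not taken from the post.

```latex
% Zellner's g-prior for the coefficients of a linear model M with design X_M;
% a mixture of g-priors replaces a fixed g with a hyperprior p(g).
\beta_M \mid g, \sigma^2 \;\sim\; \mathcal{N}\!\left(0,\; g\,\sigma^2\,(X_M^\top X_M)^{-1}\right),
\qquad g \sim p(g).
```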
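
The BAS package exposes these priors through its GLM fitting interface. The snippet below is a minimal sketch, not code from the paper or this post: the data set (Pima.tr from MASS), the CCH hyperparameter values, and the uniform model prior are illustrative assumptions, and argument names may differ slightly across BAS versions.

```r
## Sketch: Bayesian variable selection for a logistic regression under a
## mixture of g-priors using BAS. Data set and hyperparameters are
## illustrative choices, not taken from the paper.
library(BAS)

data(Pima.tr, package = "MASS")   # assumes MASS is installed

fit <- bas.glm(type ~ .,
               data       = Pima.tr,
               family     = binomial(),
               betaprior  = CCH(alpha = 0.5, beta = nrow(Pima.tr), s = 0),
               modelprior = uniform())

summary(fit)   # posterior model and marginal inclusion probabilities
```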