Shrinkage with shrunken shoulders: Gibbs sampling shrinkage model posteriors with guaranteed convergence rates.
Bayesian Anal; 18(2): 367-390, 2023 Jun.
Article
En
| MEDLINE
| ID: mdl-38770434
ABSTRACT
Use of continuous shrinkage priors - with a "spike" near zero and heavy tails towards infinity - is an increasingly popular approach to induce sparsity in parameter estimates. When the parameters are only weakly identified by the likelihood, however, the posterior may end up with tails as heavy as the prior, jeopardizing the robustness of inference. A natural solution is to "shrink the shoulders" of a shrinkage prior by lightening up its tails beyond a reasonable parameter range, yielding a regularized version of the prior. We develop a regularization approach which, unlike previous proposals, preserves the computationally attractive structures of original shrinkage priors. We study theoretical properties of the Gibbs sampler on the resulting posterior distributions, with emphasis on convergence rates of the Pólya-Gamma Gibbs sampler for sparse logistic regression. Our analysis shows that the proposed regularization leads to geometric ergodicity under a broad range of global-local shrinkage priors. Essentially, the only requirement is for the prior π_local(·) on the local scale λ to satisfy π_local(0) < ∞. If π_local(·) further satisfies lim_{λ→0} π_local(λ)/λ^a < ∞ for some a > 0, as in the case of Bayesian bridge priors, we show the sampler to be uniformly ergodic.