Mixture models with a prior on the number of components.
Miller, Jeffrey W; Harrison, Matthew T.
Affiliation
  • Miller JW; Department of Biostatistics, Harvard University.
  • Harrison MT; Division of Applied Mathematics, Brown University.
J Am Stat Assoc ; 113(521): 340-356, 2018.
Article in English | MEDLINE | ID: mdl-29983475
ABSTRACT
A natural Bayesian approach for mixture models with an unknown number of components is to take the usual finite mixture model with symmetric Dirichlet weights and put a prior on the number of components, that is, to use a mixture of finite mixtures (MFM). The most commonly used method of inference for MFMs is reversible jump Markov chain Monte Carlo, but it can be nontrivial to design good reversible jump moves, especially in high-dimensional spaces. Meanwhile, there are samplers for Dirichlet process mixture (DPM) models that are relatively simple and easily adapted to new applications. It turns out that many of the essential properties of DPMs are also exhibited by MFMs (an exchangeable partition distribution, a restaurant process, a random measure representation, and a stick-breaking representation) and, crucially, the MFM analogues are simple enough that they can be used much like the corresponding DPM properties. Consequently, many of the powerful methods developed for inference in DPMs can be applied directly to MFMs as well; this simplifies the implementation of MFMs and can substantially improve mixing. We illustrate with real and simulated data, including high-dimensional gene expression data used to discriminate cancer subtypes.
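The restaurant-process analogue mentioned in the abstract can be sketched in a few lines. Below is a minimal, illustrative Python sketch of the MFM generative "restaurant process": item n+1 joins an existing cluster c with probability proportional to |c| + gamma, or starts a new cluster with probability proportional to gamma * V_{n+1}(t+1)/V_{n+1}(t), where V_n(t) sums falling/rising factorial terms against the prior p_K on the number of components. The choice of a 1-shifted Poisson(1) prior, the truncation `k_max`, and all function names here are assumptions for illustration, not the paper's reference implementation.

```python
import math
import random

def log_Vn(n, t, gamma=1.0, lam=1.0, k_max=200):
    """Log of the coefficient V_n(t):
    V_n(t) = sum_{k>=1} k_(t) / (gamma*k)^(n) * p_K(k),
    where k_(t) is a falling factorial, (x)^(n) a rising factorial,
    and p_K is (as an assumption) a 1-shifted Poisson(lam) prior on k.
    The infinite sum is truncated at k_max."""
    total = 0.0
    for k in range(max(t, 1), k_max + 1):  # terms with k < t (or k = 0) vanish
        log_falling = sum(math.log(k - j) for j in range(t))
        log_rising = sum(math.log(gamma * k + j) for j in range(n))
        # p_K(k) = Poisson(lam) evaluated at k - 1, so k >= 1 always
        log_pk = (k - 1) * math.log(lam) - lam - math.lgamma(k)
        total += math.exp(log_falling - log_rising + log_pk)
    return math.log(total)

def mfm_restaurant(n, gamma=1.0, lam=1.0, rng=random):
    """Sample a partition of n items from the MFM restaurant process:
    each new item joins existing cluster c with weight |c| + gamma,
    or opens a new cluster with weight
    gamma * V_{i+1}(t+1) / V_{i+1}(t), t = current number of clusters."""
    clusters = []   # cluster sizes
    labels = []     # cluster label of each item
    for i in range(n):
        t = len(clusters)
        weights = [size + gamma for size in clusters]
        new_w = gamma * math.exp(log_Vn(i + 1, t + 1, gamma, lam)
                                 - log_Vn(i + 1, t, gamma, lam))
        weights.append(new_w)
        c = rng.choices(range(t + 1), weights=weights)[0]
        if c == t:
            clusters.append(1)   # open a new cluster
        else:
            clusters[c] += 1     # join an existing cluster
        labels.append(c)
    return labels, clusters
```

Compare with the DPM's Chinese restaurant process, where the new-table weight is a constant alpha: here that weight is replaced by a ratio of V_n coefficients, which is what lets CRP-style samplers be reused for MFMs with only this local change.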
Full text: 1 Collection: 01-international Database: MEDLINE Language: English Journal: J Am Stat Assoc Year: 2018 Document type: Article