Adversarial meta-learning of Gamma-minimax estimators that leverage prior knowledge.
Qiu, Hongxiang; Luedtke, Alex.
Affiliation
  • Qiu H; Department of Statistics, the Wharton School, University of Pennsylvania, Philadelphia, PA, USA.
  • Luedtke A; Department of Statistics, University of Washington, Seattle, WA, USA.
Electron J Stat; 17(2): 1996-2043, 2023.
Article in English | MEDLINE | ID: mdl-38463692
ABSTRACT
Bayes estimators are well known to provide a means to incorporate prior knowledge that can be expressed in terms of a single prior distribution. However, when this knowledge is too vague to express with a single prior, an alternative approach is needed. Gamma-minimax estimators provide such an approach. These estimators minimize the worst-case Bayes risk over a set Γ of prior distributions that are compatible with the available knowledge. Traditionally, Gamma-minimaxity is defined for parametric models. In this work, we define Gamma-minimax estimators for general models and propose adversarial meta-learning algorithms to compute them when the set of prior distributions is constrained by generalized moments. Accompanying convergence guarantees are also provided. We also introduce a neural network class that provides a rich, but finite-dimensional, class of estimators from which a Gamma-minimax estimator can be selected. We illustrate our method in two settings, namely entropy estimation and a prediction problem that arises in biodiversity studies.
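The abstract describes an adversarial scheme: an estimator is chosen to minimize the Bayes risk while an adversary selects a least favorable prior from a moment-constrained set Γ. The sketch below is only an illustration of that minimax structure on a toy problem (estimating a Gaussian mean with squared-error loss), not the paper's algorithm or experiments; the grid of candidate parameter values, the penalty weight enforcing the generalized-moment constraint E[θ] = 0, the hyperparameters, and all variable names are assumptions made for this example.

```python
# Illustrative sketch (not the authors' implementation) of adversarial, alternating
# optimization toward a Gamma-minimax-style estimator, using PyTorch.
# Toy setup: the parameter is the mean of a unit-variance Gaussian, the adversarial prior
# is supported on a fixed grid, and Gamma is encoded by a penalized moment constraint.
import torch

torch.manual_seed(0)
n = 10                                        # sample size per simulated dataset
grid = torch.linspace(-3.0, 3.0, 61)          # candidate parameter values (support of the prior)

estimator = torch.nn.Sequential(              # small network mapping a sample to an estimate
    torch.nn.Linear(n, 16), torch.nn.ReLU(), torch.nn.Linear(16, 1)
)
prior_logits = torch.zeros(len(grid), requires_grad=True)     # adversarial prior over the grid

opt_est = torch.optim.Adam(estimator.parameters(), lr=1e-2)   # minimizes Bayes risk
opt_pri = torch.optim.Adam([prior_logits], lr=1e-1)           # maximizes Bayes risk over Gamma

def bayes_risk():
    """Monte Carlo Bayes risk of the current estimator under the current prior."""
    prior = torch.softmax(prior_logits, dim=0)
    x = grid.unsqueeze(1) + torch.randn(len(grid), n)   # one simulated dataset per grid point
    est = estimator(x).squeeze(1)
    cond_risk = (est - grid) ** 2                       # squared-error risk at each grid point
    return prior @ cond_risk, prior

for step in range(2000):
    # Estimator step: minimize Bayes risk under the current (adversarial) prior.
    risk, _ = bayes_risk()
    opt_est.zero_grad(); risk.backward(); opt_est.step()

    # Prior step: maximize Bayes risk subject to the moment constraint (penalty form).
    risk, prior = bayes_risk()
    penalty = 100.0 * (prior @ grid) ** 2               # softly enforce E[theta] = 0
    loss = -risk + penalty
    opt_pri.zero_grad(); loss.backward(); opt_pri.step()

print("approximate worst-case Bayes risk:", float(risk))
```

In the paper, the estimator is drawn from a rich but finite-dimensional neural network class and the priors live in a moment-constrained set over general models; the sketch only conveys the alternating gradient updates that make the approach "adversarial".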
Keywords

Full text: 1 Collections: 01-international Database: MEDLINE Language: English Year of publication: 2023 Document type: Article