
Quasi‐Bayes properties of a procedure for sequential learning in mixture models

Fortini, Sandra; Petrone, Sonia
2020

Abstract

Bayesian methods are often optimal, yet increasing pressure for fast computations, especially with streaming data, brings renewed interest in faster, possibly sub-optimal, solutions. The extent to which these algorithms approximate Bayesian solutions is a question of interest, but often unanswered. We propose a methodology to address this question in predictive settings, when the algorithm can be reinterpreted as a probabilistic predictive rule. We specifically develop the proposed methodology for a recursive procedure for online learning in nonparametric mixture models, often referred to as Newton's algorithm. This algorithm is simple and fast; however, its approximation properties are unclear. By reinterpreting it as a predictive rule, we can show that it underlies a statistical model which is, asymptotically, a Bayesian, exchangeable mixture model. In this sense, the recursive rule provides a quasi-Bayes solution. While the algorithm only offers a point estimate, our clean statistical formulation allows us to provide the asymptotic posterior distribution and asymptotic credible intervals for the mixing distribution. Moreover, it gives insights for tuning the parameters, as we illustrate in simulation studies, and paves the way to extensions in various directions. Beyond mixture models, our approach can be applied to other predictive algorithms.
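The recursive procedure discussed in the abstract updates an estimate of the mixing distribution by a convex combination of the current estimate and the conditional ("posterior") distribution of the latent parameter given the new observation. The following is a minimal illustrative sketch, not the authors' implementation: it assumes a Gaussian location kernel, a mixing distribution discretized on a grid, a uniform initial guess, and the common learning-rate choice eps_n = 1/(n+1); all of these choices are assumptions for illustration.

```python
import numpy as np

def newton_recursive_mixing(data, grid, sigma=1.0):
    """Recursive (Newton-style) estimate of the mixing distribution G in a
    mixture f(x) = int k(x | theta) dG(theta), with G discretized on `grid`
    and a Gaussian location kernel k(x | theta) = N(x; theta, sigma^2)."""
    g = np.full(len(grid), 1.0 / len(grid))  # uniform initial estimate G_0
    for n, x in enumerate(data, start=1):
        eps = 1.0 / (n + 1)                  # decreasing weight eps_n
        k = np.exp(-0.5 * ((x - grid) / sigma) ** 2)  # kernel at x over grid
        post = k * g
        post /= post.sum()                   # conditional law of theta given x_n
        g = (1.0 - eps) * g + eps * post     # convex recursive update
    return g

# Usage: data from a two-component Gaussian location mixture
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])
grid = np.linspace(-6.0, 6.0, 121)
g_hat = newton_recursive_mixing(data, grid)
```

Because each step is a convex combination of two probability vectors, the estimate remains a probability distribution at every step, which is what makes the rule interpretable as a probabilistic predictive update.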
Files in this record:

Accettazione.pdf (not available)
Description: Acceptance letter
Type: Attachment for Bocconi evaluation
License: NOT PUBLIC - Private/restricted access
Size: 106.95 kB
Format: Adobe PDF

1902.10708.pdf (not available)
Description: Article
Type: Pre-print document
License: NOT PUBLIC - Private/restricted access
Size: 1.54 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11565/4026855
Citations
  • PMC: not available
  • Scopus: 9
  • Web of Science: 8