Bayesian nonparametric mixture of experts for inverse problems - STATIFY
Preprint, Working Paper. Year: 2023


Abstract

Large classes of problems can be formulated as inverse problems, where the goal is to find parameter values that best explain some observed measurements. The relationship between parameters and observations is typically highly non-linear, with relatively high-dimensional observations and correlated multidimensional parameters. To deal with these constraints via inverse regression strategies, we consider the Gaussian Locally Linear Mapping (GLLiM) model, a special instance of mixture of experts models. We propose a general scheme to design a Bayesian nonparametric GLLiM model, avoiding any commitment to an arbitrary number of experts. A tractable estimation algorithm is designed using variational Bayesian expectation-maximisation. We establish posterior consistency for the number of mixture components after post-processing with the merge-truncate-merge algorithm. Illustrations on simulated data show good results in terms of recovering the true number of experts and the regression function.
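As a hedged illustration (a sketch not taken verbatim from this preprint), the GLLiM model named in the abstract is usually written as a joint Gaussian mixture over the low-dimensional parameter X and the observation Y, with a latent label Z selecting one of K affine experts; the symbols below follow the standard GLLiM convention and are assumptions for illustration only:

```latex
% Standard GLLiM formulation (illustrative; symbol names are the usual convention):
% Z is the latent expert label, X the parameter, Y the observation.
\begin{align}
  p(Z = k) &= \pi_k, \qquad k = 1, \dots, K, \\
  X \mid Z = k &\sim \mathcal{N}(c_k, \Gamma_k), \\
  Y \mid X = x,\, Z = k &\sim \mathcal{N}(A_k x + b_k, \Sigma_k).
\end{align}
```

Under this joint model the inverse conditional p(X | Y = y) is itself a Gaussian mixture available in closed form, which is what makes the inverse regression strategy tractable; the Bayesian nonparametric scheme described in the abstract removes the need to fix K in advance.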
Main file: 240617_BNP_GLLiM.pdf (1.18 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04015203 , version 1 (05-03-2023)
hal-04015203 , version 2 (01-11-2023)
hal-04015203 , version 3 (17-06-2024)

Identifiers

  • HAL Id : hal-04015203 , version 3

Cite

Trungtin Nguyen, Florence Forbes, Julyan Arbel, Hien Duy Nguyen. Bayesian nonparametric mixture of experts for inverse problems. 2023. ⟨hal-04015203v3⟩
474 Views
323 Downloads
