License: Creative Commons Attribution 4.0 International license (CC BY 4.0)
When quoting this document, please refer to the following
DOI: 10.4230/DagSemProc.09181.4
URN: urn:nbn:de:0030-drops-21148
URL: http://dagstuhl.sunsite.rwth-aachen.de/volltexte/2009/2114/
Seeger, Matthias; Nickisch, Hannes
Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models
Abstract
Sparsity is a fundamental concept of modern statistics, and often the only general principle currently available for addressing novel learning applications with many more variables than observations. While much progress has been made recently in the theoretical understanding and algorithmics of sparse point estimation, higher-order problems such as covariance estimation or optimal data acquisition are seldom addressed for sparsity-favouring models, and virtually no algorithms exist for large-scale applications of these problems. We provide novel approximate Bayesian inference algorithms for sparse generalized linear models that can be used with hundreds of thousands of variables and run orders of magnitude faster than previous algorithms in domains where both apply. By analyzing our methods and establishing some novel convexity results, we settle a long-standing open question about variational Bayesian inference for continuous-variable models: the Gaussian lower bound relaxation, which has previously been used for a range of models, is proved to be a convex optimization problem if and only if the posterior mode can be found by convex programming. Our algorithms reduce to the same computational primitives as commonly used sparse estimation methods, but additionally require Gaussian marginal variance estimation. We show how the Lanczos algorithm from numerical mathematics can be employed to compute the latter.
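The paper itself contains no code; purely for orientation, the following is a minimal NumPy sketch of the kind of Lanczos-based marginal variance estimation the abstract alludes to. The function name, the choice of full reorthogonalization, and the breakdown tolerance are our own assumptions, not the authors' implementation. Given only matrix-vector products with a symmetric positive definite matrix A (e.g., a posterior precision matrix), k Lanczos steps yield an orthonormal basis Q and a tridiagonal T, from which diag(Q T^{-1} Q^T) approximates the marginal variances diag(A^{-1}).

```python
import numpy as np

def lanczos_marginal_variances(A_mv, n, k, seed=None):
    """Approximate diag(A^{-1}) for an SPD matrix A, accessed only
    through matrix-vector products A_mv(v), using k Lanczos steps.
    Hypothetical sketch: names and details are illustrative."""
    rng = np.random.default_rng(seed)
    Q = np.zeros((n, k))
    alpha = np.zeros(k)          # tridiagonal: main diagonal of T
    beta = np.zeros(k)           # tridiagonal: off-diagonal of T
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    q_prev, b_prev = np.zeros(n), 0.0
    for j in range(k):
        Q[:, j] = q
        w = A_mv(q) - b_prev * q_prev          # three-term recurrence
        alpha[j] = q @ w
        w -= alpha[j] * q
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)  # full reorthogonalization
        b = np.linalg.norm(w)
        if b < 1e-12:                          # breakdown: Krylov space exhausted
            Q, alpha, beta, k = Q[:, :j + 1], alpha[:j + 1], beta[:j + 1], j + 1
            break
        q_prev, q, b_prev = q, w / b, b
        if j < k - 1:
            beta[j] = b
    # T = tridiag(beta, alpha, beta); estimate diag(Q T^{-1} Q^T)
    lam, U = np.linalg.eigh(np.diag(alpha)
                            + np.diag(beta[:k - 1], 1)
                            + np.diag(beta[:k - 1], -1))
    W = Q @ U                       # Ritz vectors
    return (W ** 2) @ (1.0 / lam)   # sum_j W[i, j]^2 / lam[j]

# Sanity check on a small random SPD matrix (exact inverse is feasible here):
n = 200
B = np.random.default_rng(0).standard_normal((n, n))
A = B @ B.T + n * np.eye(n)
est = lanczos_marginal_variances(lambda v: A @ v, n, k=80, seed=0)
print(np.max(np.abs(est - np.diag(np.linalg.inv(A)))))
```

The estimates grow monotonically towards the true variances as k increases, so even a modest number of Lanczos steps can be informative when n is in the hundreds of thousands and forming A^{-1} explicitly is out of the question.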
Our interest here is in Bayesian experimental design (which is mainly driven by efficient approximate inference), a powerful framework for optimizing the measurement architectures of complex signals such as natural images. Designs optimized within our Bayesian framework strongly outperform choices advocated by compressed sensing theory, and with our novel algorithms the approach scales up to full-size images. Immediate applications of our method lie in digital photography and medical imaging.
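As a rough illustration of how such design optimization can be scored (the paper's actual criterion may differ in detail), a standard choice for appending linear measurements under a Gaussian posterior approximation N(m, Sigma) with noise variance sigma^2 is the information gain log|I + X Sigma X^T / sigma^2|. The function below is a hypothetical sketch under that assumption; candidate measurement rows are stacked in X_cand.

```python
import numpy as np

def information_gain_score(X_cand, Sigma, sigma2):
    """Information gain of appending the measurement rows X_cand,
    given the current Gaussian posterior covariance Sigma and
    Gaussian noise variance sigma2. Illustrative assumption only."""
    M = X_cand @ Sigma @ X_cand.T / sigma2
    _, logdet = np.linalg.slogdet(np.eye(len(X_cand)) + M)
    return logdet
```

Sequential design then greedily appends the candidate (or candidate group) with the highest score, re-running approximate inference after each acquisition; the marginal variance estimates from the Lanczos sketch above are exactly what makes evaluating such scores tractable at scale.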
We have applied our framework to problems of magnetic resonance imaging design and reconstruction, and part of this work appeared at a conference (Seeger et al., 2008). The present paper describes our methods in much greater generality, and most of the theory is novel. Experiments and evaluations will be given in a later paper.
BibTeX Entry
@InProceedings{seeger_et_al:DagSemProc.09181.4,
author = {Seeger, Matthias and Nickisch, Hannes},
title = {{Large Scale Variational Inference and Experimental Design for Sparse Generalized Linear Models}},
booktitle = {Sampling-based Optimization in the Presence of Uncertainty},
series = {Dagstuhl Seminar Proceedings (DagSemProc)},
ISSN = {1862-4405},
year = {2009},
volume = {9181},
editor = {J\"{u}rgen Branke and Barry L. Nelson and Warren Buckler Powell and Thomas J. Santner},
publisher = {Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
address = {Dagstuhl, Germany},
URL = {https://drops.dagstuhl.de/opus/volltexte/2009/2114},
URN = {urn:nbn:de:0030-drops-21148},
doi = {10.4230/DagSemProc.09181.4},
annote = {Keywords: Bayesian experimental design, variational inference, sparse estimation}
}
Keywords: Bayesian experimental design, variational inference, sparse estimation
Collection: 09181 - Sampling-based Optimization in the Presence of Uncertainty
Issue Date: 2009
Date of publication: 30.07.2009