
DTU Findit

Conference paper

Incremental Gaussian Processes

In Advances in Neural Information Processing Systems, 2002
From

Cognitive Systems, Department of Informatics and Mathematical Modeling, Technical University of Denmark

Department of Informatics and Mathematical Modeling, Technical University of Denmark

In this paper, we consider Tipping's relevance vector machine (RVM) and formalize an incremental training strategy as a variant of the expectation-maximization (EM) algorithm that we call subspace EM. Because training works with only a subset of active basis functions, the sparsity of the RVM solution keeps the number of basis functions, and thereby the computational complexity, low.

We also introduce a mean field approach to the intractable classification model that is expected to give a very good approximation to exact Bayesian inference and contains the Laplace approximation as a special case. We test the algorithms on two large data sets with 10^3-10^4 examples. The results indicate that Bayesian learning of large data sets, e.g. the MNIST database, is realistic.
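The active-subset idea summarized in the abstract can be illustrated with a short sketch: EM-style updates of the RVM's per-weight precisions are computed only over the currently active basis functions, and bases whose precision diverges are pruned, so the matrix inversions stay small. This is a minimal illustrative sketch under assumed details (regression with a known noise precision `beta`, a simple pruning threshold, and the helper name `rvm_subspace_em` are all assumptions), not the actual subspace EM algorithm or the classification variant described in the paper.

```python
import numpy as np

def rvm_subspace_em(Phi, t, beta=100.0, n_iters=50, prune_threshold=1e6):
    """EM-style RVM updates restricted to an active subset of basis functions.

    Phi  : (N, M) design matrix (basis functions evaluated at the inputs)
    t    : (N,) regression targets
    beta : assumed known noise precision
    """
    _, M = Phi.shape
    alpha = np.ones(M)              # per-weight prior precisions
    active = np.arange(M)           # start with all basis functions active

    for _ in range(n_iters):
        Phi_a = Phi[:, active]
        A = np.diag(alpha[active])
        # Posterior over the active weights only
        Sigma = np.linalg.inv(beta * Phi_a.T @ Phi_a + A)
        mu = beta * Sigma @ (Phi_a.T @ t)
        # EM update of the precisions: alpha_i = 1 / (mu_i^2 + Sigma_ii)
        alpha[active] = 1.0 / (mu ** 2 + np.diag(Sigma))
        # Prune bases whose precision diverges (weight pinned at zero),
        # keeping the working subspace, and the cost per iteration, small.
        keep = alpha[active] < prune_threshold
        active = active[keep]

    # Assemble the sparse weight vector from the surviving bases
    w = np.zeros(M)
    if active.size:
        Phi_a = Phi[:, active]
        A = np.diag(alpha[active])
        Sigma = np.linalg.inv(beta * Phi_a.T @ Phi_a + A)
        w[active] = beta * Sigma @ (Phi_a.T @ t)
    return w, active
```

A toy usage, with Gaussian basis functions centered on the inputs (hypothetical settings, not from the paper):

```python
rng = np.random.default_rng(0)
x = np.linspace(-5, 5, 200)
t = np.sin(x) + 0.1 * rng.standard_normal(x.size)
Phi = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)   # one basis per input
w, active = rvm_subspace_em(Phi, t)
print(f"{active.size} of {Phi.shape[1]} basis functions remain active")
```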

Language: English
Year: 2002
Types: Conference paper
ORCIDs: Winther, Ole

