Subspace-Induced Gaussian Processes.
We present a new Gaussian process (GP) regression model in which the covariance kernel is indexed, or parameterized, by a sufficient dimension reduction subspace of a reproducing kernel Hilbert space. The covariance kernel is low-rank while still capturing the statistical dependence of the response on the covariates; this affords a significant improvement in computational efficiency as well as a potential reduction in the variance of predictions. We develop a fast Expectation-Maximization algorithm for estimating the parameters of the subspace-induced Gaussian process (SIGP). Extensive results on real data show that SIGP can outperform the standard full GP even with a low-rank, $m \leq 3$, inducing subspace.
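To illustrate the core idea of a low-rank, subspace-restricted covariance, here is a minimal sketch of GP regression where an RBF kernel is evaluated on covariates projected onto a rank-$m$ subspace. This is an illustration only, not the paper's SIGP: the projection matrix `B` is fixed at random here, whereas SIGP estimates the inducing subspace via its EM algorithm, and the kernel form is an assumption.

```python
import numpy as np

def projected_rbf_kernel(X1, X2, B, lengthscale=1.0):
    # RBF kernel evaluated on covariates projected onto the column
    # space of B (d x m): dependence on the inputs is restricted to
    # a rank-m subspace, as in a sufficient dimension reduction.
    Z1, Z2 = X1 @ B, X2 @ B
    d2 = ((Z1[:, None, :] - Z2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X_train, y_train, X_test, B, noise=1e-2):
    # Standard GP posterior mean and variance with the projected kernel.
    K = projected_rbf_kernel(X_train, X_train, B) + noise * np.eye(len(X_train))
    Ks = projected_rbf_kernel(X_test, X_train, B)
    Kss = projected_rbf_kernel(X_test, X_test, B)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

rng = np.random.default_rng(0)
d, m = 10, 2                      # ambient dimension; subspace rank m <= 3
B = rng.standard_normal((d, m))   # hypothetical fixed subspace (SIGP learns this)
X = rng.standard_normal((50, d))
y = np.sin(X @ B).sum(axis=1) + 0.05 * rng.standard_normal(50)
mean, var = gp_predict(X[:40], y[:40], X[40:], B)
```

Because the kernel only sees the $m$-dimensional projections, the effective model complexity is governed by the subspace rank rather than the ambient dimension, which is the source of the computational and variance benefits the abstract describes.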