Homepage of Hannes Nickisch

After completing a PhD at the Max Planck Institute for Biological Cybernetics and a postdoc at the Max Planck Institute for Intelligent Systems, I am currently working as a Research Scientist at Philips Research, Hamburg, Germany.

During my studies at the Technical University of Berlin and the Université de Nantes, I did internships with Microsoft Research in Cambridge, Siemens Corporate Research in Princeton and Siemens Medical Systems in Erlangen.

My research interests lie in probabilistic machine learning, pattern recognition, medical image segmentation and biophysical modeling.

My publications are listed below. For questions about my work or to discuss a collaboration, please contact me at contact.hannes@nickisch.org.

Code

gpml

The Gaussian Processes for Machine Learning toolbox.

A library for Gaussian process regression and classification in Octave/Matlab, containing a variety of approximate inference schemes including Laplace's method, expectation propagation and variational Bayes. We also support large-scale approximate inference via the FITC approximation and MCMC sampling.
See the mloss.org project, the JMLR paper and the gpml page.


    cov = {@covSEiso}; sf = 1; ell = 0.4; hyp.cov = log([ell;sf]);  % squared exponential covariance
    lik = 'likLaplace'; sn = 0.2; hyp.lik = log(sn);                % Laplace likelihood with scale sn
    mean = @meanZero;                             % zero mean function; set up GP (cov,lik,mean)
    nlZ = gp(hyp, 'infEP', mean, cov, lik, X, y); % EP inference: negative log marginal likelihood
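
The same gp call also returns predictions once test inputs are supplied, and the hyperparameters can be learnt by minimising nlZ with the minimize routine shipped with the toolbox. A minimal sketch, assuming the training inputs X, targets y and test inputs Xs are already in the workspace (Xs is an assumption, not part of the snippet above):

    hyp = minimize(hyp, @gp, -100, @infEP, mean, cov, lik, X, y);  % learn hyperparameters, max. 100 function evaluations
    [ymu, ys2] = gp(hyp, @infEP, mean, cov, lik, X, y, Xs);        % predictive mean and variance at the test inputs Xs

The string 'infEP' and the handle @infEP are interchangeable in the gpml interface; swapping in @infLaplace or @infVB selects the other approximations mentioned above.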
      

glm-ie

The Generalised Linear Models Inference and Estimation Toolbox.

A library for large-scale matrix-vector multiplication (MVM) based computations in generalised linear models for Octave/Matlab. We support variational Bayes, factorial mean field and expectation propagation as well as MAP estimation using a wide range of penalised least squares solvers for sparse estimation. A dedicated matrix class provides the computational primitives, and a wide range of regularisers is supported.
See the mloss.org project, the JMLR paper and the glm-ie page.


    X = matConv2(f, su, 'circ');                 % convolution matrix
    B = matFD2(su, 'circ');                      % finite difference matrix
    pot = @potLaplace; s2 = 1e-4; tau = 15;      % Laplace potential, variance s2, scale tau
    [m,ga,b,z,nlZ] = dli(X,y,s2,B,pot,tau,opts); % inference
    pen = @(s) penAbs(s);                        % l1-penalty
    [u,phi]  = plsTN(u0,X,y,B,opt,s2,pen);       % estimation
      

gmm

Gaussian mixture modeling with Gaussian process latent variable models and other density estimators.

The toolbox contains code for density estimation using mixtures of Gaussians. Ranging from simple kernel density estimation with spherical and diagonal Gaussian kernels, through manifold Parzen windows, to mixtures of penalised full Gaussians with only a few components, it covers many Gaussian mixture model parametrisations from the recent literature. Most prominently, the package contains code to use the Gaussian process latent variable model for density estimation.
See the mloss.org project and the corresponding DAGM paper or get the code here.


    [lp,lpte] = dkde(z,zte);      % diagonal kernel density estimation
    [lp,lpte] = mpar(z,zte,d,k);  % manifold Parzen windows
    [lp,lpte] = pgau(z,zte);      % penalised Gaussian
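
A minimal usage sketch for the calls above, assuming z and zte hold training and test points as rows; the toy data, the sizes and the row convention are illustrative assumptions, not part of the package documentation:

    z   = randn(200,2);           % toy training points (assumed one point per row)
    zte = randn(50,2);            % toy test points
    [lp,lpte] = dkde(z,zte);      % presumably log densities of the training and test points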
      

fwtn

The fast wavelet transform for tensor data.

The code is a standalone, light-weight C implementation of the orthonormal wavelet transform using quadrature mirror filters, including a Matlab/MEX wrapper. We fully support D-dimensional data and L decomposition levels. The algorithm's computational complexity is linear in the size of the input.
See the mloss.org project or get the code here.


    qmf = [1,1]/sqrt(2); % Haar wavelet
    L   = 3;             % # levels in the pyramid
    W   = fwtn(X,L,qmf); % apply FWT, inverse: X = fwtn(W,L,qmf,1);
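
A round-trip sketch built only from the interface shown above; the 3-d toy tensor and the assumption that side lengths must be divisible by 2^L are illustrative, not taken from the package documentation:

    X  = randn(8,8,8);            % toy 3-d tensor (side lengths divisible by 2^L)
    W  = fwtn(X,L,qmf);           % forward transform, L levels
    Xr = fwtn(W,L,qmf,1);         % inverse transform via the 4th argument
    max(abs(X(:)-Xr(:)))          % should be numerically zero for an orthonormal qmf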
      

approxXX

A variety of approximate inference methods for Gaussian process prediction.

The code comprises expectation propagation, Laplace's method, the informative vector machine, Gaussian variational mean field, factorial mean field, online expectation propagation, TAP and variational bounding. Note that it is numerically much less robust than the code in gpml; the implementations are meant to illustrate the algorithms as such, not to serve as a black-box system in an applied setting. The functions use the gpml v2.0 Octave/Matlab interface.
See the corresponding JMLR paper or get the code here.


    hyp = [1; 1];       % ell,sig - GP parameters
    cov = {'covSEiso'}; % covariance function
    lik = 'cumGauss';   % logistic or cumGauss likelihood
    apx = 'LA';         % EP,FV,IVM,KL,LA,LR,OLEP,SO,TAP,TAPnaive or VB
    p = binaryGP(hyp, ['approx',apx], cov, lik, x, y, xt); % prediction
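
Since only the apx string changes between methods, the approximations can be compared with a simple loop. A hedged sketch reusing the interface above; what exactly p contains beyond "prediction" is not specified here, so the printed average is only meant as a rough comparison:

    for apx = {'LA','EP','VB'}                    % a few of the approximations listed above
      p = binaryGP(hyp, ['approx',apx{1}], cov, lik, x, y, xt);
      fprintf('%s: mean prediction %.3f\n', apx{1}, mean(p));
    end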
      

Papers

Articles (11)

Blind Multi-Rigid Retrospective Motion Correction of MR Images
A. Loktyushin, H. Nickisch, R. Pohmann and B. Schölkopf, Magnetic Resonance in Medicine, in revision, 2014.

Attribute-Based Classification for Zero Shot Visual Object Categorization [link]
C. Lampert, H. Nickisch and S. Harmeling, IEEE Transactions on Pattern Analysis and Machine Intelligence, 36(3):453-465, 2014.

Generating Anatomical Models of the Heart and the Aorta from Medical Images for Personalized Physiological Simulations [pdf] [link]
J. Weese, A. Groth, H. Nickisch, H. Barschdorf, F.M. Weber, J. Velut, M. Castro, C. Toumoulin, J.L. Coatrieux, M. De Craene, G. Piella, C. Tobón-Gomez, A.F. Frangi, D.C. Barber, I. Valverde, Y. Shi, C. Staicu, A. Brown, P. Beerbaum and D.R. Hose, Medical and Biological Engineering and Computing, 51(11):1209-1219, 2013.

Blind Retrospective Motion Correction of MR Images [pdf] [link]
A. Loktyushin, H. Nickisch, R. Pohmann and B. Schölkopf, Magnetic Resonance in Medicine, 70:1608-1618, 2013.

User-centric Learning and Evaluation of Interactive Segmentation Systems [pdf] [link]
P. Kohli, H. Nickisch, C. Rother and C. Rhemann, International Journal of Computer Vision, 100(3):261-274, 2012.

Generating Feature Spaces for Linear Algorithms with Regularized Sparse Kernel Slow Feature Analysis [pdf] [link]
W. Böhmer, S. Grünewälder, H. Nickisch and K. Obermayer, Machine Learning, 89(1):67-86, 2012.

glm-ie: The Generalised Linear Models Inference and Estimation Toolbox [pdf] [link] [web]
H. Nickisch, Journal of Machine Learning Research, 13:1699-1703, 2012.

Large Scale Bayesian Inference and Experimental Design for Sparse Linear Models [pdf] [link]
M. W. Seeger and H. Nickisch, SIAM Journal on Imaging Sciences, 4(1):166-199, 2011.

Gaussian Processes for Machine Learning (GPML) Toolbox [pdf] [link] [web]
C. E. Rasmussen and H. Nickisch, Journal of Machine Learning Research, 11:3011-3015, 2010.

Optimization of k-Space Trajectories for Compressed Sensing by Bayesian Experimental Design [pdf] [link]
M. W. Seeger, H. Nickisch, R. Pohmann and B. Schölkopf, Magnetic Resonance in Medicine, 63(1):116-126, 2010.

Approximations for Binary Gaussian Process Classification [pdf] [link]
H. Nickisch and C. E. Rasmussen, Journal of Machine Learning Research, 9:2035-2078, 2008.

 

Conference Papers (10)

From Image to Personalized Cardiac Simulation: Encoding Anatomical Structures into a Model-Based Segmentation Framework [pdf] [link] [poster]
H. Nickisch, H. Barschdorf, F. M. Weber, M. W. Krueger, O. Dössel and J. Weese, STACOM, 2012.

Additive Gaussian Processes [pdf] [link]
D. Duvenaud, H. Nickisch and C. E. Rasmussen, NIPS, 2011.

Regularized Sparse Kernel Slow Feature Analysis [pdf] [link]
W. Böhmer, S. Grünewälder, H. Nickisch and K. Obermayer, ECML/PKDD, 2011.

Fast Convergent Algorithms for Expectation Propagation Approximate Bayesian Inference [pdf] [link]
M. W. Seeger and H. Nickisch, AISTATS, 2011.

Learning an interactive segmentation system [pdf] [link]
H. Nickisch, C. Rother, C. Rhemann and P. Kohli, ICVGIP, 2010.
Best paper award.

Gaussian Mixture Modeling with Gaussian Process Latent Variable Models [pdf]
H. Nickisch and C. E. Rasmussen, DAGM, 2010.

Convex variational Bayesian inference for large scale generalized linear models [pdf] [link]
H. Nickisch and M. W. Seeger, ICML, 2009.

Learning To Detect Unseen Object Classes by Between-Class Attribute Transfer [pdf]
C. H. Lampert, H. Nickisch and S. Harmeling, CVPR, 2009.

Bayesian Experimental Design of Magnetic Resonance Imaging Sequences [pdf] [link]
M. W. Seeger, H. Nickisch, R. Pohmann and B. Schölkopf, NIPS, 2008.

Compressed Sensing and Bayesian Experimental Design [pdf] [link]
M. W. Seeger and H. Nickisch, ICML, 2008.

 

Posters (2)

Retrospective blind motion correction of MR images [pdf]
A. Loktyushin, H. Nickisch, R. Pohmann and B. Schölkopf, ISMRM, 2009.

Optimization of k-Space Trajectories by Bayesian Experimental Design [pdf]
M. W. Seeger, H. Nickisch, R. Pohmann and B. Schölkopf, ISMRM, 2009.

 

Technical reports (1)

Multiple Kernel Learning: A Unifying Probabilistic Viewpoint [pdf] [link]
H. Nickisch and M. W. Seeger, arXiv.org, 2011.

 

Theses (2)

Bayesian Inference and Experimental Design for Large Generalised Linear Models [pdf] [link]
H. Nickisch, PhD Thesis, Technische Universität Berlin, Berlin, Germany, 2010.

Extraction of visual features from natural video data using Slow Feature Analysis [pdf]
H. Nickisch, Diploma Thesis, Technische Universität Berlin, Berlin, Germany, 2006.


© 2012-2014 Hannes Nickisch

Design by FullAhead