We are currently developing software using the Theano Python library, which enables GPU processing.
The following software packages from our research group are available as free software under the GNU General Public License.
Ladder network code related to the paper:
A. Rasmus, H. Valpola, M. Honkala, M. Berglund, and T. Raiko.
Semi-Supervised Learning with Ladder Networks.
To appear in NIPS 2015.
Preprint available as arXiv:1507.02672 [cs.NE], July 2015.
Code related to the paper:
T. Raiko, L. Yao, K. Cho, and Y. Bengio.
Iterative Neural Autoregressive Distribution Estimator (NADE-k).
Advances in Neural Information Processing Systems (NIPS), 2014.
Our Matlab code for Boltzmann machines and autoencoders can be found on KyungHyun Cho's homepage.
BayesPy provides tools for variational Bayesian inference in Python. It supports conjugate exponential family models. Inference algorithms include variational message passing, Riemannian conjugate gradient learning and stochastic variational inference. The latest release can be installed from PyPI, and the development version from GitHub. Online documentation is available at bayespy.org.
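To give a flavour of what variational inference of this kind computes, here is a minimal pure-NumPy sketch (not BayesPy code) of mean-field coordinate ascent for a toy conjugate model: data from a Gaussian with unknown mean and precision, with factorized posteriors q(mu) = N(m, v) and q(tau) = Gamma(a, b). The priors, variable names, and data are illustrative assumptions, not anything from BayesPy's API.

```python
import numpy as np

# Synthetic data: 500 points from N(mean=2.0, sd=0.5), i.e. precision 4.0.
rng = np.random.default_rng(0)
x = rng.normal(2.0, 0.5, size=500)
n = x.size

# Broad (illustrative) priors: mu ~ N(mu0, s0^2), tau ~ Gamma(a0, b0).
mu0, s0 = 0.0, 10.0
a0, b0 = 1e-3, 1e-3

# Initialise the variational factors q(mu) = N(m, v) and q(tau) = Gamma(a, b).
m, v = 0.0, 1.0
a, b = a0, b0

for _ in range(50):
    # Update q(mu) holding q(tau) fixed, using E[tau] = a / b.
    e_tau = a / b
    prec = 1.0 / s0**2 + n * e_tau
    m = (mu0 / s0**2 + e_tau * x.sum()) / prec
    v = 1.0 / prec
    # Update q(tau) holding q(mu) fixed, using E[mu] = m and Var[mu] = v.
    a = a0 + 0.5 * n
    b = b0 + 0.5 * (np.sum((x - m) ** 2) + n * v)

print(f"E[mu] = {m:.3f}, E[tau] = {a / b:.3f}")
```

BayesPy automates exactly this kind of update scheduling (and much more general models) via variational message passing, so the user only declares the model structure.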
The Matlab toolbox contains variants of probabilistic models for principal component analysis (PCA) in the presence of missing values. We have also implemented unregularized approaches, including least-squares and imputation algorithms. Some of the implemented algorithms scale to huge datasets such as the Netflix data. The core computations are implemented in C++ with the option of using multiple processors. The current version is available here.
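As an illustration of the imputation approach (not the toolbox's own code), the following NumPy sketch alternates between fitting a rank-k PCA by truncated SVD and refilling the missing entries with the reconstruction; the synthetic data, rank, and missing rate are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
# Low-rank ground truth plus small noise, with roughly 20% of entries missing.
n, d, k = 100, 8, 2
Y = rng.normal(size=(n, k)) @ rng.normal(size=(k, d)) + 0.01 * rng.normal(size=(n, d))
mask = rng.random((n, d)) < 0.8          # True where the entry is observed
X = np.where(mask, Y, np.nan)

# Imputation algorithm: fill missing values, fit rank-k PCA via truncated SVD,
# replace the missing entries with the reconstruction, and repeat.
Z = np.where(mask, X, np.nanmean(X, axis=0))   # initial fill: column means
for _ in range(100):
    mu = Z.mean(axis=0)
    U, s, Vt = np.linalg.svd(Z - mu, full_matrices=False)
    recon = mu + (U[:, :k] * s[:k]) @ Vt[:k]
    Z = np.where(mask, X, recon)         # keep observed entries, impute the rest

rmse = np.sqrt(np.mean((Z[~mask] - Y[~mask]) ** 2))
print(f"RMSE on missing entries: {rmse:.4f}")
```

The probabilistic PCA variants in the toolbox additionally regularize this procedure, which matters when the fraction of missing values is large.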
The Bayes Blocks library is a C++/Python implementation of the variational building block framework introduced in our papers. The framework allows easy learning of a wide variety of models using variational Bayesian learning. The source code of the base package can be downloaded as a gzipped tar file or a zip file.
A Windows installer of the Python package of Bayes Blocks 1.1.1 is available. In order to use this package, you need to install Python 2.5(.x) and NumPy.
The current version of the package is 1.1.1, released January 3, 2007. A full list of changes is also available.
Bayes Blocks is now hosted at PASCAL Forge, where you can find the latest development code in the Subversion repository as well as submit bug reports and patches.
We will provide extension packages for the basic library that contain the code used to run the simulations in our papers.
These packages have been updated to work with Bayes Blocks 1.x.
The Matlab codes for our nonlinear dynamical factor analysis (nonlinear state-space model) algorithm can be found here (gzipped tar file). The current version is 1.1, released Feb 12, 2010.
Here is a package for running the simulations with the Lorenz data reported in Valpola and Karhunen (2002).
Here are the additional files and data for the cart-pole control simulations reported in Raiko and Tornio (2005), to be used with NDFA 1.0 or later.
The new Matlab codes for our nonlinear factor analysis algorithm can be found here (gzipped tar file). The current version is 2.0, released Feb 12, 2010. This version includes speedups over the previous release but, because of changes to its internal structure, is not directly compatible with it.
The older Matlab codes for our nonlinear factor analysis algorithm can be found here (gzipped tar file) or here (zip file). The last version is 1.0, released Jul 23, 2004. A list of recent changes is available here. The package contains example code for a sample problem of a 3D helix, also featured in our ICA 2000 paper. There is also a version that supports missing values, based on v0.9.
The variational Bayesian mixture of Gaussians Matlab package (zip file) was released on Mar 16, 2010. It includes several methods for learning, including the natural conjugate gradient algorithm.
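The toolbox implements variational Bayesian learning (including the natural conjugate gradient algorithm); as a simpler point of reference, here is plain maximum-likelihood EM for a two-component one-dimensional Gaussian mixture in NumPy — deliberately not the toolbox's VB algorithm, with synthetic data and initial values chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
# Two well-separated components: N(-2, 0.5^2) and N(3, 1.0^2), mixed 40/60.
x = np.concatenate([rng.normal(-2.0, 0.5, 400), rng.normal(3.0, 1.0, 600)])

# Initial parameter guesses.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

for _ in range(100):
    # E step: responsibilities r[i, k] proportional to pi_k * N(x_i | mu_k, var_k),
    # computed in log space for numerical stability.
    d = x[:, None] - mu[None, :]
    log_p = np.log(pi) - 0.5 * np.log(2 * np.pi * var) - 0.5 * d**2 / var
    log_p -= log_p.max(axis=1, keepdims=True)
    r = np.exp(log_p)
    r /= r.sum(axis=1, keepdims=True)
    # M step: responsibility-weighted maximum-likelihood updates.
    nk = r.sum(axis=0)
    pi = nk / x.size
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk

order = np.argsort(mu)
print(mu[order], np.sqrt(var[order]), pi[order])
```

The VB treatment in the toolbox replaces these point estimates with posterior distributions over the parameters, which also penalizes unnecessary mixture components.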
Main entrance of the Computer Science and Engineering house where our department is located.