Machine Learning Group

Department of Computer Science

University of Copenhagen


Sigurdsgade 41

2200 København N



Office: 1.08





Exact Unsupervised Regularized Least-Squares Classification

An exact approach to the unsupervised regularized least-squares classification problem, implemented in Python using the NumPy and SciPy packages. The learning problem considered is a variant of the maximum margin clustering task, which can be regarded as the direct extension of support vector machines to unsupervised learning scenarios. The goal is to partition unlabeled data into two classes such that a subsequent application of a support vector machine would yield the overall best result (with respect to the optimization problem associated with support vector machines).
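To illustrate the underlying combinatorial problem (not the exact algorithm implemented in the package), the following sketch brute-forces the unsupervised regularized least-squares objective over all balanced labelings of a tiny dataset. It uses the standard closed form for the inner RLS problem, min_c ||y - Kc||² + λ cᵀKc = λ yᵀ(K + λI)⁻¹ y; the function names and the balance constraint are illustrative assumptions, not part of the released code.

```python
import itertools
import numpy as np

def urls_objective(K, y, lam):
    """Optimal RLS objective for a fixed labeling y:
    min over c of ||y - K c||^2 + lam * c^T K c, which equals
    lam * y^T (K + lam*I)^{-1} y in closed form."""
    n = K.shape[0]
    return lam * y @ np.linalg.solve(K + lam * np.eye(n), y)

def exhaustive_mmc(K, lam=1.0, min_per_class=1):
    """Brute-force search over all labelings (exponential in n; tiny data only).
    Illustrates the combinatorial nature of the problem, NOT the polynomial
    fixed-rank algorithm provided by the package."""
    n = K.shape[0]
    best_y, best_val = None, np.inf
    for bits in itertools.product([-1.0, 1.0], repeat=n):
        y = np.array(bits)
        pos = int(np.sum(y == 1.0))
        if pos < min_per_class or n - pos < min_per_class:
            continue  # balance constraint rules out the trivial one-class labeling
        val = urls_objective(K, y, lam)
        if val < best_val:
            best_y, best_val = y, val
    return best_y, best_val

# Two well-separated point clouds and a linear kernel.
X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.1, 4.9]])
K = X @ X.T
y, val = exhaustive_mmc(K, lam=1.0, min_per_class=2)
# y assigns the first two and the last two points to opposite classes
```

The exhaustive search makes the conceptual goal concrete: among all candidate partitions, pick the one on which the subsequent (least-squares) support vector machine attains the smallest objective value.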



While very appealing from a conceptual point of view, the combinatorial nature of the induced optimization problem renders a direct application of this concept difficult. The prototype implementation provided below exhibits a polynomial runtime in the number of patterns and can be applied to arbitrary (linear and non-linear) kernel matrices of fixed rank 2.
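A rank-2 kernel matrix can be obtained, for instance, by truncating the eigendecomposition of a full kernel matrix to its two leading eigenpairs. The following is a minimal sketch of such a preprocessing step (the function name is an assumption for illustration, not part of the released package):

```python
import numpy as np

def rank2_approximation(K):
    """Project a symmetric PSD kernel matrix onto its two leading eigenpairs,
    yielding the best rank-2 approximation in Frobenius norm."""
    w, V = np.linalg.eigh(K)           # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:2]      # indices of the two largest eigenvalues
    return (V[:, idx] * w[idx]) @ V[:, idx].T

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))
# RBF (Gaussian) kernel matrix on six points
sq_dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2) ** 2
K = np.exp(-0.5 * sq_dists)
K2 = rank2_approximation(K)            # symmetric, rank at most 2
```

Whether such a truncation is an acceptable approximation depends on the spectrum of the kernel at hand; for linear kernels on two-dimensional data the rank-2 condition holds exactly.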


Source Code


The code is published under the MIT license. If you use the code for scientific work, please cite us using the reference below. The source code can be downloaded here.




Version 0.1 (November 2013)




  1. Fabian Gieseke, Tapio Pahikkala, and Christian Igel. Polynomial Runtime Bounds for Fixed-Rank Unsupervised Least-Squares Classification. In Proceedings of the 5th Asian Conference on Machine Learning (ACML). 2013, 62-71.