Software & Data Downloads — SSTL

Semi-Supervised Transfer Learning (SSTL) allows the use of unlabeled data to improve transfer learning.

Successful state-of-the-art machine learning techniques rely on large, well-sampled, labeled datasets. Today it is easy to obtain a finely sampled dataset because of the decreasing cost of connected low-energy devices; it is often difficult, however, to obtain a large number of labels. The reason is twofold. First, labels are often provided by people, whose attention spans are limited. Second, even a person who could label perpetually would need to be shown data from a large variety of conditions. One approach to these problems is to combine labeled data collected in different sessions through transfer learning, but even this approach suffers from dataset limitations.

This code allows the use of unlabeled data to improve transfer learning in the case where the training and testing datasets are drawn from similar probability distributions, and the unlabeled data in each dataset can be described by similar underlying manifolds. The code implements a distribution-free, kernel- and graph-Laplacian-based approach that minimizes empirical risk in the appropriate reproducing kernel Hilbert space. The approach was published at the 2018 IEEE Data Science Workshop in a paper titled "Semi-Supervised Transfer Learning Using Marginal Predictors".
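To illustrate the general idea, the following is a minimal sketch of Laplacian-regularized least squares (manifold regularization), a standard kernel- and graph-Laplacian-based method in the same family as the paper's approach. It is not the released code or its API; the function names, parameters, and the simple fully-connected kernel graph are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Y."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def laplacian_rls(X_lab, y_lab, X_unlab, lam=1e-3, mu=1e-3, gamma=1.0):
    """Laplacian-regularized least squares (illustrative sketch).

    Minimizes empirical risk in an RKHS with a graph-Laplacian penalty
    built from labeled *and* unlabeled points:
        (1/l) sum_i (f(x_i) - y_i)^2 + lam ||f||_K^2 + mu f^T L f
    Returns a predictor f(Xnew).
    """
    X = np.vstack([X_lab, X_unlab])
    l, n = len(X_lab), len(X_lab) + len(X_unlab)
    K = rbf_kernel(X, X, gamma)           # Gram matrix over all points
    L = np.diag(K.sum(axis=1)) - K        # graph Laplacian from kernel weights
    J = np.zeros((n, n))
    J[:l, :l] = np.eye(l)                 # selects the labeled points
    y = np.concatenate([np.asarray(y_lab, float), np.zeros(n - l)])
    # Representer theorem: f(.) = sum_j alpha_j K(., x_j)
    alpha = np.linalg.solve(J @ K + lam * l * np.eye(n) + mu * l * (L @ K), y)
    return lambda Xnew: rbf_kernel(np.asarray(Xnew, float), X, gamma) @ alpha
```

The unlabeled points never contribute to the squared loss (the selector matrix `J` zeroes them out), but they shape the Laplacian term, which pushes the learned function to vary smoothly along the data manifold.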

  •  Deshmukh, A., Laftchiev, E., "Semi-Supervised Transfer Learning Using Marginal Predictors", IEEE Data Science Workshop, DOI: 10.1109/DSW.2018.8439908, June 6, 2018, pp. 160-164. (TR2018-040)

    @inproceedings{Deshmukh2018jun,
      author = {Deshmukh, Aniket and Laftchiev, Emil},
      title = {Semi-Supervised Transfer Learning Using Marginal Predictors},
      booktitle = {IEEE Data Science Workshop},
      year = 2018,
      pages = {160--164},
      month = jun,
      doi = {10.1109/DSW.2018.8439908},
      url = {}
    }

Access software at