An Introduction to Semi-Supervised Learning
Semi-supervised learning
Semi-supervised learning is a training approach that combines a small amount of labelled data with a much larger amount of unlabelled data to build a classifier. It is useful when labelling large amounts of data is expensive or infeasible while unlabelled data is readily available. For example, semantic segmentation of satellite images requires every pixel to be labelled; here, semi-supervised methods have been shown to improve segmentation accuracy by exploiting unlabelled images alongside a limited number of labelled ones.
Semi-supervised learning rests on a few key assumptions:
- Smoothness/continuity assumption - If two points x1 and x2 are close in a high-density region, their respective labels y1 and y2 should also be close.
- Cluster assumption - The decision boundary should be in a low-density region.
- Manifold assumption - The high-dimensional data lie on a low-dimensional manifold.
In addition, two learning settings are commonly distinguished:
- Transductive learning predicts the class labels of the given unlabelled data only.
- Inductive learning learns the mapping between inputs and target labels, so that it generalizes to unseen data.
Common semi-supervised learning methods include consistency regularization, pseudo-labelling, and graph-based methods.
Consistency Regularization
This is a common technique in semi-supervised learning, also called consistency training. It expects that when a small perturbation is applied to an unlabelled data point, the model's prediction should not change significantly.
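As a concrete illustration, here is a minimal PyTorch sketch of such a consistency term, in the spirit of the Pi-model: the perturbation is additive Gaussian input noise and the penalty is the mean squared difference between the two softmax outputs. The noise level and loss choice are illustrative assumptions, not a fixed recipe.

```python
import torch
import torch.nn.functional as F

def consistency_loss(model, x_unlabelled, noise_std=0.1):
    # Predictions on the clean inputs serve as the target;
    # no gradient flows through them.
    with torch.no_grad():
        target = F.softmax(model(x_unlabelled), dim=1)
    # Apply a small perturbation (here additive Gaussian noise;
    # any data augmentation could be used instead).
    x_perturbed = x_unlabelled + noise_std * torch.randn_like(x_unlabelled)
    pred = F.softmax(model(x_perturbed), dim=1)
    # Penalize any change in the prediction under the perturbation.
    return F.mse_loss(pred, target)
```

In training, this term is typically added to the usual supervised loss on the labelled batch, weighted by a coefficient that is often ramped up over the early epochs.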
Pseudo-Labelling
This method uses a model trained on the labelled data to predict targets for the unlabelled data; the resulting pseudo-labelled samples are then used as additional training samples.
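The sketch below shows one round of this idea with scikit-learn; the logistic-regression classifier, the 0.95 confidence threshold, and the single retraining pass are illustrative assumptions (in practice the process is often repeated, or applied per batch with deep models).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pseudo_label_round(X_lab, y_lab, X_unlab, threshold=0.95):
    # Train an initial model on the labelled data only.
    model = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)
    # Predict on the unlabelled data and keep only confident predictions.
    probs = model.predict_proba(X_unlab)
    keep = probs.max(axis=1) >= threshold
    pseudo_y = probs.argmax(axis=1)[keep]
    # Retrain on the union of real and pseudo-labelled samples.
    X_aug = np.vstack([X_lab, X_unlab[keep]])
    y_aug = np.concatenate([y_lab, pseudo_y])
    return LogisticRegression(max_iter=1000).fit(X_aug, y_aug)
```

The confidence threshold is the key design choice: too low and wrong pseudo-labels pollute the training set; too high and very little unlabelled data is used.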
Graph-based methods
In this class of methods, the labelled and unlabelled data points are treated as nodes of a graph, and labels are propagated from the labelled nodes to the unlabelled nodes according to the similarity between nodes.
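As an example, scikit-learn's LabelSpreading implements this idea; the toy run below on the two-moons dataset is only a sketch (the dataset, RBF kernel, and hyperparameters are chosen for illustration). It is also a transductive setting: the model directly assigns labels to the given unlabelled points.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.semi_supervised import LabelSpreading

# Toy dataset: two interleaving half-moon clusters.
X, y_true = make_moons(n_samples=200, noise=0.05, random_state=0)

# Hide all but three labels per class; -1 marks a point as unlabelled.
y = np.full_like(y_true, -1)
rng = np.random.default_rng(0)
for c in (0, 1):
    idx = rng.choice(np.flatnonzero(y_true == c), size=3, replace=False)
    y[idx] = c

# Labels propagate along the RBF-similarity graph from the labelled
# nodes to their unlabelled neighbours.
model = LabelSpreading(kernel="rbf", gamma=20).fit(X, y)
accuracy = (model.transduction_ == y_true).mean()
print(f"Transductive accuracy with 6 labels: {accuracy:.2f}")
```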