Author Zhu, Xiaojin
Title Introduction to semi-supervised learning [electronic resource] / Xiaojin Zhu and Andrew B. Goldberg
Imprint San Rafael, Calif. (1537 Fourth Street, San Rafael, CA 94901 USA) : Morgan & Claypool Publishers, c2009
Descript 1 electronic text (xi, 116 p. : ill.) : digital file
Series Synthesis lectures on artificial intelligence and machine learning, 1939-4616 ; # 6
Synthesis lectures on artificial intelligence and machine learning (Online), 1939-4616 ; # 6
Note Part of: Synthesis digital library of engineering and computer science
Title from PDF t.p. (viewed on July 8, 2009)
Series from website
Includes bibliographical references (p. 95-112) and index
Introduction to statistical machine learning -- The data -- Unsupervised learning -- Supervised learning -- Overview of semi-supervised learning -- Learning from both labeled and unlabeled data -- How is semi-supervised learning possible -- Inductive vs. transductive semi-supervised learning -- Caveats -- Self-training models -- Mixture models and EM -- Mixture models for supervised classification -- Mixture models for semi-supervised classification -- Optimization with the EM algorithm -- The assumptions of mixture models -- Other issues in generative models -- Cluster-then-label methods -- Co-training -- Two views of an instance -- Co-training -- The assumptions of co-training -- Multiview learning -- Graph-based semi-supervised learning -- Unlabeled data as stepping stones -- The graph -- Mincut -- Harmonic function -- Manifold regularization -- The assumption of graph-based methods -- Semi-supervised support vector machines -- Support vector machines -- Semi-supervised support vector machines -- Entropy regularization -- The assumption of S3VMS and entropy regularization -- Human semi-supervised learning -- From machine learning to cognitive science -- Study one: humans learn from unlabeled test data -- Study two: presence of human semi-supervised learning in a simple task -- Study three: absence of human semi-supervised learning in a complex task -- Discussions -- Theory and outlook -- A simple PAC bound for supervised learning -- A simple PAC bound for semi-supervised learning -- Future directions of semi-supervised learning -- Basic mathematical reference -- Semi-supervised learning software -- Symbols -- Biography
Abstract freely available; full-text restricted to subscribers or individual document purchasers
Mode of access: World Wide Web
System requirements: Adobe Acrobat reader
Semi-supervised learning is a learning paradigm concerned with the study of how computers and natural systems such as humans learn in the presence of both labeled and unlabeled data. Traditionally, learning has been studied either in the unsupervised paradigm (e.g., clustering, outlier detection) where all the data is unlabeled, or in the supervised paradigm (e.g., classification, regression) where all the data is labeled. The goal of semi-supervised learning is to understand how combining labeled and unlabeled data may change the learning behavior, and to design algorithms that take advantage of such a combination. Semi-supervised learning is of great interest in machine learning and data mining because it can use readily available unlabeled data to improve supervised learning tasks when the labeled data is scarce or expensive. Semi-supervised learning also shows potential as a quantitative tool to understand human category learning, where most of the input is self-evidently unlabeled. In this introductory book, we present some popular semi-supervised learning models, including self-training, mixture models, co-training and multiview learning, graph-based methods, and semi-supervised support vector machines. For each model, we discuss its basic mathematical formulation. The success of semi-supervised learning depends critically on some underlying assumptions. We emphasize the assumptions made by each model and give counterexamples when appropriate to demonstrate the limitations of the different models. In addition, we discuss semi-supervised learning for cognitive psychology. Finally, we give a computational learning theoretic perspective on semi-supervised learning, and we conclude the book with a brief discussion of open questions in the field.
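Self-training, the first model listed in the contents above, illustrates the paradigm in a few lines: train on the labeled data, pseudo-label the unlabeled points the model is confident about, and retrain. The sketch below is a minimal illustration only, not the book's specific formulation; the synthetic 1-D data, the nearest-centroid classifier, and the `conf_margin` confidence rule are all assumptions chosen to keep it self-contained.

```python
# Minimal self-training sketch on synthetic 1-D data.
# Classifier: nearest centroid (for illustration only).

def centroid(points):
    """Mean of a list of 1-D points."""
    return sum(points) / len(points)

def self_train(labeled, unlabeled, conf_margin=1.0, max_iter=10):
    """labeled: dict {0: [...], 1: [...]} of labeled points;
    unlabeled: list of points.
    Repeat: fit class centroids, pseudo-label each unlabeled point
    whose gap between the two centroid distances exceeds conf_margin
    (a confident prediction), add it to the labeled set, and refit."""
    pool = list(unlabeled)
    for _ in range(max_iter):
        c0, c1 = centroid(labeled[0]), centroid(labeled[1])
        confident = []
        for x in pool:
            d0, d1 = abs(x - c0), abs(x - c1)
            if abs(d0 - d1) >= conf_margin:  # confident pseudo-label
                labeled[0 if d0 < d1 else 1].append(x)
                confident.append(x)
        if not confident:  # no point passed the confidence test; stop
            break
        pool = [x for x in pool if x not in confident]
    return centroid(labeled[0]), centroid(labeled[1])

# Two labeled points per class, plus unlabeled points filling the clusters.
labeled = {0: [0.0, 1.0], 1: [9.0, 10.0]}
unlabeled = [0.5, 1.5, 2.0, 8.0, 8.5, 9.5]
c0, c1 = self_train(labeled, unlabeled)
```

After one pass, every unlabeled point here is confidently absorbed and the centroids shift toward the true cluster means, which is exactly the behavior (and, when the confidence rule errs, the failure mode) the book analyzes.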
Also available in print
Subject Supervised learning (Machine learning)
Support vector machines
Semi-supervised learning
Transductive learning
Gaussian mixture model
Expectation maximization (EM)
Multiview learning
Harmonic function
Label propagation
Manifold regularization
Semi-supervised support vector machines (S3VM)
Transductive support vector machines (TSVM)
Entropy regularization
Human semi-supervised learning
Alt Author Goldberg, Andrew B
Vari Title Synthesis digital library of engineering and computer science