Author Rokach, Lior
Title Pattern Classification Using Ensemble Methods
Imprint Singapore : World Scientific Publishing Co Pte Ltd, 2009
©2009
Descript 1 online resource (242 pages)
text txt rdacontent
computer c rdamedia
online resource cr rdacarrier
Series Series in Machine Perception and Artificial Intelligence ; v.75
Series in Machine Perception and Artificial Intelligence
Note Intro -- Contents -- Preface -- 1. Introduction to Pattern Classification -- 1.1 Pattern Classification -- 1.2 Induction Algorithms -- 1.3 Rule Induction -- 1.4 Decision Trees -- 1.5 Bayesian Methods -- 1.5.1 Overview -- 1.5.2 Naïve Bayes -- 1.5.2.1 The Basic Naïve Bayes Classifier -- 1.5.2.2 Naïve Bayes Induction for Numeric Attributes -- 1.5.2.3 Correction to the Probability Estimation -- 1.5.2.4 Laplace Correction -- 1.5.2.5 No Match -- 1.5.3 Other Bayesian Methods -- 1.6 Other Induction Methods -- 1.6.1 Neural Networks -- 1.6.2 Genetic Algorithms -- 1.6.3 Instance-based Learning -- 1.6.4 Support Vector Machines -- 2. Introduction to Ensemble Learning -- 2.1 Back to the Roots -- 2.2 The Wisdom of Crowds -- 2.3 The Bagging Algorithm -- 2.4 The Boosting Algorithm -- 2.5 The AdaBoost Algorithm -- 2.6 No Free Lunch Theorem and Ensemble Learning -- 2.7 Bias-Variance Decomposition and Ensemble Learning -- 2.8 Occam's Razor and Ensemble Learning -- 2.9 Classifier Dependency -- 2.9.1 Dependent Methods -- 2.9.1.1 Model-guided Instance Selection -- 2.9.1.2 Basic Boosting Algorithms -- 2.9.1.3 Advanced Boosting Algorithms -- 2.9.1.4 Incremental Batch Learning -- 2.9.2 Independent Methods -- 2.9.2.1 Bagging -- 2.9.2.2 Wagging -- 2.9.2.3 Random Forest and Random Subspace Projection -- 2.9.2.4 Non-Linear Boosting Projection (NLBP) -- 2.9.2.5 Cross-validated Committees -- 2.9.2.6 Robust Boosting -- 2.10 Ensemble Methods for Advanced Classification Tasks -- 2.10.1 Cost-Sensitive Classification -- 2.10.2 Ensemble for Learning Concept Drift -- 2.10.3 Reject Driven Classification -- 3. Ensemble Classification -- 3.1 Fusion Methods -- 3.1.1 Weighting Methods -- 3.1.2 Majority Voting -- 3.1.3 Performance Weighting -- 3.1.4 Distribution Summation -- 3.1.5 Bayesian Combination -- 3.1.6 Dempster-Shafer -- 3.1.7 Vogging -- 3.1.8 Naïve Bayes -- 3.1.9 Entropy Weighting
3.1.10 Density-based Weighting -- 3.1.11 DEA Weighting Method -- 3.1.12 Logarithmic Opinion Pool -- 3.1.13 Order Statistics -- 3.2 Selecting Classification -- 3.2.1 Partitioning the Instance Space -- 3.2.1.1 The K-Means Algorithm as a Decomposition Tool -- 3.2.1.2 Determining the Number of Subsets -- 3.2.1.3 The Basic K-Classifier Algorithm -- 3.2.1.4 The Heterogeneity Detecting K-Classifier (HDK-Classifier) -- 3.2.1.5 Running-Time Complexity -- 3.3 Mixture of Experts and Meta Learning -- 3.3.1 Stacking -- 3.3.2 Arbiter Trees -- 3.3.3 Combiner Trees -- 3.3.4 Grading -- 3.3.5 Gating Network -- 4. Ensemble Diversity -- 4.1 Overview -- 4.2 Manipulating the Inducer -- 4.2.1 Manipulation of the Inducer's Parameters -- 4.2.2 Starting Point in Hypothesis Space -- 4.2.3 Hypothesis Space Traversal -- 4.3 Manipulating the Training Samples -- 4.3.1 Resampling -- 4.3.2 Creation -- 4.3.3 Partitioning -- 4.4 Manipulating the Target Attribute Representation -- 4.4.1 Label Switching -- 4.5 Partitioning the Search Space -- 4.5.1 Divide and Conquer -- 4.5.2 Feature Subset-based Ensemble Methods -- 4.5.2.1 Random-based Strategy -- 4.5.2.2 Reduct-based Strategy -- 4.5.2.3 Collective-Performance-based Strategy -- 4.5.2.4 Feature Set Partitioning -- 4.5.2.5 Rotation Forest -- 4.6 Multi-Inducers -- 4.7 Measuring the Diversity -- 5. Ensemble Selection -- 5.1 Ensemble Selection -- 5.2 Pre-Selection of the Ensemble Size -- 5.3 Selection of the Ensemble Size While Training -- 5.4 Pruning - Post Selection of the Ensemble Size -- 5.4.1 Ranking-based -- 5.4.2 Search-based Methods -- 5.4.2.1 Collective Agreement-based Ensemble Pruning Method -- 5.4.3 Clustering-based Methods -- 5.4.4 Pruning Timing -- 5.4.4.1 Pre-combining Pruning -- 5.4.4.2 Post-combining Pruning -- 6. Error Correcting Output Codes -- 6.1 Code-matrix Decomposition of Multiclass Problems
6.2 Type I - Training an Ensemble Given a Code-Matrix -- 6.2.1 Error correcting output codes -- 6.2.2 Code-Matrix Framework -- 6.2.3 Code-matrix Design Problem -- 6.2.4 Orthogonal Arrays (OA) -- 6.2.5 Hadamard Matrix -- 6.2.6 Probabilistic Error Correcting Output Code -- 6.2.7 Other ECOC Strategies -- 6.3 Type II - Adapting Code-matrices to the Multiclass Problems -- 7. Evaluating Ensembles of Classifiers -- 7.1 Generalization Error -- 7.1.1 Theoretical Estimation of Generalization Error -- 7.1.2 Empirical Estimation of Generalization Error -- 7.1.3 Alternatives to the Accuracy Measure -- 7.1.4 The F-Measure -- 7.1.5 Confusion Matrix -- 7.1.6 Classifier Evaluation under Limited Resources -- 7.1.6.1 ROC Curves -- 7.1.6.2 Hit Rate Curve -- 7.1.6.3 Qrecall (Quota Recall) -- 7.1.6.4 Lift Curve -- 7.1.6.5 Pearson Correlation Coefficient -- 7.1.6.6 Area Under Curve (AUC) -- 7.1.6.7 Average Hit Rate -- 7.1.6.8 Average Qrecall -- 7.1.6.9 Potential Extract Measure (PEM) -- 7.1.7 Statistical Tests for Comparing Ensembles -- 7.1.7.1 McNemar's Test -- 7.1.7.2 A Test for the Difference of Two Proportions -- 7.1.7.3 The Resampled Paired t Test -- 7.1.7.4 The k-fold Cross-validated Paired t Test -- 7.2 Computational Complexity -- 7.3 Interpretability of the Resulting Ensemble -- 7.4 Scalability to Large Datasets -- 7.5 Robustness -- 7.6 Stability -- 7.7 Flexibility -- 7.8 Usability -- 7.9 Software Availability -- 7.10 Which Ensemble Method Should be Used? -- Bibliography -- Index
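The contents above cover bagging (2.3) and majority voting (3.1.2). As a rough illustration of how those two pieces fit together, here is a minimal sketch, not taken from the book: the 1-D threshold-stump learner, the toy dataset, and all names are hypothetical, chosen only to keep the example self-contained.

```python
# Illustrative sketch of bagging with majority voting (cf. contents 2.3, 3.1.2).
# The stump learner and dataset below are invented for this example only.
import random
from collections import Counter

def train_stump(data):
    """Fit a 1-D threshold stump: try each distinct x as a threshold, in both
    orientations, and keep the split with the fewest training errors."""
    best = None
    for t in sorted({x for x, _ in data}):
        for left, right in ((0, 1), (1, 0)):
            errs = sum(1 for x, y in data if (left if x < t else right) != y)
            if best is None or errs < best[0]:
                best = (errs, t, left, right)
    _, t, left, right = best
    return lambda x: left if x < t else right

def bagging(data, n_models=11, seed=0):
    """Train n_models stumps, each on a bootstrap resample of the training
    set, and combine them by unweighted majority vote."""
    rng = random.Random(seed)
    models = [train_stump([rng.choice(data) for _ in data])
              for _ in range(n_models)]
    def predict(x):
        votes = Counter(m(x) for m in models)
        return votes.most_common(1)[0][0]
    return predict

# Toy 1-D training set: label is 1 for x >= 5, with one flipped (noisy) point.
data = [(x, int(x >= 5)) for x in range(10)]
data[3] = (3, 1)  # label noise
clf = bagging(data)
print(clf(1), clf(8))
```

The vote over many bootstrap-trained stumps smooths out the single noisy label, which is the variance-reduction argument the bagging literature makes; the book's treatment (and its boosting variants, which reweight instances instead of resampling uniformly) is more general.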
Key Features: Provides state-of-the-art reviews of the most significant ensemble methods; Describes a new unified taxonomy to categorize ensemble methods; Highlights detailed pseudo-code and illustration of the most popular algorithms
Description based on publisher supplied metadata and other sources
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2020. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries
Link Print version: Rokach, Lior Pattern Classification Using Ensemble Methods Singapore : World Scientific Publishing Co Pte Ltd, c2009 9789814271066
Subject Pattern recognition systems ; Algorithms ; Machine learning
Electronic books