A Novel AdaBoost Framework With Robust Threshold and Structural Optimization

IEEE Trans Cybern. 2018 Jan;48(1):64-76. doi: 10.1109/TCYB.2016.2623900. Epub 2016 Nov 24.

Abstract

The AdaBoost algorithm is a popular ensemble method that combines several weak learners to boost generalization performance. However, conventional AdaBoost.RT algorithms suffer from the limitation that the threshold value must be specified manually rather than chosen through a self-adaptive mechanism, which cannot guarantee an optimal model in the general case. In this paper, we present a generic AdaBoost framework with a robust threshold mechanism and structural optimization for regression problems. The error statistics of each weak learner on a given dataset are used to automate the choice of the optimal cut-off threshold value. In addition, a special single-layer neural network provides a second opportunity to further adjust the structure and strengthen the adaptive capability of the AdaBoost regression model. Moreover, to consolidate the theoretical foundation of AdaBoost algorithms, we are the first to conduct a rigorous and comprehensive theoretical analysis of the proposed approach. We prove that the general bound on the empirical error over a fraction of the training examples always stays within a limited soft margin, which indicates that our algorithm can avoid overfitting. We further analyze bounds on the generalization error directly under the probably approximately correct (PAC) learning framework. Extensive experiments on UCI benchmarks demonstrate that the proposed method outperforms other state-of-the-art ensemble and single learning algorithms. Furthermore, a real-world indoor positioning application shows that the proposed method achieves higher positioning accuracy and faster speed.
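
To make the threshold mechanism concrete, the following Python sketch shows a standard AdaBoost.RT loop in which the cut-off threshold is derived from each weak learner's error statistics rather than fixed by hand. This is an illustration only: the abstract does not specify the paper's exact self-adaptive rule, so the weighted median of the absolute relative errors is used here as an assumed stand-in, a shallow DecisionTreeRegressor is an assumed weak learner, and the paper's neural-network structural-optimization stage is omitted.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def adaboost_rt_adaptive(X, y, n_rounds=10, power=2):
    """AdaBoost.RT with a threshold chosen from error statistics.

    Illustrative sketch: the cut-off threshold phi is set each round
    to the weighted median of the learner's absolute relative errors
    (an assumption; the paper's precise statistic is not given in the
    abstract).
    """
    n = len(y)
    w = np.full(n, 1.0 / n)          # example weights
    learners, betas = [], []

    for _ in range(n_rounds):
        # Fit a weak learner on the weighted sample.
        learner = DecisionTreeRegressor(max_depth=3)
        learner.fit(X, y, sample_weight=w)
        pred = learner.predict(X)

        # Absolute relative error of each example (guard against y == 0).
        are = np.abs((pred - y) / np.where(y == 0, 1e-12, y))

        # Self-adaptive cut-off: weighted median of the error distribution.
        order = np.argsort(are)
        cum = np.cumsum(w[order])
        phi = are[order][np.searchsorted(cum, 0.5)]

        # Weighted error rate: total weight of examples beyond the threshold.
        miss = are > phi
        eps = w[miss].sum()
        if eps <= 0 or eps >= 1:
            break
        beta = eps ** power

        # Down-weight well-predicted examples, then renormalize.
        w = np.where(miss, w, w * beta)
        w /= w.sum()

        learners.append(learner)
        betas.append(beta)

    def predict(X_new):
        # Combine the weak learners by a log(1/beta)-weighted average.
        coef = np.log(1.0 / np.array(betas))
        preds = np.array([m.predict(X_new) for m in learners])
        return coef @ preds / coef.sum()

    return predict

Because phi is recomputed from the current learner's error distribution every round, the demarcation between "correct" and "incorrect" predictions tracks the data rather than a manually tuned constant, which is the behavior the abstract's robust threshold mechanism targets.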