R. Polikar, Bootstrap Inspired Techniques in Computational Intelligence, IEEE Signal Processing Magazine, vol.24, issue.4, pp.59-72, 2007.

B. Efron, Bootstrap Methods: Another Look at the Jackknife, The Annals of Statistics, vol.7, issue.1, pp.1-26, 1979.
DOI : 10.1214/aos/1176344552

R. E. Schapire, The Strength of Weak Learnability, Machine Learning, vol.5, issue.2, pp.197-227, 1990.

Y. Freund and R. E. Schapire, A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting, Journal of Computer and System Sciences, vol.55, issue.1, pp.119-139, 1997.
DOI : 10.1006/jcss.1997.1504

E. Zio, Computational Methods for Reliability and Risk Analysis, World Scientific, vol.14, 2009.
DOI : 10.1142/7190

P. J. Boland, Majority Systems and the Condorcet Jury Theorem, The Statistician, vol.38, issue.3, pp.181-189, 1989.

B. Yuan and G. Klir, Data-Driven Identification of Key Variables, Intelligent Hybrid Systems Fuzzy Logic, Neural Network, and Genetic Algorithms, pp.161-187, 1997.
DOI : 10.1007/978-1-4615-6191-0_7

E. Zio and P. Baraldi, Identification of nuclear transients via optimized fuzzy clustering, Annals of Nuclear Energy, vol.32, issue.10, pp.1068-1080, 2005.
DOI : 10.1016/j.anucene.2005.02.012

T. K. Ho, The Random Subspace Method for Constructing Decision Forests, IEEE Trans. Pattern Anal. Machine Intell., vol.20, issue.8, pp.832-844, 1998.

J. Depasquale and R. Polikar, Random Feature Subset Selection for Ensemble Based Classification of Data with Missing Features, Lecture Notes in Computer Science, vol.4472, pp.251-260, 2007.
DOI : 10.1007/978-3-540-72523-7_26

L. I. Kuncheva, Combining Pattern Classifiers: Methods and Algorithms, John Wiley & Sons, 2004.

F. Tang, Grinding Wheel Condition Monitoring With Boosted Classifiers, 2006.