AdaBoost Parallelization on PC Clusters with Virtual Shared Memory for Fast Feature Selection

Abstract: Feature selection is a key issue in many machine learning applications, where the need to test large numbers of candidate features is real but the computational time required to do so is often prohibitive. In this paper, we introduce a parallel version of the well-known AdaBoost algorithm to speed up and size up feature selection for binary classification tasks using large training datasets and a wide range of elementary features. The parallelization requires no modification to the AdaBoost algorithm itself and is designed for PC clusters using Java and the JavaSpace distributed framework. JavaSpace is a memory-sharing paradigm implemented on top of a virtual shared memory that proves both efficient and easy to use. Results and performance of a face detection system trained with the proposed parallel AdaBoost are presented.
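The abstract describes a master/worker decomposition of the feature-selection step over a JavaSpace. As an illustration only, the sketch below shows how such a round could be coordinated through the standard JavaSpaces (Jini) API: the master writes one task entry per candidate feature, workers take tasks, evaluate the weak classifier locally, and write result entries back. The entry classes FeatureTask and FeatureResult and the evaluation stub are hypothetical names chosen for this sketch, not taken from the paper, and obtaining the JavaSpace reference via the Jini lookup service is omitted.

```java
import net.jini.core.entry.Entry;
import net.jini.core.lease.Lease;
import net.jini.space.JavaSpace;

// Task posted by the master: evaluate one candidate feature in the
// current boosting round against the current example weights.
class FeatureTask implements Entry {
    public Integer round;        // boosting iteration
    public Integer featureIndex; // candidate feature to evaluate
    public FeatureTask() {}      // JavaSpaces entries need a public no-arg constructor
    public FeatureTask(Integer round, Integer featureIndex) {
        this.round = round;
        this.featureIndex = featureIndex;
    }
}

// Result written back by a worker: weighted error of the best weak
// classifier found for that feature.
class FeatureResult implements Entry {
    public Integer round;
    public Integer featureIndex;
    public Double weightedError;
    public FeatureResult() {}
}

class Worker implements Runnable {
    private final JavaSpace space;
    Worker(JavaSpace space) { this.space = space; }

    public void run() {
        try {
            while (true) {
                // Blocking take: a template with null fields matches any pending task.
                FeatureTask task =
                    (FeatureTask) space.take(new FeatureTask(), null, Long.MAX_VALUE);
                FeatureResult r = new FeatureResult();
                r.round = task.round;
                r.featureIndex = task.featureIndex;
                // ... compute the weighted classification error of the best weak
                // classifier for this feature on the (shared) weighted training set ...
                r.weightedError = 0.0; // placeholder
                space.write(r, null, Lease.FOREVER);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

class Master {
    // One boosting round: scatter tasks, gather results, keep the best feature.
    static FeatureResult selectBestFeature(JavaSpace space, int round, int numFeatures)
            throws Exception {
        for (int f = 0; f < numFeatures; f++) {
            space.write(new FeatureTask(round, f), null, Lease.FOREVER);
        }
        FeatureResult template = new FeatureResult();
        template.round = round; // only match results from this round
        FeatureResult best = null;
        for (int f = 0; f < numFeatures; f++) {
            FeatureResult r = (FeatureResult) space.take(template, null, Long.MAX_VALUE);
            if (best == null || r.weightedError < best.weightedError) best = r;
        }
        return best;
    }
}
```

Because tasks and results flow through the shared space rather than point-to-point messages, workers can join or leave between rounds without the master tracking them, which is one reason a virtual-shared-memory approach can be easier to use than explicit message passing for this kind of embarrassingly parallel feature evaluation.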
Document type:
Conference paper
IEEE International Conference on Signal Processing and Communication, Nov 2007, Dubai, United Arab Emirates, pp. 165-168, 2007

Cited literature: [8 references]

https://hal-supelec.archives-ouvertes.fr/hal-00216041
Contributor: Sébastien Van Luchene
Submitted on: Friday, 25 January 2008 - 21:48:59
Last modified on: Thursday, 29 March 2018 - 11:06:04
Long-term archiving on: Thursday, 15 April 2010 - 16:07:25

File

Supelec334.pdf
Publisher files allowed on an open archive

Identifiers

  • HAL Id: hal-00216041, version 1

Citation

Virginie Galtier, Olivier Pietquin, Stéphane Vialle. AdaBoost Parallelization on PC Clusters with Virtual Shared Memory for Fast Feature Selection. IEEE International Conference on Signal Processing and Communication, Nov 2007, Dubai, United Arab Emirates, pp. 165-168, 2007. 〈hal-00216041〉
