Ensembling Classifiers – An Application to Image Data Classification from Cherenkov Telescope Experiment

Authors: Praveen Boinee, Alessandro De Angelis, Gian Luca Foresti

Abstract:

Ensemble learning algorithms such as AdaBoost and Bagging have been actively researched and have been shown to improve classification results on several benchmark data sets, mainly with decision trees as their base classifiers. In this paper we apply these meta-learning techniques to classifiers such as random forests, neural networks and support vector machines. The data sets come from MAGIC, a Cherenkov telescope experiment. The task is to separate gamma signals from the overwhelmingly more frequent hadron and muon signals, a rare-class classification problem. We compare the individual classifiers with their ensemble counterparts and discuss the results. The experiments were carried out with WEKA, a machine learning toolkit.
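As a concrete illustration of the setup the abstract describes, the sketch below builds Bagging and AdaBoost.M1 ensembles around an SVM base classifier using WEKA's Java API and compares them with ten-fold cross-validation. The data file name magic04.arff, the ensemble sizes, and the random seed are illustrative assumptions, not details taken from the paper.

import java.util.Random;

import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.functions.SMO;
import weka.classifiers.meta.AdaBoostM1;
import weka.classifiers.meta.Bagging;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class EnsembleExperiment {
    public static void main(String[] args) throws Exception {
        // Load the MAGIC gamma/hadron data (file name is an assumption).
        Instances data = DataSource.read("magic04.arff");
        // The class attribute (gamma vs. hadron) is assumed to be last.
        data.setClassIndex(data.numAttributes() - 1);

        // Bagging around an SVM base classifier (WEKA's SMO implementation).
        Bagging bagger = new Bagging();
        bagger.setClassifier(new SMO());
        bagger.setNumIterations(10); // illustrative ensemble size

        // AdaBoost.M1 around the same base classifier.
        AdaBoostM1 booster = new AdaBoostM1();
        booster.setClassifier(new SMO());
        booster.setNumIterations(10);

        // Compare the two ensembles via 10-fold cross-validation.
        for (Classifier c : new Classifier[] { bagger, booster }) {
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(c, data, 10, new Random(1));
            System.out.printf("%s: %.2f%% correct%n",
                    c.getClass().getSimpleName(), eval.pctCorrect());
        }
    }
}

Swapping weka.classifiers.trees.RandomForest or weka.classifiers.functions.MultilayerPerceptron in for SMO reproduces the other base-classifier configurations the abstract mentions.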

Keywords: Ensembles, WEKA, Neural networks [NN], Support Vector Machines [SVM], Random Forests [RF].

Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1070941

