Traffic Flow Prediction using Adaboost Algorithm with Random Forests as a Weak Learner

Authors: Guy Leshem, Ya'acov Ritov

Abstract:

Traffic Management and Information Systems, which rely on a network of sensors, aim to describe traffic in urban areas in real time through a set of parameters and to estimate those parameters. Although the state of the art focuses on data analysis, little has been done in the way of prediction. In this paper, we describe a machine learning system for traffic flow management and control, applied to the traffic flow prediction problem. The new algorithm is obtained by using the Random Forests algorithm as the weak learner inside the AdaBoost algorithm. We show that our algorithm performs relatively well on real data and, according to the Traffic Flow Evaluation model, makes it possible to estimate and predict whether a given road intersection is congested at a given time.
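
As a concrete illustration of this scheme (a minimal sketch, not the authors' implementation), the snippet below plugs a shallow Random Forest into AdaBoost as the weak learner. It assumes scikit-learn 1.2 or later, where the weak learner is passed through the "estimator" argument, and uses synthetic data in place of the loop-detector measurements.

# Sketch of the boosting scheme described above: AdaBoost with a small
# Random Forest as its weak learner. Data and parameters are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for loop-detector measurements (e.g. flow, occupancy,
# speed) labelled congested / not congested; the paper's real data are not
# reproduced here.
X, y = make_classification(n_samples=2000, n_features=8, n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Each boosting round fits a shallow Random Forest on the reweighted training sample.
weak_learner = RandomForestClassifier(n_estimators=10, max_depth=3, random_state=0)
model = AdaBoostClassifier(estimator=weak_learner, n_estimators=50, learning_rate=1.0, random_state=0)
model.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))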

Keywords: Machine Learning, Boosting, Classification, Traffic Congestion, Data Collecting, Magnetic Loop Detectors, Signalized Intersections, Traffic Signal Timing Optimization.

Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1060207

