Fast and Accurate Control Chart Pattern Recognition Using a New Cluster-k-Nearest Neighbor

Authors: Samir Brahim Belhaouari

Abstract:

By combining the high accuracy of k-NN with the ability of K-means clustering to reduce classification time, we introduce Cluster-k-Nearest Neighbor, a "variable k"-NN that operates on the centroids (mean points) of the subclasses generated by a clustering algorithm. Because the K-means algorithm is, in general, not stable in terms of accuracy, we develop another algorithm for clustering the space that gives higher accuracy than K-means, fewer subclasses, greater stability, and a classification time that remains bounded as the data size varies. We obtain between 96% and 99.7% accuracy in the classification of six different types of time series using the K-means clustering algorithm, and 99.7% using the new clustering algorithm.
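The sketch below illustrates the Cluster-k-NN idea described above: each pattern class is first partitioned into subclasses by K-means, and classification is then performed by k-NN over the resulting subclass centroids rather than over all training samples. This is a minimal illustration assuming scikit-learn; the class name, parameters, and per-class clustering step are illustrative and not the authors' exact implementation (which also includes their modified clustering algorithm).

import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

class ClusterKNN:
    """Illustrative Cluster-k-NN: k-NN over per-class K-means centroids."""

    def __init__(self, n_subclasses=5, k=1):
        self.n_subclasses = n_subclasses      # subclasses generated per class
        self.knn = KNeighborsClassifier(n_neighbors=k)

    def fit(self, X, y):
        centroids, labels = [], []
        for c in np.unique(y):
            Xc = X[y == c]
            n = min(self.n_subclasses, len(Xc))
            km = KMeans(n_clusters=n, n_init=10, random_state=0).fit(Xc)
            centroids.append(km.cluster_centers_)
            labels.extend([c] * n)
        # k-NN now searches only the small set of labeled centroids,
        # which bounds classification time with respect to data size.
        self.knn.fit(np.vstack(centroids), np.array(labels))
        return self

    def predict(self, X):
        return self.knn.predict(X)

In use, fit would receive the windowed time-series samples and their control chart pattern labels, and predict would assign each new series to the class of its nearest subclass centroid(s).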

Keywords: Pattern recognition, Time series, k-Nearest Neighbor, k-means cluster, Gaussian Mixture Model, Classification

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1073279

