Validity and Reliability of Competency Assessment Implementation (CAI) Instrument Using Rasch Model

Authors: Azmanirah Ab Rahman, Jamil Ahmad, Nurfirdawati Muhamad Hanafi, Marina Ibrahim Mukhtar, Sarebah Warman

Abstract:

This study was conducted to generate empirical evidence on the validity and reliability of the items of the Competency Assessment Implementation (CAI) instrument, using the Rasch Model for polytomous data aided by Winsteps software version 3.68. Construct validity was examined by analyzing the point-measure correlation (PTMEA CORR) index and the infit and outfit mean-square (MNSQ) values, while reliability was examined by analyzing the item reliability index. A survey was the main method, with the CAI instrument administered to 156 teachers from vocational schools. The results show that the item reliability of the CAI instrument ranged from 0.80 to 0.98. The PTMEA correlations are positive, indicating that the items are able to distinguish between respondents of differing ability. The statistical data show that, of the 154 items, 12 items are suggested for omission from the instrument. It is hoped that this study can bring a new direction to the process of data analysis in educational research.
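The point-measure correlation check described above can be sketched as follows. This is a minimal illustration, not the authors' actual Winsteps analysis: person measures are approximated here by raw total scores, whereas Winsteps estimates Rasch logit measures; the function name and toy data are hypothetical.

```python
import numpy as np

def ptmea_correlations(responses):
    """Point-measure correlation per item.

    responses: persons x items matrix of polytomous scores (e.g. a 1-4
    rating scale). Returns one Pearson correlation per item between the
    item scores and a proxy person measure (raw total score); in a full
    Rasch analysis the Rasch ability estimate would be used instead.
    """
    responses = np.asarray(responses, dtype=float)
    person_measure = responses.sum(axis=1)  # proxy for Rasch ability
    corrs = []
    for j in range(responses.shape[1]):
        # Correlation between each item's scores and the person measures;
        # positive values suggest the item discriminates between
        # low-ability and high-ability respondents.
        corrs.append(np.corrcoef(responses[:, j], person_measure)[0, 1])
    return np.array(corrs)

# Hypothetical toy data: 6 persons x 3 items on a 1-4 rating scale
data = [[1, 2, 1],
        [2, 2, 2],
        [3, 3, 2],
        [3, 4, 3],
        [4, 4, 3],
        [4, 4, 4]]
print(ptmea_correlations(data))
```

Items whose correlation comes out negative or near zero (or that misfit on the infit/outfit MNSQ criteria) are the kind of items the study flags as candidates for omission.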

Keywords: reliability, validity, competency assessment, item analysis

Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1336562

