Complementing Assessment Processes with Standardized Tests: A Work in Progress

Authors: Amparo Camacho

Abstract:

ABET-accredited engineering programs must assess the development of student learning outcomes (SOs). Institutions implement different strategies for this assessment, usually designed in house. This paper presents a proposal for including standardized tests to complement the ABET assessment model in an engineering college made up of six distinct engineering programs. The engineering college formulated a quality assurance in education model, implemented throughout the six programs, to regularly assess and evaluate the achievement of SOs in each program offered. The model uses diverse techniques and sources of data to assess student performance and to implement improvement actions based on the results of this assessment. The model, called the “Assessment Process Model,” covers SOs A through K, as defined by ABET. SOs can be divided into two categories: “hard skills” and “professional skills” (soft skills). The first category includes abilities such as applying knowledge of mathematics, science, and engineering, and designing and conducting experiments, as well as analyzing and interpreting data. The second category, professional skills, includes communicating effectively and understanding professional and ethical responsibility. Within the Assessment Process Model, various tools were used to assess SOs related to both hard and soft skills. The assessment tools designed included rubrics, surveys, questionnaires, and portfolios. In addition to these instruments, the Engineering College decided to use tools that systematically gather consistent quantitative data. For this reason, an in-house exam based on the curriculum of each program was designed and implemented. Even though this exam was administered during various academic periods, it is not currently considered standardized. In 2017, the Engineering College added three standardized tests: one to assess mathematical and scientific reasoning, and two more to assess reading and writing abilities. With these exams, the college hopes to obtain complementary information that can help better measure the development of both hard and soft skills of students in the different engineering programs. In the first semester of 2017, the three exams were given to three sample groups of students from the six engineering programs, drawn from the first, fifth, and tenth semester cohorts. At the time of submission of this paper, the engineering college has descriptive statistical data and is working with statisticians on a more in-depth, detailed analysis of the sample groups’ achievement on the three exams. The overall objective of including standardized exams in the assessment model is to identify more precisely the least developed SOs, in order to define and implement the educational strategies necessary for students to achieve them in each engineering program.
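As a rough illustration of the descriptive statistics stage described above, the following Python sketch computes per-program, per-cohort summaries of standardized exam scores. It is a minimal sketch under assumptions: the input file exam_scores.csv and the column names program, cohort, exam, and score are hypothetical, since the paper does not specify its data layout or tooling.

# Minimal sketch of the descriptive analysis stage; all file and column
# names are hypothetical assumptions, not taken from the paper.
import pandas as pd

# One row per student per exam: program (one of the six engineering
# programs), cohort (1, 5, or 10, i.e., the semester of the sample
# group), exam ("math_sci", "reading", or "writing"), and score.
scores = pd.read_csv("exam_scores.csv")

# Descriptive statistics per program, cohort, and exam.
summary = (
    scores
    .groupby(["program", "cohort", "exam"])["score"]
    .agg(["count", "mean", "std", "min", "median", "max"])
    .round(2)
)
print(summary)

# Comparing mean scores across the first-, fifth-, and tenth-semester
# cohorts on the same exam gives a first view of which abilities develop
# least across the curriculum, in line with the stated objective of
# adding the standardized tests.
cohort_means = scores.pivot_table(
    index=["program", "exam"],
    columns="cohort",
    values="score",
    aggfunc="mean",
)
print(cohort_means)

A more in-depth analysis, such as the one the college is pursuing with statisticians, would go beyond these summaries (for example, testing whether cohort differences are statistically significant), but the descriptive tables above match what the abstract reports as available at the time of submission.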

Keywords: Assessment, hard skills, soft skills, standardized tests.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1316139

