Karol Jiménez Alfaro, Guaner Rojas Rojas, Armel Brizuela Rodríguez, and Nelson Pérez Rojas
Universidad de Costa Rica, Costa Rica

Abstract

The premise of this study is that a cognitive model can account for students' performance in solving items on a standardized test, in which strategies play a crucial role. The purpose of the study is to validate a cognitive model with four strategies, defined by expert judges, based on the response processes that underlie the items of the University of Costa Rica (UCR) Academic Aptitude Test. Eight semi-structured interviews were conducted with first-year UCR students, and verbal-report techniques were applied to gather evidence of the items' response processes. The reports were analyzed to verify the correspondence between the strategies previously defined by the expert judges and the answers given by the participants. In light of the results, it was concluded that the participants followed the proposed strategies when solving the situations posed and that the items are therefore indicators of the processes underlying these strategies. These results open the way for further research on the attributes present in each of the proposed strategies, which would make it possible to use test scores to predict academic performance at UCR.

Keywords: admission test, response processes, verbal report, reasoning in mathematical context, cognitive model
