Missing Data in University Entrance Exams: Theoretical Bases and Evidence Based on Real Data

Document Type : Original Article

Authors

10.22034/emes.2019.36115

Abstract

Missing data, that is, any nonresponse to test items, are a common phenomenon in empirical studies and in educational and psychological assessments. Statistical methods for dealing with nonresponse differ in how they are affected by the missingness mechanism, its causes, and its extent. The purpose of this article is to explore and describe missing data in the national university entrance examination. Data from the General Persian Literature test in the Mathematics, Experimental Sciences, and Humanities fields, and from the specialized tests (literature in the Humanities field, biology in the Experimental Sciences field, and mathematics in the Mathematics field) for the years 1383, 1391, and 1395 (Solar Hijri calendar) were used for this purpose. Analyses were carried out in SPSS and in R using the 'psych' package. The results show that the amount of missing data has increased over the years under review (minimum 2.2%, maximum 91.6%) and that, under missingness, item difficulties are overestimated. There is a high positive correlation between the numbers of omitted items on different tests taken in the same year (rmin = 0.41, rmax = 0.78) and a high negative correlation between the number of correct responses and the number of omitted responses (rmin = -0.56, rmax = -0.85). These results indicate that missing responses occur across the various competence dimensions and that the missing data are non-ignorable in statistical analyses. It is therefore necessary to select and apply an appropriate method for handling missing data when analyzing data from the national university entrance examination.
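Since the analyses were carried out in R, the following base-R sketch illustrates, on simulated data, the kind of descriptive checks reported above: per-item omission rates, the effect of scoring omitted responses as incorrect on classical item difficulty, and the correlation between the numbers of correct and omitted responses per examinee. The response matrix `resp`, its dimensions, and the simulated response probabilities are hypothetical and only stand in for the study's actual data.

```r
# Illustrative sketch on assumed data: resp is an examinee-by-item matrix with
# 1 = correct, 0 = incorrect, NA = omitted response.
set.seed(1)
resp <- matrix(sample(c(1, 0, NA), 2000 * 25, replace = TRUE,
                      prob = c(0.45, 0.35, 0.20)),
               nrow = 2000, ncol = 25)

# Proportion of omitted responses per item and overall
item_missing    <- colMeans(is.na(resp))
overall_missing <- mean(is.na(resp))

# Classical item difficulty (proportion correct) under two treatments of omits:
# (a) omits ignored, (b) omits scored as incorrect. Scoring omits as incorrect
# lowers the proportion correct, so items appear more difficult than they are.
p_ignore_omits <- colMeans(resp, na.rm = TRUE)
p_omits_wrong  <- colMeans(replace(resp, is.na(resp), 0))

# Per-examinee counts of correct and omitted responses and their correlation
n_correct <- rowSums(resp == 1, na.rm = TRUE)
n_missing <- rowSums(is.na(resp))
cor(n_correct, n_missing)
```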
