Missing Data in National University Entrance Exams: Theoretical Foundations and Evidence Based on Real Data

Article Type: Research Article

Authors

1 PhD Student in Assessment and Measurement, Faculty of Psychology and Educational Sciences, University of Tehran

2 Associate Professor, Faculty of Psychology and Educational Sciences, University of Tehran

3 Professor, Kharazmi University

4 Assistant Professor, Kharazmi University

10.22034/emes.2019.36115

Abstract

Missing data, meaning any item left unanswered, are a common phenomenon in empirical studies and in educational and psychological assessments. Many statistical methods exist for handling data sets with missing responses, and their suitability depends on the missingness mechanism, its causes, and its extent. The purpose of this article is to describe the current state of missing data in the national university entrance exam. Using descriptive statistics in SPSS and R, it is shown that the amount of missing data increased over the years examined (ranging from 2.2% to 91.6%) and that the test's statistical indices are affected by the amount of missing data. It is also shown that there are high positive correlations between the numbers of unanswered items across different subject tests (r = 0.41 to 0.78) and high negative correlations between the numbers of correct and missing responses (r = -0.56 to -0.85). The results indicate that missing data occur consistently across different competence dimensions and cannot be ignored in statistical analyses.

Article Title [English]

Missing Data in University Entrance Exams: Theoretical Bases and Evidence Based on Real Data

Authors [English]

  • Maryam Chegini 1
  • Ebrahim Khodaie 2
  • Valiollah Farzad 3
  • Balal Izanloo 4
Abstract [English]

Missing data, which refer to any non-response to items, are a common phenomenon in empirical studies and in educational and psychological assessments. Statistical methods for dealing with nonresponse differ according to the missingness mechanism, its causes, and its extent. The purpose of this article is to explore and describe missing data in the national university entrance exam. Data from the General Persian Literature test of the mathematics, experimental sciences, and humanities fields, and from the specialized tests (literature in the humanities field, biology in the experimental sciences field, and mathematics in the mathematics field) in the years 1383, 1391, and 1395 (solar Hijri) were used for this purpose. Analyses were carried out in SPSS and in R with the 'psych' package. It is shown that the amount of missing data increased over the years under review (minimum 2.2%, maximum 91.6%) and that, under the missingness condition, item difficulties are overestimated. It is also shown that there are high positive correlations between the numbers of non-answered items across different tests of the same year (rmin = 0.41, rmax = 0.78) and high negative correlations between the numbers of correct responses and missing answers (rmin = -0.56, rmax = -0.85). The results show that missing data occur consistently across various competence dimensions and are non-ignorable in statistical analyses. It is therefore necessary to select and use an appropriate missing-data method when analyzing data from the national university entrance exam.

Keywords [English]

  • Missing Data
  • Missing Pattern
  • Missing Mechanism
  • Academic Achievement Tests