
Panel Paper: When Students Don't Care: Reexamining International Differences in Achievement and Non-Cognitive Skills, Using Novel Measures of Student Effort on Surveys and Tests

Saturday, November 14, 2015 : 9:10 AM
Jasmine (Hyatt Regency Miami)


Collin E. Hitt1, Gema Zamarro1 and Ildefonso Mendez2, (1)University of Arkansas, (2)University of Murcia
Policy debates are often framed in terms of regional and international differences in standardized test scores. The obvious presumption is that these differences reflect differences in cognitive skills and general content knowledge, the things that achievement tests are designed to measure. We challenge this presumption by demonstrating that a substantial amount of the within-country and international variation in PISA test scores is associated with noncognitive skills and with student effort on the tests themselves.

Our work pilots and refines several new behavioral measures of noncognitive skills, derived from student answer patterns on tests and surveys. In short, we treat tests and surveys as tasks representative of everyday schoolwork, and we attempt to measure effort on those tasks. For example, we examine the frequency with which students skip questions on surveys, show diminished effort toward the end of tests, or are able to recover from a rocky start at the beginning of a test. We use these task-based measures of effort as proxies for noncognitive skills, such as conscientiousness and resilience. The proxy measures we employ contain meaningful new information about students' noncognitive skills.
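To make this kind of construction concrete, the following is a minimal sketch of test-effort proxies of the sort described above. It assumes a hypothetical item-level data layout (one row per student, one column per test item, ordered by booklet position); the variable names are illustrative and do not correspond to the actual PISA files or to our exact constructions.

```python
import numpy as np
import pandas as pd

def effort_proxies(items: pd.DataFrame) -> pd.DataFrame:
    """Illustrative test-effort proxies from item-level data.

    `items`: one row per student, one column per item in booklet order;
    1 = correct, 0 = incorrect, NaN = skipped. All names are hypothetical.
    """
    n_items = items.shape[1]
    third = n_items // 3

    first = items.iloc[:, :third]    # opening block of the test
    last = items.iloc[:, -third:]    # closing block of the test

    out = pd.DataFrame(index=items.index)
    # Share of items left blank: a simple proxy for conscientious effort.
    out["skip_rate"] = items.isna().mean(axis=1)
    # Performance decline from the start to the end of the booklet;
    # larger declines are read as diminished effort late in the test.
    out["decline"] = first.mean(axis=1) - last.mean(axis=1)
    # "Recovery" proxy: among students with a weak opening block,
    # performance on the remainder of the test.
    weak_start = first.mean(axis=1) < first.mean(axis=1).median()
    out["recovery"] = np.where(weak_start,
                               items.iloc[:, third:].mean(axis=1),
                               np.nan)
    return out
```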

Noncognitive skills research relies heavily on student self-reported scales. As our previous research shows, survey answer patterns also contain information about the effort each student puts into the survey. Item response rates and haphazard answer patterns are predictive of educational attainment, independent of cognitive ability, which demonstrates that they are valid and important measures of noncognitive skills.
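A rough illustration of survey-based effort proxies of this kind appears below. The data layout and both measures (an item nonresponse rate and a straight-lining run length as one crude stand-in for haphazard answering) are assumptions made for the sketch, not necessarily the constructions used in the paper.

```python
import pandas as pd

def survey_effort(survey: pd.DataFrame) -> pd.DataFrame:
    """Illustrative survey-effort proxies.

    `survey`: one row per student, one Likert-scale column per survey item;
    NaN = item skipped. Names and layout are hypothetical.
    """
    out = pd.DataFrame(index=survey.index)
    # Item nonresponse rate: share of survey questions left unanswered.
    out["nonresponse_rate"] = survey.isna().mean(axis=1)

    # Crude "haphazard answering" proxy: the longest run of identical
    # consecutive responses (straight-lining), ignoring skipped items.
    def longest_run(row: pd.Series) -> int:
        vals = row.dropna().tolist()
        best = run = 1 if vals else 0
        for prev, cur in zip(vals, vals[1:]):
            run = run + 1 if cur == prev else 1
            best = max(best, run)
        return best

    out["longest_run"] = survey.apply(longest_run, axis=1)
    return out
```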

Standardized test scores reflect more than student learning; they also reflect the character traits of the students taking the tests. As designed, test scores provide valuable but imperfect information on student cognitive abilities. But testing data can also contain information about the effort that each student put forward on the test. Our own research and that of others has demonstrated that student effort on PISA tests can be credibly quantified.

We examine the extent to which student effort on the PISA cognitive tests is correlated with student effort on the accompanying surveys. This novel analysis provides further validation of test-taking effort as a measure of noncognitive skills. Using our new measures, we calculate international and regional differences in test effort and survey effort, which we argue serve as proxies for noncognitive skills. We then decompose international differences in test scores using these novel measures of noncognitive skills.
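One simple way to carry out the two steps sketched in this paragraph is shown below, under strong assumptions about the data layout: correlate the test-effort and survey-effort proxies, then compare country score gaps before and after conditioning on them. The abstract does not specify the decomposition method; a plain regression comparison is used here purely for illustration, and `df` and its column names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

def effort_correlation(df: pd.DataFrame) -> float:
    # Within-student association between effort on the test and on the survey.
    return df["test_effort"].corr(df["survey_effort"])

def country_gaps(df: pd.DataFrame) -> pd.DataFrame:
    # Raw country gaps in test scores (relative to the omitted country).
    raw = smf.ols("score ~ C(country)", data=df).fit()
    # Gaps after conditioning on the effort proxies; shrinkage in the
    # country coefficients is read as the share of the gap associated
    # with effort rather than content knowledge.
    adj = smf.ols("score ~ C(country) + test_effort + survey_effort",
                  data=df).fit()
    gaps = pd.DataFrame({"raw_gap": raw.params, "adjusted_gap": adj.params})
    return gaps.filter(like="C(country)", axis=0)
```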

The policy implications of international and regional gaps in test scores rest in large part on what test scores are seen to represent. Our work examines the extent to which these differences in test scores are really driven by differences in math, science and literacy skills, rather than by differences of another sort: differences in the effort that students put forward during the measurement process. We argue, based on our previous work, that the effort students put forward on tests and surveys tells us how they approach the routine tasks of school and work. We therefore conclude that international test score differences in fact reflect differences in noncognitive skills, rather than simply differences in academic content knowledge.