
Ranking Our Responses to PISA 2009

In 1997 the Organisation for Economic Co-operation and Development (OECD) partnered with countries around the world to design the ambitious and innovative Programme for International Student Assessment (PISA). Beginning in 2000, and every three years since, OECD/PISA has assessed 15-year-old students in participating countries to gauge the extent “to which youth have acquired some of the knowledge and skills essential for full participation in modern societies.”[1]

PISA results have been an important part of my career. Although I am not positioned to act on them directly, I value the perspectives they contribute to my understanding of education, within Canada and abroad. So I looked forward to the new round of reports; when they arrived, however, I was more captivated by the public response that unfolded around them than by the results themselves.

As we all chimed in with our interpretations, I couldn’t help wondering if we were clear about what the results were (or were not) telling us.

I’d like to put some of the blame for the public reaction to PISA scores on the OECD itself. It’s easy to feel intimidated by the volume of figures and explanations that flow from each assessment. But this alone cannot explain the overwhelming amount of attention paid to a single, league-style table ranking the 65 participating countries on combined reading, mathematical, and scientific literacy scores. Witnessing how results get taken up in the public domain, it is hard not to feel that the PISA country rankings have become the Olympics of the education world.

These international comparisons can be valuable, of course. Within policy circles, PISA has provided a context for new learning about factors that may contribute to successful school systems. This year, the report’s authors profiled a number of countries whose results show notable improvements, including Germany, which was “jolted into action when PISA 2000 revealed a below-average performance and large social disparities in results.” Germany has since made significant gains on both fronts.[2]

As often seems to happen with the release of any rankings, however, comparisons slip into competition. Public discussion of rankings becomes particularly alarming when it is played out to the detriment of young people (e.g., “our country’s ranking would be higher if only we had the ‘right’ type of students”) or when nations endure weeks of stereotypical comments serving to diminish high rankings or justify low ones.

Public fascination with international rankings also overshadows other important comparative results presented in PISA reports. Take, for example, the fact that differences between countries represent only a fraction of overall variation in student performance when compared to differences within countries, which can represent gaps equivalent to multiple years of schooling between the lowest- and highest-performing students.[3]

Often also left out of the dialogue is the fact that countries vary in the extent to which high performance is accompanied by equity of educational outcomes for all young people. Writing about Canada’s performance in PISA 2009, Christa Freiler (Director of Research and Strategic Initiatives for the Canadian Education Association) notes that the equity factor is, “arguably, more important to the social and economic future of young people and Canada as a whole than small changes in overall standing (i.e. whether we are 3rd, 4th or 5th).”[4]

Finally, as PISA becomes a trusted source of information on educational quality, the public needs to understand an important qualification: PISA does not assess students’ knowledge or understanding of school subjects. Its results are, in fact, a measure of the cumulative impact of a young person’s formal and informal learning, and of the extent to which this learning can be demonstrated through application to “real life” reading, math, and science scenarios. PISA does not assess students’ achievement of curriculum outcomes, and results cannot be attributed to schools alone.

This important qualification does not limit the value of PISA; there is much to learn from data designed to tell us how well young people are prepared to “fully participate in modern society.” However, as Sjoberg reminds us, we need to “discuss and use the results with some insight…we need to know what we might learn from the study, as well as what we cannot learn. Moreover we need to raise a critical (not necessarily a negative) voice in public [and] professional debates over the uses and misuses of the results.”[5]


[1] T. Knighton, P. Brochu, and T. Gluszynski, Measuring Up: Canadian Results of the OECD PISA Study (Ottawa: Statistics Canada, Council of Ministers of Education Canada, and Human Resources and Skills Development Canada, 2010), 39, www.statcan.gc.ca/pub/81-590-x/81-590-x2010001-eng.pdf.

[2] PISA 2009 Results: Overcoming Social Background – Equity in Learning Opportunities and Outcomes, vol. 2 (OECD, 2010), 4.

[3] Knighton, Brochu, and Gluszynski, 157.

[4] C. Freiler, “PISA 2009: Let’s Not Underestimate the Importance of Equity in Education” (Canadian Education Association blog, December 3, 2010), www.cea-ace.ca/blog/christa-freiler/2010/12/3/pisa-2009-let’s-not-underestimate-importance-equity-education.

[5] S. Sjoberg, “PISA and ‘Real Life Challenges’: Mission Impossible?” in PISA According to PISA, eds. Hopmann and Brinek (Vienna: LIT Verlag, 2007), 2.

Meet the Expert(s)

Jodene Dunleavy

Jodene Dunleavy is a Senior Policy Analyst for the Nova Scotia Department of Education.
