The Learning Tower of PISA

Author(s): Mike Corbett
December 18, 2004

The recent release of results from the Programme for International Student Assessment (PISA) is, as always, a subject for hand-wringing and apologetics on the part of Atlantic Canadian educators, and for bluster and recriminations from groups like the Atlantic Institute for Market Studies.  As usual, Atlantic Canada’s performance is mediocre by Canadian national standards and somewhere in the middle of the pack internationally, alongside countries like Germany, Denmark and Sweden.  Canada as a whole, and particularly Alberta (and to a lesser extent, Ontario and British Columbia), leads most of the world on these assessments, causing us all to wonder what they are doing right and what we are doing wrong.

Well, first of all, Alberta graduates significantly fewer of its students.  According to data from the Council of Ministers of Education, Canada published in 2003, Alberta’s graduation rate is about 10% below that of Nova Scotia.  Incidentally, the difference between Nova Scotia’s PISA scores and Alberta’s is a bit less than 10%.  By maintaining a more exclusive high school system, Alberta “adjusts” underperforming students out the school door and into the workforce.  As it happens, Alberta has an economy that can absorb a considerable amount of educational underachievement.  Here in Nova Scotia, we do not have that luxury.

This year Ontario also scored very well on the PISA assessments.  However, there seems to be a problem with the Ontario data.  In Ontario, 47% of the schools selected by the PISA project declined to participate in the assessment.  The report offers no explanation for this other than to say that the schools that refused to participate did not skew the demographics of the study in terms of the rural/urban, French/English, school size and private/public characteristics of the sample.  It is clear that in order to understand Ontario’s results we need to know something about the 84 schools that refused to participate in the PISA assessment.  Participation rates in the other provinces ranged from 93% to 100% of selected schools.  The three Atlantic Canadian provinces again led the nation in compliance, at 98.8% to 100%.

At the national level, Ontario’s refusal rate dropped the percentage of Canadian schools participating in the assessment below the 85% threshold considered by the PISA to be the minimum acceptable standard.  Similar problems arise in the international data.  In the United States, nearly 42% of the schools selected refused to participate in the PISA, and in the United Kingdom the non-response rate was around 33%.  The results of the UK sample are not considered by the PISA to be comparable to other results.  Yet what are we to conclude about the 133 out of 382 American schools that declined participation in the PISA study?  Again, which schools refused?  Why?

Finally, I wonder who in this country we should expect to find at the bottom of the statistical heap when it comes to education.  I would expect the provinces with the lowest incomes, the highest percentage of adults who have not completed high school, and the lowest levels of school funding (Nova Scotia is at the bottom in Canada).  This is indeed the way it works out.  And who would we expect to find at the top?  We find what we should expect to find: the provinces with the highest average personal and family incomes, the lowest percentage of adults with no postsecondary education, and the provinces that invest the most in education.  Once again we are treated to no surprises in the PISA results.

Here in Atlantic Canada we seem intent on joining the mad scramble to become as test-driven as Ontario and Alberta.  We can reorient our school curricula and teaching methods in the way these provinces have and turn our schools back into drill-and-practice shops.  This may indeed improve test scores, but will it improve educational quality?  We could also perform manipulations similar to those I have described above to improve the numbers: simply adjust the sample and the scores will improve.  It is my view that by doing these kinds of things we will diminish the inclusiveness and quality of our schools.  At a broader level, I think we need to move beyond questions about how well we did on a particular one-shot test to questions of evaluation proper: a consideration of what we value in the way of educational experiences and outcomes for our young people.

If we have the political will, we can follow Finland, the international winner of the PISA sweepstakes, by trusting teachers to develop appropriate curriculum in nonstandard ways and orienting our scarce resources toward richer forms of school-based student assessment.  This would allow us to buck national and international trends toward standardized assessment and to increase support for classroom-based educational assessment that lets our children show what they can do and how they are progressing as individuals, rather than how well they can answer the OECD’s standard question sets.  Tests like the PISA tell us far less about what our children know than a well-structured 30-minute face-to-face conversation could establish.  It seems to me we need a lot more conversations about what our children know.  Good classroom teachers call this assessment; families whose children tend to succeed in school call it caring, involvement or support.  These classroom and family conversations are much more important than the results gathered on a standardized form devised by people who have no significant interaction with children and youth, parents, or teachers.  We should certainly take the results of such exercises with a considerable grain of salt and decide whether we will follow Finland or Alberta.

Mike Corbett teaches at the School of Education, Acadia University and is a research associate, Canadian Centre for Policy Alternatives.
