November 2004: Who's Keeping Tabs on Global Tests?

$85 million on international tests? It's a sponsorship-scandal-sized waste
November 1, 2004

One day my daughter came home from high school saying the teachers were really angry at her and a few of her equally high-achieving friends for skipping a test that “didn’t even count.”  

“Must have counted for something,” I said.

“Oh, it’s got to do with international scores, I didn’t get it,” was the answer as she went out the door.

The test in question was probably the one named after a dangerously misaligned tower—PISA (Programme for International Student Assessment). PISA is a project of the Organisation for Economic Co-operation and Development (OECD), implemented in Canada by HRSDC (formerly HRDC), the Council of Ministers of Education, Canada (CMEC), Statistics Canada, and provincial ministries of education.

PISA tests reading, mathematics and science skills across a wide spectrum of nations, from Thailand to Tunisia. In the last round of tests (2003), approximately 30,000 Canadian 15-year-old students from more than 1,000 schools participated. The test is developed by a team of external experts, based on complex negotiations among 32 countries to accommodate social, cultural, and language differences. It's a process that can lead to bizarre outcomes, according to one testing critic, who reported that the only form of writing the experts could agree carried across cultures was the writing of computer manuals.

“Presumably incomprehensible in any language,” says Canadian educational researcher and testing critic Larry Kuehn.

Well, it’s hardly surprising that a group of Canadian high school students wouldn’t “get it.”  I didn’t really get it, either, until I saw one of the background documents for a federally organized National Summit on Innovation and Learning held in Toronto a few years ago. Then I realized that international comparisons of high school students are a federal wedge into elementary and secondary education in Canada—always a closely guarded provincial preserve.

The document lists four milestones for children and youth in the federal government’s innovation strategy. The first one is that Canada become “one of the top three countries in mathematics, science, and reading achievement.” The PISA tests might not have counted for my daughter and her friends, but there were clearly lots of bureaucrats in lots of countries counting on them to write these international tests.     

According to various news reports in 2001-2002, Canada’s “national education strategy,” or lack thereof, has been a matter of federal concern for some time. On January 28, 2003, a news article reported that the federal government was preparing a “back door answer to a national education strategy” in the form of a new clearinghouse for ideas, research, and trends about learning from pre-school to retirement. But do we not already have reams of data from years of provincial, local and national testing and surveying of Canadians at all levels? Not according to the OECD, says Dr. Ron Saunders of the Canadian Policy Research Network (CPRN). Despite an excellent showing in the 2000 PISA tests where Canadian students came second in reading, sixth in math, and fifth in science, the OECD has criticized Canada for numerous educational transgressions, including  “lack of coordination across jurisdictions” and “lack of a learning culture.”

In order to plug the gaps identified by the OECD, $100 million was set aside in the spring 2003 budget for the Canadian Council on Learning (previously called the Canadian Learning Institute). According to the Movement for Canadian Literacy, which has done some “digging” to find out about the role and status of this organization, “the Council will fund tools to measure progress in learning outcomes and to support testing and analysis of different approaches.” From the National Summit documents we know that PISA is at the top of the list of these tools.

Now, $100 million (recently reduced to $85 million) is a sizeable chunk of federal cash to be shuffled off to a corner where it can function in virtual obscurity. It’s equal to the amount deemed misappropriated in the Great Canadian Sponsorship Scandal, and there’s not a Canadian flag in sight.  At least the sponsorship scandal made some Canadians happy by sponsoring local festivals to help us celebrate our short but glorious summers. But international testing? It provides bragging rights for government delegations at international meetings and induces national hand-wringing when we don’t win enough gold medals, but to our students it’s just another test, and one they don’t feel they have a stake in. Now that we have an organization created specifically to make sure our students excel at these tests, though, they are not likely to go away.

Like the OECD itself, the Canadian Council on Learning exists in the unaccountable world of semi-governmental organizations. It is set up to collect cradle-to-grave information on Canadians and their learning habits, yet it sits completely out of reach of the usual checks and balances that govern institutions inside the democratic system. The potential amount of personal information that could be stored there is enormous. International tests, national tests, questionnaires and surveys on children and youth—will these all end up under the Council’s umbrella?

The Romans had a warning for us: Quis custodiet ipsos custodes? Who is watching the watchmen?  Who is watching the Canadian Council on Learning?

(Marita Moll is a CCPA Research Associate and the editor of Passing the Test: the False Promises of Standardized Testing. She manages a website on standardized testing in Canada: