Does New Zealand Have a Long Tail of School Underachievement?
Getting to the Truth of the Matter
A/Prof John Clark
School of Educational Studies, Massey University, Palmerston North – j.a.clark@massey.ac.nz
Politicians, policy makers, teachers and academics alike often make reference to New Zealand having a long tail of school underachievement. For example, the ACT party (2013) refers to it on its website, the Treasury (2012) does so in various publications, the New Zealand Principals’ Federation (2013) mentions it, as does the NZEI (2013) and various academics, including Hornby (Brook, 2013) from the University of Canterbury and Chapman and Tunmer (Shadwell, 2013) from Massey University. The Treasury (2012) put the matter succinctly: “New Zealand has a wide distribution of educational achievement and more low performing students compared to other countries with a similarly high average score in international tests” (p.1).

There is, however, some disagreement about whether the long tail of school underachievement actually exists. This is particularly evident in two reports published in 2013 by academics at Massey University. In July, Tunmer, Chapman, Greaney, Prochnow and Arrow (2013) from the Institute of Education drew attention to “the growing body of evidence of New Zealand’s relatively ‘long tail’ of literacy underachievement” (p.1). A few months later, in September, a report produced by Snook, O’Neill, Birks, Church and Rawlins for the Education Policy Response Group (2013), also located in the Institute of Education, took the Treasury’s position as its starting point: “The claim is that New Zealand has a wider distribution and more low performing students than countries with similar average scores” (p.28), and on the basis of the empirical evidence gathered concluded thus: “it is clearly not true that New Zealand has a serious issue of low achievement in reading literacy” (p.29). The Policy Group relied on the 2009 PISA data while the Literacy Group drew on the more recent 2011 PIRLS data to bolster their respective cases. The two reports cannot both be true: either we have a long tail in school achievement or we do not.
If we are to find the truth of the matter we will need to look closely at the empirical evidence contained in three international reports on literacy in which New Zealand participated: the 2011 Progress in International Reading Literacy Study (PIRLS), the 2012 Programme for International Student Assessment (PISA) and the 2006 Adult Literacy and Life Skills (ALL) Survey.

PIRLS Evidence

The 2011 Progress in International Reading Literacy Study (PIRLS) (International Association for the Evaluation of Educational Achievement, 2012) reported the results of a survey of reading achievement in 49 participating countries around the world. Table 1.1 in Chapter 1 (p.38) of the original PIRLS report lists the following countries as having average scores similar to NZ’s (531): Slovak Republic, Bulgaria, NZ, Slovenia, Austria, Lithuania and Australia.
From the accompanying bar graph of the distribution of reading achievement it is evident that NZ has:
■■ the widest distribution of all 7 countries
■■ more low performing students than the other 6 countries
So, on the basis of the PIRLS data the evidence supports the claim that New Zealand has a wider distribution and more low performing students than countries with similar average scores.

PISA 2009 and 2012

PISA 2009

The Education Policy Response Group report makes use of the earlier 2009 PISA report on literacy achievement, and their use of the 2009 data will be discussed here. However, since their report the 2012 PISA results have been published, which shed additional light on the issue of whether New Zealand has a long tail of school underachievement. The Ministry of Education (2010) report on PISA 2009 contains both narrative with data, Figure 3 with proficiency levels and Figure 4 with distribution ranges. The Ministry report highlights 5 countries: NZ, the 2 just above and the 2 just below the NZ score. The data (%) from the narrative has been extracted.

[Table: scores and percentages of students at proficiency level 1b and below, below level 2, at levels 4–6 and at levels 5–6, for Singapore, Canada, NZ, Japan and Australia.]
(The Education Policy report includes the USA (500) and the UK (494), which are far below the NZ mean, but excludes Singapore. The gaps for Canada are not readily apparent from the narrative, although the Education Policy Table 1 reports a 2 for each.) Figure 4 presents the distribution of scores, in bar graph form, with the 5 above-named countries highlighted. Of the 5 countries, NZ has:
■■ the widest distribution of scores between the 95th and 5th percentiles
■■ a distribution of scores between the 25th and 5th percentiles exceeded only by Japan’s.
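For readers interested in how such “distribution width” comparisons are made, the arithmetic is simply the gap between two percentiles of the score distribution. The following is an illustrative sketch only, using invented scores rather than the actual PISA data:

```python
# Sketch: comparing the spread of reading scores via percentile gaps.
# The score samples below are invented for illustration; they are NOT
# the real PISA figures.
import random
import statistics

def percentile(scores, p):
    """Linear-interpolation percentile (p in 0-100) of a list of scores."""
    s = sorted(scores)
    k = (len(s) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def spread(scores):
    """Return the 95th-to-5th and 25th-to-5th percentile gaps."""
    return (percentile(scores, 95) - percentile(scores, 5),
            percentile(scores, 25) - percentile(scores, 5))

random.seed(1)
# Two hypothetical countries with similar means but different spreads.
narrow = [random.gauss(520, 70) for _ in range(1000)]
wide = [random.gauss(520, 105) for _ in range(1000)]

# Similar average scores...
assert abs(statistics.mean(narrow) - statistics.mean(wide)) < 15
# ...yet the "wide" country has the larger 95th-to-5th percentile gap,
# i.e. a longer tail of low scorers below a similar mean.
assert spread(wide)[0] > spread(narrow)[0]
```

The point of the sketch is that two countries can share virtually the same mean while differing sharply in spread, which is exactly the Treasury’s claim about New Zealand.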
So, on the basis of the PISA data the evidence supports the claim that New Zealand has a wider distribution and more low performing students than countries with similar average scores. The Ministry of Education (2013) report, PIRLS 2010/11: An Overview of Findings From the Third Cycle of PIRLS, contains the following statement and accompanying footnote: “All three cycles of PIRLS have highlighted the relatively large range in the reading performance of NZ’s Year 5 students. It is important to note that the wide range is not just due to the relatively weak performance of some of NZ’s students; it also highlights the fact that NZ has very high performing students – a big ‘tail’ AND a big ‘nose’”. FN: “This pattern has been observed in NZ’s PISA reading literacy results.”

PISA 2012

The 2012 PISA (OECD, 2013) report focused on mathematics, with a lesser emphasis on reading, but the reading data is nonetheless helpful in providing further evidence on the issue of the long tail of school underachievement. Figure 1.4.1 in Chapter Four, on reading, revealed that New Zealand had a mean score of 512 and so was ranked 13th out of 64. The countries whose mean scores are not statistically significantly different from New Zealand’s were identified as: Poland, Estonia, Liechtenstein, NZ, Australia, Netherlands, Belgium, Switzerland, Macao-China, Viet Nam, Germany and France.
The 2012 PISA survey charted the trends in school systems’ average performance across the 2000, 2003, 2006, 2009 and 2012 cycles (Figure 1.4.3). Of the 64 countries, 32 showed a positive annualised trend in mean reading achievement, 22 showed no change, while 10 showed a deteriorating trend. New Zealand was one of the 10 countries to show an annual decline in reading achievement between 2000 and 2012. Other listed countries showing a steady decline include Australia, Canada, Finland, Iceland, Slovenia, Sweden and Uruguay (Figure 1.4.4).

Trends in the percentage of low and top performing students in reading were also identified, according to the following categories: (1) Moving everyone up – a reduction in the share of low performers and an increase in that of top performers; (2) Reducing underperformance – a reduction in the share of low performers but no change in that of top performers; (3) Nurturing top performers – an increase in the share of top performers but no change in that of low performers; (4) An increase in the share of low performers or a decrease in that of top performers. New Zealand was placed in the last category: Figure 1.4.11 reveals a 4.8 percentage point drop in top performing students and also an increase in low performing students.

In discussing the variation in student performance in reading, the report noted: “At the other end of the spectrum, among the ten participating countries and economies that show the largest difference between the highest and lowest achievements in reading, this gap ranges from 270 to 310 points. As is true of those countries with a comparatively narrow distribution of scores among students, the group of countries with a wide range in performance is heterogeneous in mean reading proficiency. One of the lowest-performing countries, Qatar, has nearly the same gap between the highest and lowest achievers as the highest performing country, New Zealand, and both countries are included in this group” (OECD, 2013, p.199). But there is more.
New Zealand’s wide range of reading achievement in PISA between high and low achievers might be even worse than reported, especially at the low performing end. It is possible that the number of low performing students is far larger than is apparent. Wellington Girls’ College principal Julia Davidson was reported as saying that “students volunteered their time to do the assessment” (Moir, 2013): it is possible that high performing students would choose to participate while low performing students would not, thus skewing the results. An email from a colleague who met with officials from the Ministry of Education to find out whether the PISA data contains sampling errors stated: “By examining school enrolment data and census data they were able to determine that the exact proportion of students in the specific age range who were not enrolled in school when the PISA data were collected was 13 per cent. These were students who typically were poor achievers according to the demographic projections.” This significant percentage was in addition to suspensions, expulsions, stand-downs, exclusions and PISA target students (specified by the student-based, as opposed to classroom- or school-based, sampling procedure) who were absent on the day of the assessment. The latter included students who were absent because: ‘They were ill/sick; their parents did not give permission; they were suspended or stood down; they were truant; they were involved in other school activities (e.g., school trips or sports competitions); they were not released from other classes; they were at school but did not turn up to the PISA session (reason unknown); they were at school but chose not to participate; they were absent from school for an unknown reason’. One Ministry official recounted the following story about a test administrator who visited a low decile school with a list of students to be tested. When he went through the list of target students with the principal he was told such things as:
“We haven’t seen that student for three months; we don’t know the whereabouts of that student; that student is in prison.” Regarding the last case, just as the test administrator was preparing to administer the PISA, the principal told him that he had been mistaken about the student who was in prison, as the student had returned to the school that very day. The principal directed the student to the test room, but in the middle of going over the instructions for the PISA the student raised his hand and asked the test administrator if he could go to the toilet. The former inmate never returned.
From the above, it was concluded that: “The point is that in addition to the 13.1 per cent of the age cohort who were out of the school system when the PISA was administered, an additional 5–6 per cent of students enrolled in school were not tested for a whole range of reasons. This is different from the sampling plan for the adult literacy tests, which were census-based. And for PIRLS, most 9–10 year-olds are still in school.”

ALL 2006

The Ministry of Education (2008) report, The Adult Literacy and Life Skills (ALL) Survey: Overview and International Comparisons, provides the following information. In a comparison of the 1996 IALS and 2006 ALL results, the distributions of scores in Figure 2.2 (p.9) for prose literacy were 360–165 for IALS and 350–190 for ALL.
In both cases the mean was around 275. In a comparison of international prose literacy between NZ, Canada and the USA, the distributions of scores for both IALS and ALL in Figure 2.2 (p.18) revealed that:
■■ IALS/ALL means were around 275 for NZ and around 285 for Canada, while the USA declined from 275 to 270.
■■ The 5th-to-95th percentile range and the 25th-to-75th percentile range were narrower for ALL compared to IALS, with distributions closer to the mean.
■■ For both IALS and ALL, the USA had the widest distribution of scores, followed by Canada, with NZ having the least.
Although the numbers are not reported, from the Figure 3.2 graph the following ranges have been estimated:

          IALS score range    ALL score range
USA       65–138              347–178
Canada    370–147             360–178
NZ        362–163             347–193
The adult literacy data seem less persuasive in terms of a wide distribution and more low performing adults.

Causes and Solutions

The PIRLS and PISA results provide rather compelling evidence for concluding that the Treasury claim – ‘New Zealand has a wide distribution of educational achievement and more low performing students compared to other countries with a similarly high average score in international tests’ – is well-founded. In which case, there is some explaining to do in terms of causes and solutions. We need to ask four questions: (1) Why is it that, compared to other countries with similarly high average scores, New Zealand has such a wide distribution of school achievement and more low performing students? (2) Why is New Zealand’s achievement in reading declining in relation to other similar countries? (PISA 2012 Table 1.4.5 on reading performance indicates that there were no countries with similar performance to New Zealand in 2000 which had lower performance in 2012, there was one country – Australia – with similar performance to New Zealand in 2000 which had similar performance in 2012, and there were five countries – Hong Kong, Japan, Canada, Ireland, Korea – with
similar performances to New Zealand in 2000 which had higher performances in 2012.) (3) Will current solutions narrow the wide distribution and reduce the number of low achievers? (4) If not, what will?

Why is it that New Zealand has a wide distribution of school achievement and a large number of under-achieving students? Certainly, there is a preponderance of Pākehā and Asian students in the upper levels of achievement, while Māori and Pasifika students tend to cluster more at the lower levels of performance. But since some Pākehā and Asian students perform poorly and some Māori and Pasifika students do well, then, as the massive Spencer/Russell Sage Foundation study Whither Opportunity (Duncan & Murnane, 2011) suggests, ethnicity seems to be less of a causal factor than family circumstances, particularly wealth and poverty. Indeed, the simple message of the study is that in the USA since WWII, as the gap in family income has widened – as the rich get richer and the poor become poorer – there has been a corresponding increase in the achievement gap at school, with children from wealthy families tending to do well while children from poor families tend to underachieve. This seems to be more than mere correlation: it is a matter of causal relation.

Conceptualising the set of causes has been based on a dualism of within-school and beyond-school factors, but this way of putting things has been particularly damaging. Certainly, some weight must be given to factors within schools, but not to the exclusion of those beyond the school, which may be even more significant. To focus on within-school factors such as quality teachers, with the view that this will, in the long term, close the achievement gap by raising the performance of those at the bottom, is noble in thought but misplaced in practice. To place faith in this side of the dualism is to ignore the profound effect that beyond-school factors play in determining school achievement.
Until these issues, such as the economic distribution of social goods, are addressed, very little will be done to eliminate the social causes of school underachievement, which include family poverty, poor child health, a lack of the family resources which support child learning (e.g., reading books to children) and the like. We would be wiser to reject the dualism and embrace a proximal/distal continuum, where we look closely at those proximal causes close to the action (the teaching of reading in school, hunger at home) which impact directly on the lives of children and which we must do something about with a high degree of immediacy. In addition, moving out towards the distal, we should pay particular attention to altering those other conditions, whether within the school or beyond, which hold in place the structural inequality so central to the growing inequality of school achievement. In the final analysis, the causes are not either within or beyond the school but are a complex set of both, which needs very careful unravelling.

Why is New Zealand’s reading achievement declining? If the results of PISA 2012 are any guide, New Zealand’s reading achievement has declined in two significant respects over the 2000–2012 period: domestically, achievement has trended down, while internationally similar countries have displayed upward trends. One possible explanation is that over the decade parents spent less time teaching their children to read before they started school. Another is that the pedagogical approach to the teaching of reading at school (primarily whole language) is far less effective than once thought. A third possible cause is that the flaws inherent in the remedial programme of
Reading Recovery have become more evident and it cannot stem the tide of growing reading underachievement. There may be other explanations besides. But whatever causes we posit, there is no getting away from the fact that the results of international testing point to a deeply unsatisfactory state of affairs which must be reversed.

Current solutions seem to be misplaced, for they focus on within-school factors with little regard for those other factors which play such an important part in generating and maintaining the inequality of school achievement. It is hard to see how national standards, charter schools, masters-level initial teacher preparation programmes, Reading Recovery, and all of the other within-school solutions promoted by successive Ministers of Education and their Ministry of Education officials will have much impact on arresting the decline in school achievement as measured by international testing. Until such time as the full force of the state is brought to bear on tackling the wider social, political and economic conditions which reinforce structural inequality, there is little reason to think that things educational will improve on the international scene. In the meantime children continue to suffer and be seriously disadvantaged.

Finally, what will work to shorten the long tail of school underachievement by reducing the number of students who underachieve? For starters, the following suggestions are made by Tucker (2013), CEO of the US National Centre on Education and the Economy, in response to the PISA 2012 results:
■■ Carefully study the strategies, policies and practices of the top countries, not to copy them but to take and adapt what they do to meet our own national needs.
■■ Provide more resources for students who are low achievers.
■■ Invest heavily in the skills of teachers, including recruiting high quality students into initial teacher preparation programmes, ensuring that teachers have adequate content understanding, and providing strong teacher preparation programmes.
■■ Put effort into building internationally competitive academic standards, rigorous curricula, and examinations that are based on the full range of complex thinking skills.
■■ Give young children and their parents support before the children start school.
■■ Build effective school systems which rely on government to implement these systems well.
These points come with a warning for New Zealand: “You can look from one end of PISA reports to the other and find no correlation between student performance and use of market forces (charters and vouchers) in education systems. You will find no correlation between what a country or a city spends per student and the average student achievement in that country or city. Nor will you find any correlation between student achievement and the use of systems designed to hold teachers accountable for the performance of their students based on their scores on standardised tests. Which is to say that PISA provides no evidence whatsoever that any component of the current ‘education reform agenda’ in the United States works . . . The current ‘education reform agenda’ is bankrupt. There is no evidence that it can succeed. It is time to embrace a very different education reform agenda, the one that has proven itself in the PISA rankings.”

A Caution

While international measures such as PIRLS, PISA and TIMSS
are important as indicators of how well New Zealand children perform in relation to their peers in other countries, and while we would do well to pay heed to what others have to say about why some countries rise and others fall in the rankings, and what those in decline can do to reverse such trends, it is important to keep reminding ourselves that the solutions to eliminating the ‘long tail of school underachievement’ in New Zealand will only be found by looking in on ourselves to seek plausible explanations and realistic solutions. Schools do not create the inequality of school achievement, although they can and do maintain and reinforce it through their policies and practices; schools, while part of the problem, are also part of the solution, but can never be the whole of the solution. By far the greatest responsibility lies with those outside of schools who make the decisions which impact so significantly on differential achievement.

Politicians and policy makers set down various education policies concerning the allocation of resources to schools to ameliorate the long tail, yet sometimes these policies achieve the very opposite of the end sought. The student achievement component (SAC), for example, designed to provide assistance for students in need of additional support with NCEA, has gone disproportionately to wealthy decile 9–10 private schools rather than needy decile 1–2 state schools. Because the Ministry of Education requires that a psychologist’s report be provided for each applicant, and since poor schools, and even more so poor families, cannot afford the $500-plus cost of this, poor children in state schools are denied the very funding which the SAC was designed to provide, while the children of wealthy parents in private schools are catered for. (We might remind ourselves of what Beeby (1986) had to say about the
education principle which bears his name: “the principle did lay down a general direction of desirable change and was a fairly reliable touchstone to test whether any programme for action fell within its limits”, such that “any proposal to raise the cost to parents . . . would have offended against it” (p.xxiii). How have we come to a place where we can get it so wrong?)

With policies like these in place, it should come as no surprise that New Zealand has a ‘long tail of school underachievement’ which shows no sign of reducing. It is time, then, for New Zealand to ‘embrace a very different education reform agenda’ to the one which has driven our education system since 1987. Until this happens, we are destined to remain in decline while all around us other nations outperform us.
References

ACT Party (2013) Policies – Education. Wellington.
Beeby, C. (1986) Introduction. In W. Renwick, Moving Targets. Wellington: NZCER.
Brook, K. (2013) Reading Recovery Doesn’t Lead to Underachievement. Christchurch: University of Canterbury Communications, 12 August.
Duncan, G. & Murnane, R. (Eds.) (2011) Whither Opportunity: Rising Inequality, Schools and Children’s Life Chances. New York: Russell Sage Foundation.
Education Policy Response Group (2013) The Assessment of Teacher Quality: An Investigation into Current Issues in Evaluating and Rewarding Teachers. Palmerston North: Massey University.
International Association for the Evaluation of Educational Achievement (2012) Progress in International Reading Literacy Study 2011. Boston: Boston College.
Ministry of Education (2008) The Adult Literacy and Life Skills (ALL) Survey: Overview and International Comparisons. Wellington: Ministry of Education.
Ministry of Education (2010) PISA 2009: Our 21st Century Learners at Age 15. Wellington: Ministry of Education.
Ministry of Education (2013) PIRLS 2010/11: An Overview of Findings From the Third Cycle of PIRLS. Wellington: Ministry of Education.
Moir, J. (2013) ‘Under-resourcing’ behind privacy breach. Stuff, 11 October.
New Zealand Educational Institute (2013) Minister’s Attempt to Manufacture an ‘Education Crisis’ Doesn’t Match the Facts. Press release, 27 September. Wellington: NZEI.
New Zealand Principals’ Federation (2013) What to Do About New Zealand’s Underachieving Children. Wellington: NZPF.
OECD (2013) Programme for International Student Assessment. Paris: OECD.
Shadwell, T. (2013) Literacy Programme Has Failed. Stuff/Manawatu Standard, 6 August.
Treasury (2009) Challenges and Choices: New Zealand’s Long-term Fiscal Statement. Wellington: The Treasury.
Treasury (2012) Treasury’s Advice on Lifting Student Achievement in New Zealand: Evidence Brief. Wellington: The Treasury.
Tucker, M. (2013) Statement on Release of PISA 2012 Results. Washington, DC: National Centre on Education and the Economy.
Tunmer, W., Chapman, J., Greaney, K., Prochnow, J. & Arrow, A. (2013) Why the New Zealand National Literacy Strategy Has Failed and What Can Be Done About It. Palmerston North: Massey University.