New Zealand Principal Magazine

What are the chances of gaining ACET?

Cedric Croft · 2019 Term 4 November Issue · News


Eligible teachers may apply for the Advanced Classroom Expertise Teacher Recognition (ACET) award. ACET is awarded on the basis of a principal’s recommendation and a portfolio of evidence which is evaluated by the ACET panel. The first step towards gaining ACET is the principal’s recommendation.

The Ministry has noted that the assessment sheet for each portfolio provides the justification for rejecting a principal’s recommendation. However, ACET policy is clear that the assessment sheet is for candidates who request feedback, and it appears, therefore, that the panel’s practices are contrary to Ministry policy.

1. Principals’ recommendations

The Ministry of Education describes a principal’s recommendation as follows: ‘All certified portfolios which carry a recommendation from the teacher’s principal will be evaluated by the Panel . . . The panel’s role is to confirm or otherwise the principal’s recommendation for ACET recognition. In doing that the panel plays a crucial role in ensuring national consistency.’

The principal’s recommendation is seen as a central aspect of the ACET recognition process, as Ministry policy statements also note: ‘. . . principal’s recommendation therefore carries significant weight in the ACET recognition process’. (OIA 1088111) However, from 2014–2018 inclusive, 379 or 38 per cent of applications with principals’ recommendations were rejected by the panel, meaning that despite their principals’ best judgement these applicants were denied the award. The validity of panel decisions appears never to have been established. The 38 per cent of portfolios with principals’ recommendations that were not confirmed is the average over the five years of the ACET scheme. In one year, 44/75 portfolios with recommendations (58 per cent) were rejected by the panel. This is more than surprising considering the in-depth knowledge each principal must have of each applicant’s achievement of all ACET criteria.

It is clear that the panel will make the final decision, but Ministry policy specifies that the panel ‘. . . will need to provide justification for a departure from the principal’s decision.’ An obvious question is: how has the panel met this requirement? The panel appears not to provide specific justification for a departure from the principal’s recommendation. There is no indication on the panel’s assessment sheet to show that a principal’s recommendation has been duly considered, or to show just why the principal’s recommendation has not been accepted.

2. Portfolios of evidence

All portfolios with a principal’s recommendation are evaluated by the ACET panel. The panel has six members and a chairperson. Ministry documentation obtained under OIA provisions describes the operation of the panel as follows:

‘The panel is made up of seven independent education experts who have demonstrated experience in making evidence-based critical decisions . . . All panellists are trained on the first day the panel sits, and expectations and the approach taken are made clear by the Chair at this time.’

‘Two members of the panel consider each portfolio along with the principal’s recommendation. The panellists look for sufficient evidence within the portfolio that a teacher meets all of the professional criteria . . .’

‘To ensure portfolio evaluation is consistent, fair, and robust, each portfolio is moderated by a second pair of panellists. The panel chairperson then reviews all decisions for consistency and to ensure appropriate feedback are provided . . .’ (OIA 1088111)

For the period 2014–2018, 992 portfolios were submitted and 613 confirmed, a success rate of 62 per cent. In terms of simple probability, the chances of gaining ACET have averaged 0.62. However, 2017 stands out as well below the five-year average, with just 31/75 portfolios confirmed for a success rate of 41.33 per cent. For this year the chances of gaining ACET dropped to 0.41, and the proportion of principals’ recommendations rejected was a five-year high of 58.67 per cent.

Why was there such a drop in 2017? The ACET Chairperson’s report to the Ministry for that year acknowledges the sharp decline, stating, ‘As a panel we reflected on this change but could not identify any particular reason for the change.’ With no explanation in this report for the sharp drop in successful applications, further data were requested from the Ministry under OIA provisions.
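The probabilities quoted above follow directly from the reported counts; a short calculation (using the article’s figures, which come from OIA 1088111, and not independently verified) confirms the arithmetic:

```python
# Figures reported for 2014-2018: 992 portfolios submitted, 613 confirmed.
p_overall = 613 / 992        # five-year chance of gaining ACET, about 0.62

# 2017: 31 of 75 portfolios confirmed, 44 of 75 rejected.
p_2017 = 31 / 75             # chance of gaining ACET in 2017, about 0.41
rejected_2017 = 44 / 75      # rejection rate, the five-year high of 58.67 per cent
```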

3. Variation between panellists

Data for 2017 (which determined awards for 2018) showed considerable variation between panel members. Confirmation rates ranged from 12.5 per cent to 53.33 per cent. Each of the 16 portfolios associated with the evaluator with the fewest confirmations had a probability of 0.125, or 1/8 chances of confirmation. Each of the 30 portfolios associated with the evaluator with the most confirmations had a probability of 0.53, or about 1/2 chances of success. The two panellists with the fewest confirmations were associated with passing 10/44, or just 23 per cent, of the portfolios they evaluated; the same two panellists were associated with failing 34/44, or 77 per cent. Candidates whose portfolios were evaluated by either panellist had diminished chances of success.

As noted earlier, panellists worked in pairs, so additional data were obtained on the confirmation rates for each pair. There were twelve pairs of panellists, with confirmation rates varying from 0.00, i.e. no confirmations (9 candidates across two pairs), to 0.70, or 7/10 chances of being confirmed (10 candidates for one pair). Variations in the chances of success ranging from 0.00 to 0.70 suggest extreme unreliability, which may be seen as unacceptable for high-stakes assessment such as ACET.

Similar variations were found in confirmation rates by the school decile of applicants. Candidates from decile 9 schools had the greatest confirmation rate, with 7/11 or 63 per cent successful. Candidates from decile 10 schools had the fewest confirmations, with 3/13 or 23 per cent successful. These data cannot identify particular bias towards portfolios from decile 10 schools, but on the face of it inequitable outcomes have been revealed.

Ministry criteria as quoted earlier note ‘. . . the panel plays a crucial role in ensuring national consistency.’ No evidence was forthcoming on consistency over the five years, nor was there any indication of the procedures the panel would adopt to achieve this outcome. Given the unreliability reported for 2017, it seems unlikely that consistency from year to year has ever been achieved.

What are possible explanations for the wide variation in confirmations between panellists? According to information received, portfolios were distributed among panellists on a more or less random basis, so there is no reason to suspect that some panellists received portfolios of lesser quality than others. What appears more likely is a lack of common understanding among panellists regarding interpretation of the criteria for evaluation. This is not surprising given the broad and subjective nature of some criteria which form the basis for portfolio evaluation. Furthermore, there are no explicit assessment criteria or rubrics available to panellists, and there appears to have been no validation of the technique of having pairs of panellists working together to arrive at an agreed outcome. This approach seems to have no precedent in the literature on portfolio evaluation. Monitoring procedures in the form of establishing inter-rater reliability were either ignored or unknown.

4. Moderation

It is reasonable to assume that moderation of panellists’ evaluations as outlined in Ministry documents would identify inconsistencies in the original evaluation. But in 2017 this was not the case. Despite the marked variation between panellists and pairs of panellists, no decisions made in the original evaluations were changed by moderation. In 2017 the moderation process was ineffective.

5. No provision for review

Despite the variation in performance of panellists and the fluctuations in confirmation rates from year to year, there is no provision within ACET policy for candidates to seek a review of the panel’s decision. This policy of ‘no review’ seems without precedent in the external examinations environment in New Zealand. It is to the detriment of candidates who may have had their portfolios incorrectly evaluated in the past, but it helps maintain the Ministry view that ACET evaluations are of ‘high quality’ despite the data reported here!

6. Responsibilities for operation of ACET

Where do responsibilities lie for the fair and equitable operation of ACET? Ministry officials have noted that the panel chair has delegated authority to deliver outcomes as laid out in ACET policy documents. This seems not to have happened. The Ministry has a central role in the operation of the scheme, but ACET policy is determined by the ACET Working Group, which has representation from NZEI, STA and the Ministry. NZEI, the Ministry, and the Working Group have not accepted the data reported here and have stated repeatedly in correspondence that ACET procedures are ‘robust’, that they ‘have confidence in the panel’, that they ‘support the outcome of the assessment process’, and that ‘the outcome would be the same’ no matter which panellist evaluated a portfolio. This latter comment
was attributed to the Working Group and suggests they are out of their depth when confronted with issues of reliability. Why there has been such uncritical support for the panel outcomes and ACET policies from NZEI, the Working Group, and the Ministry is a matter for speculation.

What could have been done regarding the data reported to the Working Group? Given a 24 per cent decline in confirmations from 2016 to 2017 (for 2018, confirmations rose by 10 per cent), their first obligation was to acknowledge the Ministry data instead of trying to refute it. A strong case existed (and exists still) for all failed portfolios from 2017 to be reviewed independently, particularly those evaluated by the two panellists noted earlier, and all portfolios from decile 10 applicants. If the standards of previous years had applied in 2017, on average an additional 15 or so portfolios would have been confirmed.

7. Conclusions

What then have been the chances of gaining ACET? From 2014–2018 the chances averaged 0.62. In 2017 they dropped to 0.41. But the average value masks the fact that the actual chances varied from 0.00 to 0.70 depending on who evaluated a portfolio. The choice of panellists appears to have had too much influence on the final outcomes for candidates. It is clear also that the marked drop in confirmations for 2017 resulted from the numbers of portfolios failed by the two panellists noted earlier. This reasonably simple conclusion seems to have escaped the notice of the ACET Working Group, the panel chair, and the Ministry.
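The estimate of ‘an additional 15 or so portfolios’ can be checked with a back-of-envelope calculation from the figures reported earlier (992 submitted and 613 confirmed over five years; 31 of 75 confirmed in 2017); this is the author’s data as published, not a separate Ministry analysis:

```python
# Apply the five-year average confirmation rate to the 75 portfolios
# submitted in 2017 and compare with the 31 actually confirmed.
avg_rate = 613 / 992                 # five-year confirmation rate, about 0.62
expected_2017 = avg_rate * 75        # about 46 confirmations expected
shortfall = expected_2017 - 31       # about 15 portfolios short
print(round(shortfall))              # -> 15
```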

The data suggest a lack of validity, reliability, and fairness in the operation of the ACET panel. One major failure of ACET, in my view, is the Ministry’s refusal to incorporate a review process as standard practice. A review process would provide a safeguard for candidates and bring a form of external accountability to the operation of the panel and the policies of the Working Group. A second major failure is the lack of information regarding just how the panel considers each principal’s recommendation and just why, on average, more than one-third of these recommendations have been overturned. It may be of particular concern that in one year almost 60 per cent of principals’ recommendations were overturned by the panel without the justification required by Ministry policy. The evaluation of many portfolios in 2017 seems to have been carried out in a manner well short of accepted standards for high-stakes assessment. Some unsuccessful candidates might find it deeply concerning to know that two panellists had such a negative influence on the outcomes for their portfolios. Some successful candidates might be grateful their portfolios were evaluated by other panellists.
