Research Report
Details
Citation
Gardner J (2010) Report from the Working Group on the InCAS Errors of October 2009. Department of Education for Northern Ireland. http://www.deni.gov.uk/incas_working_group_report.pdf
Abstract
1. There is little doubt that the errors communicated to schools during October 2009 have had a considerable impact on the confidence of school principals, teachers, parents and pupils in the results of InCAS assessments. The first point to be made in this report is that the errors did not arise from any problem specific to the design of the InCAS system or its functions. They were introduced into the computer coding in 2009 and did not exist during the two previous years of successful InCAS usage. The errors were entirely human in origin, and the fixes were simple, involving small corrections to the computer software that analyses pupils' scores. As such they are unacceptable in terms of quality of service and should not have happened.
2. The first error occurred in the General Maths scores of the 94,439 pupils who took this assessment in 886 schools. The results of 34,271 pupils were potentially affected: these pupils attended the 328 schools that had already uploaded their results and downloaded the analyses before the error was corrected. Of these 328 schools, 79 had begun to communicate the erroneous age-equivalent scores to pupils and parents.
3. Once the error was detected, the supplier, the Centre for Evaluation and Monitoring at the University of Durham (CEM), corrected the scores and analysed them to establish the extent of the errors across the Year groups. This analysis showed that 60% of the affected pupils (20,472) had errors in their age-equivalent scores of 1-6 months (with a further 3,129 having errors of less than one month). Of the remainder, 7% (2,450) had errors of more than one year. The large majority of the pupils therefore had relatively small errors in terms of age-equivalent scores.
4. Even in the absence of human error, it is reasonable to expect that parents will ordinarily be concerned by an age-equivalent score below their child's actual age, taking it to mean that their child is falling behind. Their child might indeed be falling behind, but teachers are trained to explain that InCAS scores are estimates that can be influenced by various circumstantial factors. Teachers are required to use the scores to augment judgements of strengths and weaknesses based on their own, more extensive knowledge of the children.
5. In the circumstances of human error, with incorrect age-equivalent scores communicated to parents (and their children), and with teachers and principals required to explain the situation and its implications, it is reasonable to assume that considerable confusion and frustration would have developed. The most substantial impact would have been felt by the parents (and their children) and teachers who had to cope with large discrepancies in the age-equivalent scores that had already been communicated. But many principals and teachers, in the 79 schools mentioned above and in the 83 others that had begun to prepare for meetings with parents, would have had the added burden and frustration of repeating the analyses, interpretations and print-outs.
6. These figures indicate that, overall, only a small proportion of schools, parents and pupils experienced the direct impact of substantial errors in communicated scores. However, media coverage of the errors would have raised concerns and shaken confidence in all schools, and would have extended to many more parents of primary school children than those directly affected.
7. The situation prompted immediate action at the highest levels, supported by public statements from the Minister and the Council for the Curriculum, Examinations and Assessment (CCEA). Assurances were demanded from and given by the supplier, CEM, that the error had been corrected and that the whole system had been thoroughly checked. However, the second error came after these assurances and immediately inflamed the concerns and frustrations of the schools. Even though this error, in the calculation of standardised scores, did not affect the statutory reporting to parents, it did affect all aspects of the various InCAS assessments that are designed to inform teachers' judgements about their pupils' strengths and weaknesses. Again it was relatively easy to correct, but the impact of the error, coming on the back of public assurances that all results were now robust and dependable, was something of a public relations disaster and a major blow to schools' confidence in the InCAS system.
8. These comments (and the detail of the report below) present a gloomy but fair assessment of the impact on schools', parents' and pupils' confidence in InCAS. However, in the manner of a ‘silver lining to every cloud', there are arguably two significant benefits arising from the situation.
9. The first is that teachers, parents and pupils now have good reason to accept that assessment results can be subject to error, albeit usually as a natural degree of in-built uncertainty rather than the more explicit conditions of human error experienced here. For many more people, interpreting scores may in future be a more measured process: being aware of potential errors (whether natural levels of uncertainty or human mistakes) and being vigilant in checking for them.
10. The second benefit, albeit a little belated, is the recognition beyond any doubt that, despite the InCAS assessments not being designed to be ‘high stakes', there is an exceptionally high perception of their importance among parents and schools (teachers and principals). It follows that greater attention is now being paid to quality control in all aspects of the assessment process and reporting. The Department of Education (DE) and CCEA have initiated key areas for action, including ensuring that the quality control processes are sufficient to accept assurances from CEM with confidence, and identifying recommendations for rebuilding the confidence that existed in InCAS before the recent errors occurred.
11. In this latter respect this report sets out a number of recommendations aimed both at the error situations that arose and at wider aspects of computer-based diagnostic assessment in primary schools.
Status | Published
---|---
Publication date | 15/04/2010
Publisher | Department of Education for Northern Ireland
Publisher URL | http://www.deni.gov.uk/incas_working_group_report.pdf