Targeted Early Numeracy (TEN): Final evaluation report

This report was originally published 20 September 2021.

Summary

Targeted Early Numeracy (TEN) is an Early Stage 1/Stage 1 intervention aimed at students in Kindergarten to Year 2 whose facility with numbers suggests they are at risk of scoring in the lowest two bands in numeracy NAPLAN in Year 3. The intervention is intended to enable teachers to support Kindergarten to Year 2 students to achieve minimum standards of numeracy by the end of Year 2. TEN was developed and introduced by the NSW Department of Education in 2009 as a small group intervention.

The evaluation of TEN was conducted by the Centre for Education Statistics and Evaluation in 2018 and 2019 and considered the impact, implementation and use of the intervention.

Key findings

Impacts of TEN on student learning and teacher practice

  • We have no evidence that TEN is achieving its goal of supporting Kindergarten to Year 2 students’ facility with numbers to reduce their risk of scoring in the lowest two bands in numeracy NAPLAN in Year 3. We have been unable to effectively measure TEN’s impact on numeracy learning outcomes due to the reduced departmental oversight of TEN and the inconsistent implementation of the intervention in schools. One reason for this reduced departmental oversight was the delegation of key decision-making to schools, which left the department without data on which schools had adopted TEN and how they were deploying it.
  • We do not have outcome data to measure changes in teacher practice as a result of TEN, again due to reduced departmental oversight and inconsistent implementation of TEN.
    • However, most educators reported increased confidence in teaching numeracy and a better understanding of numeracy teaching practices through their use of TEN.
    • Educators also indicated that using TEN led them to adjust their own teaching practice, including altering how they implemented the intervention, making curriculum adjustments and changing their learning and teaching strategies.

Why are we unable to effectively measure TEN’s impact?

The department has reduced oversight of the intervention, at least in part due to the move to delegate greater decision-making to schools and the transfer of professional curriculum support into schools. As a result:

  • The implementation of TEN is inconsistent. TEN is not being implemented in schools as intended, and we therefore cannot adequately measure whether it is meeting its intended goals.
  • Implementation varies in terms of year group, targeted students, frequency of lessons, grouping of students, assessment of students and the areas of the mathematics syllabus targeted through TEN.
  • The original facilitated training model has changed over time. There is now a lack of consistency in both the quality of TEN training and its delivery.
  • While principals continued to support the implementation of TEN in their schools, educators no longer had access to the original intensive training model for TEN.

Lessons learned

The department needs to know whether interventions, such as TEN, lead to positive student outcomes. For the department to be able to measure the effectiveness of interventions, evaluation needs to be built into the development of interventions. This would enable the department to access necessary data to complete a rigorous and reliable outcome evaluation.

Evaluation should also be an ongoing process for the duration of the time interventions are implemented in schools, and in particular, should be prioritised during the scaling-up phase of an intervention.

Key considerations

In improving numeracy interventions and measuring their effectiveness, it is important that the department:

  • maintains and supports program fidelity by ensuring interventions align to the current syllabus outcomes
  • maintains adequate records and corporate administrative knowledge
  • provides educators with high quality, evidence‑based training and ongoing support and professional learning
  • builds evaluation into the development of interventions and prioritises evaluating interventions throughout their lifecycles.

It is also the responsibility of schools to ensure that interventions are implemented as intended in classrooms, and do not become a replacement for the syllabus.
