Evaluating curriculum implementation
A guide for schools on using evaluative thinking practices to plan, monitor and evaluate activities aligned to the phases of curriculum implementation.
This resource is designed for school principals, executive teams and school staff. Directors, Educational Leadership (DELs) and Principals, School Leadership (PSLs) can also use this resource to guide schools with planning, monitoring and evaluating their SIP and the processes of the QDAI framework.
What and why
Curriculum reform provides an opportunity for schools to place curriculum at the centre of school planning. Effective curriculum implementation drives student growth and attainment, and school improvement. This requires:
- evaluating curriculum implementation activities that focus on the Leading, Teaching and Learning domains within the School Excellence Framework
- a focus on curriculum implementation in areas such as educational leadership, staff capabilities, learning and development, improvement goals, policy, data skills and use, classroom practice, and assessment.
When and how to use
Understanding the principles of evaluative thinking and applying effective evaluation processes enables schools to investigate their curriculum implementation initiatives in a meaningful way. Effective curriculum implementation is an iterative and continuous improvement process, occurring simultaneously for different syllabuses as they are released.
This resource can be used to:
- build staff understanding of what constitutes effective evaluation practices
- strengthen the reliability and validity of evaluation of curriculum implementation practices
- deepen awareness of the principles of evaluative thinking to analyse data effectively
- apply an evaluative mindset to the core processes of aligning curriculum implementation to the Strategic Improvement Plan (SIP) and developing Implementation Progress Monitoring (IPM).
Email questions, comments and feedback about this resource to firstname.lastname@example.org by using the subject line ‘School planning for curriculum implementation’.
Evaluative thinking is a “disciplined approach to inquiry and reflective practice that helps us make sound judgements using good evidence, as a matter of habit.” (CESE, 2021)
“Together, leaders have a responsibility to develop constructive, informed and insightful professional relationships to continuously improve their professional practices. An integral part of the success of this relationship is the ability to lead learning and change through inquiry and evaluative thinking.” (Leading Collaboration for School Improvement Toolkit (PDF 639 KB), staff only)
Key principles of evaluative thinking include:
- suspending judgement and being aware of potential bias
- asking important questions
- using existing evidence well
- strengthening the evidence base.
To effectively evaluate curriculum implementation, schools need to ask the right questions.
Important considerations include:
- clarifying questions upfront to target the analysis process
- ensuring questions are focused and succinct
- tailoring questions to the correct phase of the implementation journey and the type of evaluation being conducted (process or outcome evaluation).
A range of different perspectives can inform the questions being asked:
- the journey so far – how has the plan been implemented (To what extent? How effective?)
- progress towards goals – change from baseline or distance from target (How is it tracking? Unintended outcomes?)
- promising practices – innovations to strengthen (Should this practice be scaled?)
- the working environment – lessons learned and the impact on implementation (What were the enablers and barriers?)
- return on investment – cost-effectiveness of implementation (Was the initiative worth it?)
- the big picture – new insights or opportunities, review of focus (What needs to be adjusted for future activities?).
Once questions have been established, schools need to ensure that the data they collect can answer the question being asked. Considerations include:
- the type of data (qualitative, quantitative)
- the collection method (self-report, observation, assessment)
- the scale (granular – student, class, teacher or aggregate – whole-school).
Remember, where possible, schools should triangulate multiple sources of data to ensure perspectives are balanced. They should also apply the principles of evaluative thinking when analysing the data.
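As a simple illustration of triangulation, the sketch below compares two hypothetical class-level data sources (a perception survey and an internal assessment, both on a 0–100 scale) and flags classes where the sources diverge. All class names, values and the 10-point threshold are invented for illustration, not a prescribed method.

```python
# Hypothetical class-level results from two independent sources:
# a student perception survey and an internal assessment (both 0-100).
survey = {"7A": 72, "7B": 55, "7C": 80}
assessment = {"7A": 75, "7B": 78, "7C": 82}

# Triangulation: where sources agree, confidence in the finding is higher;
# where they diverge, investigate further before drawing conclusions.
for cls in sorted(survey):
    gap = abs(survey[cls] - assessment[cls])
    status = "consistent" if gap <= 10 else "investigate"
    print(f"{cls}: survey={survey[cls]} assessment={assessment[cls]} -> {status}")
```

Here class 7B would be flagged: students report low confidence while assessment results are strong, which is a prompt for further inquiry rather than a conclusion in itself.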
- Evidence of activity – what has taken place for curriculum implementation (What did we do?)
- Evidence of process quality – the quality of the curriculum implementation activities (How well did we do it?)
- Evidence of impact – curriculum implementation outcomes (What difference did it make?)
Table 2 – examples of data that support the evaluation of curriculum implementation activities. These are suggestions for schools to consider, not a progression of data sources.
|Evidence of activity|Evidence of process quality|Evidence of impact|
|---|---|---|
|Professional learning records|Document analysis of teaching and learning programs|Professional learning exit slips|
|Staff or faculty meeting minutes|Professional learning exit slips|Perceptual data: Tell Them From Me (TTFM), student and parent surveys|
|Scope and sequence documents|Perceptual data: TTFM, student and parent surveys|Internal and external student assessment data|
|Document analysis (teaching and learning programs)|Focus group responses|Student work samples|
|Whole-school processes and procedures|Moderation feedback sessions|Focus group responses|
| | |NAPLAN (value-add, expected growth)|
The QDAI framework is an evaluative thinking framework that guides evaluation and supports the systematic collection and interpretation of evidence for curriculum implementation.
The following questions can be used to guide schools in applying and reviewing the QDAI framework as part of their curriculum implementation evaluation practices.
When planning to evaluate curriculum implementation, consider:
- In what ways has the school leadership team provided opportunities to enhance evaluative thinking practices?
- How is evaluative thinking used to monitor curriculum implementation across the school?
- What opportunities are created to discuss potential areas of bias when planning evaluation?
- Is the focus of the evaluation clear? (aligned to SIP initiative; an existing problem to be investigated further; an important question to answer)
- What evidence needs to be obtained (for example, evidence of activity, process quality or outcomes)?
- How can the QDAI framework be applied to support your evaluation process?
- Has the team considered if existing evidence sources and evaluation tools could be utilised?
- Who will be involved in the evaluation process? (Who will lead the evaluation? Who is on the evaluation team? Audience?)
When using evaluation tools to gather data and evidence of the efficacy and impact of curriculum implementation, consider:
- Is the method or evaluation tool suitable, reliable and valid?
- Does the tool consistently measure what it is supposed to measure?
- Does the tool produce consistent results across time (test-retest reliability), across items (internal consistency) and across evaluators (inter-rater reliability)?
- Does the tool accurately measure what it is supposed to measure?
- What critical factors may need to be controlled so that results are valid and reliable?
- Is the focus of the inquiry or key evaluation question sufficiently narrow?
- Is the sample size appropriate? Does it broadly represent the target audience?
- Has more than one evaluation point or method to collect evidence been obtained?
- Have evaluation tools been designed in a collaborative way?
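One common check on inter-rater reliability is Cohen's kappa, which adjusts raw agreement between two evaluators for the agreement expected by chance. The sketch below computes it for invented rubric ratings from two hypothetical teachers scoring the same ten work samples; it is an illustration of the statistic, not a departmental tool.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Inter-rater reliability for two raters scoring the same items."""
    n = len(rater_a)
    # Proportion of items where the two raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's category frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c]
                   for c in set(counts_a) | set(counts_b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical example: two teachers rate 10 work samples against a rubric.
a = ["at", "above", "at", "below", "at", "above", "at", "at", "below", "above"]
b = ["at", "above", "at", "at", "at", "above", "below", "at", "below", "above"]
print(round(cohens_kappa(a, b), 2))  # → 0.68
```

A kappa of about 0.68 here indicates substantial but imperfect agreement, even though the raw agreement is 80%, because some agreement would occur by chance alone.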
When analysing results to determine meaningful evidence of curriculum implementation, consider:
- Are findings generalisable across contexts (school, faculties or stages, teams, classes)?
- Has collaborative analysis of the data been undertaken?
- Have results been triangulated with other sets of conceptually similar data?
- Can you determine the effect size?
- Do the findings show a statistical correlation? Is there sufficient evidence to support a causal claim?
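For the effect-size question, one widely used measure is Cohen's d: the difference between two group means divided by their pooled standard deviation (as a rough guide, 0.2 is small, 0.5 medium and 0.8 large). The sketch below computes it for invented assessment scores; the figures are illustrative only, not real school data.

```python
from statistics import mean, stdev

def cohens_d(group1, group2):
    """Standardised mean difference between two groups,
    using a pooled (sample) standard deviation."""
    n1, n2 = len(group1), len(group2)
    s1, s2 = stdev(group1), stdev(group2)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(group1) - mean(group2)) / pooled

# Hypothetical assessment scores before and after an initiative.
before = [58, 62, 65, 70, 55, 60, 64, 67]
after = [66, 70, 72, 75, 63, 68, 71, 74]
print(round(cohens_d(after, before), 2))  # → 1.61
```

Note that a large effect size on its own still shows correlation, not causation: without a comparison group, other factors (maturation, other initiatives, assessment differences) could explain the change.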
When determining the next steps, consider:
- Do the evaluation findings have real world significance to your context?
- How do the findings inform your next steps?
- Where are the opportunities to strengthen and scale success? In what ways will solutions for ongoing areas of concern be addressed?
- What systems and processes will continue as a part of the next iteration of the curriculum implementation improvement cycle?