What should I consider when interpreting my reports?

Below is a suggested order of steps to get the most out of your SEF data reports. Consider these steps when using your SEF reports to make judgements against the Student Performance Measures element in the Framework.

Note any measures in which your school is notably different from its similar schools group.

If you lead a primary school, pay particular attention to the proportion of students who are outside the government system in Year 7, and the difference in Year 5 scores between leaving and staying students. If both of these numbers are large and positive (proportion above 20 per cent; difference in scores above 20), this indicates that many of your higher-performing students could not be tracked into Year 7, and the measures that rely on Year 7 scores (VA5-7, and Year 7 students in the top two bands) may not be accurate reflections of your school.

If you lead a secondary school, pay particular attention to the proportion of schools in your similar schools group that are partially selective. If you lead a non-selective school and this proportion is high (more than 20 per cent), you might expect your similar schools group to be higher-performing than your school in most measures, because those schools enrol many students selected for high academic ability. If you lead a partially selective school, you might expect your similar schools group to be lower-performing than your school in most measures, because your school enrols many students selected for high academic ability. Adjust your expectations accordingly.
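
If it helps to make these context checks concrete, here is a minimal sketch of them as simple rules. The field names (pct_outside_system_year7, score_gap_leavers_vs_stayers, pct_partially_selective_in_group) are hypothetical placeholders rather than fields from the actual SEF reports; only the 20 per cent and 20-point thresholds come from the guidance above.

```python
# A rough sketch of the context checks described above. The field names are
# hypothetical placeholders, not actual SEF report fields.

def primary_context_flag(pct_outside_system_year7: float,
                         score_gap_leavers_vs_stayers: float) -> bool:
    """Flag when Year 7 measures (VA5-7, Year 7 top two bands) may not reflect
    the school, because many higher-performing students left the government
    system after Year 5 and could not be tracked into Year 7."""
    return (pct_outside_system_year7 > 20.0
            and score_gap_leavers_vs_stayers > 20.0)

def secondary_context_flag(pct_partially_selective_in_group: float,
                           school_is_partially_selective: bool) -> str:
    """Suggest how to adjust expectations against the similar schools group."""
    if school_is_partially_selective:
        return "Expect the similar schools group to look lower-performing than your school."
    if pct_partially_selective_in_group > 20.0:
        return "Expect the similar schools group to look higher-performing than your school."
    return "No adjustment suggested by this check."

# Example: a primary school where 25% of students leave the system before
# Year 7 and leavers scored 30 points higher than stayers in Year 5.
print(primary_context_flag(25.0, 30.0))   # True -> treat Year 7 measures with caution
```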

In the value-added graphs, the black horizontal line represents the growth of the average NSW government student between the two scholastic years (e.g. Year 3 to Year 5). The blue square indicates an estimate of the growth that we would expect this average student to achieve if they were in your school. The range above and below the blue square represents the confidence interval: your school's true value could be anywhere within this range.

[Image: value-added graph showing confidence intervals and statistical significance]

When interpreting this report, take note of four things:

  1. Whether the blue square is above or below the horizontal line. This indicates whether the best estimate of the value your school is adding is above or below the value added by the average school. However, do not put too much stock in the precise position of the square – your school’s performance could be anywhere within the confidence interval.
  2. Whether the confidence interval is above or below the horizontal line. If the confidence interval does not overlap the horizontal line, we are highly confident that your school adds significantly more (above) or less (below) value than the average school. If this is the case for your school, you should put a lot of stock in it (the sketch after this list illustrates this estimate-versus-interval check).
  3. The VA category for your school. This is shown in a blue box below the left-hand graph and is determined by whether the estimate and confidence interval are above or below the line. This is a short-cut to help you interpret the data. You do not have to self-assess as ‘Delivering’ simply because one of the VA categories for your school says ‘Delivering’. However, it might take a lot of other evidence to credibly justify a self-assessment of ‘Excelling’ if your VA category is ‘Working towards Delivering’. Note that in many cases this category might be different depending on the scholastic year range you are looking at.
  4. Trend over time. Note whether the blue square and confidence interval are increasing or decreasing over time. Schools that have below-average value-added can nevertheless be improving their contribution to students’ growth every year, or vice versa for above-average schools.
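
To make points 1 and 2 concrete, here is a minimal sketch of the estimate-versus-interval logic, assuming the graph can be summarised as an estimate plus a confidence interval, with values expressed relative to the horizontal line. It is illustrative only: the exact rules used to assign VA categories are not reproduced here.

```python
# Illustrative only: values are expressed relative to the horizontal line
# (so the average-school line sits at 0). The real VA category mapping is
# determined by the department and is not reproduced here.

def interpret_value_added(estimate: float,
                          ci_lower: float,
                          ci_upper: float,
                          average_line: float = 0.0) -> dict:
    """Summarise a value-added graph: estimate vs. the average line, and
    whether the confidence interval clears the line (points 1 and 2)."""
    above_line = estimate > average_line
    significantly_above = ci_lower > average_line     # whole interval above the line
    significantly_below = ci_upper < average_line     # whole interval below the line

    if significantly_above:
        reading = "Highly confident the school adds more value than the average school."
    elif significantly_below:
        reading = "Highly confident the school adds less value than the average school."
    else:
        reading = ("Interval overlaps the line: the school's value-added is not "
                   "clearly different from the average school.")

    return {
        "estimate_above_line": above_line,
        "interval_clears_line": significantly_above or significantly_below,
        "reading": reading,
    }

# Example: estimate above the line, but the interval still crosses it, so do
# not put too much stock in the precise position of the square.
print(interpret_value_added(estimate=3.1, ci_lower=-1.2, ci_upper=7.4))
```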

The Attainment report is important because it relates to the Premier’s Priorities targets of increasing the proportion of students in the top two NAPLAN bands for Reading and Numeracy.

[Image: screenshot of the SEF Attainment report]

When interpreting this report, take note of four things:

  1. Your school’s performance relative to any SEF thresholds indicated on the graphs. These are indications of the performance that is expected from schools at each level of the Framework. If your school is above the ‘Excelling’ line, this does not by itself mean you are Excelling. However, sometimes these can be useful guidelines in interpreting the graphs.
  2. Your school’s performance relative to your similar schools group. While the government school average is reported in the graph, this is not always the best comparison, especially if your school is very different from the average school (for example, very high- or low-SES). It is often better to compare to your similar schools group, which represents the other schools whose students are most similar to yours. If your school is consistently exceeding its similar schools group, it might be an indication that your practices are particularly effective.
  3. Trend over time. Even if students at your school are high-performing, large, consistent decreases over time may be cause for concern. The trend for your school may jump around a lot from year to year, especially when the sample size for your school is small. When this happens, don’t read too much into the change between two individual years. Also take note of the scale of the right-hand graph – sometimes a big jump in the line can represent only a few percentage points (a short worked example follows this list).
  4. Consistency between different measures. It is important to look at your reports together and be realistic about what your results are telling you about your school, rather than singling out specific items because they are positive. In some schools, all of the measures might indicate consistently high or consistently low performance. Other schools might have differential performance between year levels, or between higher- and lower-performing students (as measured by the top two bands and National Minimum Standard measures). These differences should be acknowledged; it is important to identify the things your school is doing well, and the things that may need to be improved.
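
To illustrate points 2 and 3, the sketch below compares a school’s top two bands proportion with its similar schools group (SSG) over a few years. All figures are hypothetical, and ‘pp’ means percentage points; the point is that an apparently big jump on a zoomed-in graph may amount to only a few percentage points, which in a small cohort can be just a handful of students.

```python
# Hypothetical figures, for illustration only: proportion of students in the
# top two NAPLAN bands, for the school and its similar schools group (SSG).
school = {2019: 28.0, 2020: 31.5, 2021: 26.0, 2022: 33.0}   # per cent
ssg    = {2019: 27.0, 2020: 27.5, 2021: 28.0, 2022: 28.5}   # per cent

# Gap between the school and its SSG each year (pp = percentage points).
for year in sorted(school):
    gap = school[year] - ssg[year]
    print(f"{year}: school {school[year]:.1f}%, SSG {ssg[year]:.1f}%, gap {gap:+.1f} pp")

# Year-to-year changes for the school: what looks like a big jump on the
# graph may only be a few percentage points, and in a small cohort a handful
# of students can account for the whole movement.
years = sorted(school)
for prev, curr in zip(years, years[1:]):
    print(f"{prev}->{curr}: change of {school[curr] - school[prev]:+.1f} pp")
```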

If you lead a secondary school, keep the Retention report in mind when looking at the HSC top two bands measure. Because this measure is expressed as a proportion of the number of students undertaking the HSC, a recent, substantial increase in retention at your school can produce a decline in the proportion of students with results in the top two bands, even if the number of high-achieving students has not changed. Compare the trends in both measures over time and relative to the similar schools group when interpreting these results.
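
A small, hypothetical arithmetic example of this retention effect: the same number of students achieve top-two-band results in both years, but because more students stay on to attempt the HSC, the proportion falls.

```python
# Hypothetical example: the number of students with top-two-band results is
# unchanged, but increased retention enlarges the HSC cohort, so the
# *proportion* in the top two bands falls.
top_two_band_students = 30

hsc_cohort_before = 100   # before retention increased
hsc_cohort_after  = 140   # after more students stayed on to the HSC

print(top_two_band_students / hsc_cohort_before * 100)  # 30.0 per cent
print(top_two_band_students / hsc_cohort_after * 100)   # about 21.4 per cent
```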

The Attainment of Equity Groups report is also relevant to the Student Performance Measures element of the School Excellence Framework. However, because it compares performance between subgroups (high-SES and low-SES students, or Aboriginal and non-Aboriginal students), these measures may have very small sample sizes in one or both groups in many schools. This often results in large movements in the trend over time, because a change involving only one or two students can shift a small group’s proportion substantially (see the example below). If this is happening in your school, do not put too much stock in this report.
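
A hypothetical example of why small subgroups produce volatile trends: in a subgroup of five students, a single student moving into or out of the top two bands shifts the reported proportion by 20 percentage points.

```python
# Hypothetical example: the shift in the reported proportion caused by a
# single student moving into or out of the top two bands.
small_subgroup = 5
larger_subgroup = 50

print(1 / small_subgroup * 100)    # 20.0 percentage points per student
print(1 / larger_subgroup * 100)   # 2.0 percentage points per student
```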
