How to use Scout reports
Scout is the department’s new and improved Business Intelligence tool, developed to provide school and corporate staff with information about what’s working well, and what can be improved.
Download a print version (PDF 179.2KB) of how to use Scout for the School Excellence Framework or view the online version below.
What is Scout?
Scout provides data from disparate sources across the department, as well as external sources, to help you with analysis, planning, reporting and data-driven decision making. With an ever-increasing range of data sets available, Scout is an invaluable resource for schools. Scout is particularly useful for self-assessment and external validation using the School Excellence Framework, as it provides reports that can be used as evidence across the Learning, Teaching and Leading domains.
More information about Scout reports, training, resources and support can be found on the Scout website (login required).
What can Scout tell me about my school?
Scout contains specific reports which are categorised into a number of dashboards. Dashboards relevant to self-assessment and external validation include:
- Enrolment and Attendance
- School Performance
- Student Performance
- School People Management
- Schools Dashboard
- Asset Management
What should I consider when interpreting my reports?
It is important to consider the sample size of any data set you are looking at. In general terms, the larger the sample size, the easier it is to start drawing conclusions. When you have a small sample size, there is the potential for greater variability within the sample, making it harder to draw accurate and valid conclusions. A sample size of at least 10 provides a base from which you can begin to draw some conclusions. If your school has small sample sizes, you may learn something by looking at the trend over time. Consistent patterns over a longer period give us greater confidence that the data shows real change rather than random variation.
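The effect of sample size on variability can be illustrated with a short simulation. The sketch below (not part of Scout; the score distribution is entirely hypothetical) repeatedly draws samples of different sizes from the same population and measures how much the sample averages bounce around. Smaller samples produce a much wider spread of averages, which is why conclusions drawn from them are less reliable.

```python
import random
import statistics

random.seed(42)

def sample_mean_spread(sample_size, trials=1000):
    """Draw `trials` samples of `sample_size` scores from the same
    hypothetical population and return the standard deviation of the
    sample means. A larger spread means any single small sample is a
    less reliable guide to true performance."""
    population_mean, population_sd = 60, 15  # assumed score distribution
    means = [
        statistics.mean(random.gauss(population_mean, population_sd)
                        for _ in range(sample_size))
        for _ in range(trials)
    ]
    return statistics.stdev(means)

small = sample_mean_spread(5)    # e.g. a cohort of 5 students
large = sample_mean_spread(100)  # e.g. a cohort of 100 students
print(f"spread of averages with n=5:   {small:.2f}")
print(f"spread of averages with n=100: {large:.2f}")
```

With only 5 students, a year-to-year swing of several points can be pure chance; with 100 students, the same swing is far more likely to reflect a real difference.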
Frequency of updates
Reports in Scout are updated at different frequencies depending on the original data source. For example, Official Attendance Census data is collected once per year and is therefore only updated every 12 months. The Current Enrolment report, on the other hand, is updated daily. Take the frequency of updates into consideration when analysing your reports, to make sure that your conclusions are relevant to the timeframe of self-assessment or external validation.
Scale
Reports in Scout are automatically scaled to fit the chart or graph to the size of the page, providing an optimised view. This automatic scaling can differ from one chart to the next, which can sometimes make it seem like there is greater or less variability in performance over time. Always consider the scale before analysing the performance measure. For example, a NAPLAN report may have a scale in which each horizontal line represents a difference of 10%. The same report with a scale of 2% would appear to show much greater variability than the chart with the 10% scale.
There may be circumstances in which you analyse Scout reports to assess whether there has been a meaningful change in performance over time. For example, you may want to evaluate the impact of initiatives introduced to address a strategic priority. It is important in these instances to ascertain whether an observed difference represents a real change, rather than normal fluctuations in the data. For more information on interpreting quantitative data, you can visit the School Excellence Framework Evidence Guide - Analysing Quantitative Data.
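One simple, commonly used way to judge whether an observed difference is likely to be real change rather than normal fluctuation is to express it as a multiple of its standard error. This is a generic statistical check, not a Scout feature, and the scores below are made up for illustration only.

```python
import statistics

def difference_in_standard_errors(before, after):
    """Compare two sets of scores (e.g. before and after an initiative)
    and return the difference in means expressed as a multiple of its
    standard error. As a rough rule of thumb, a value beyond about 2
    suggests the change is larger than normal fluctuation."""
    n1, n2 = len(before), len(after)
    diff = statistics.mean(after) - statistics.mean(before)
    standard_error = (statistics.variance(before) / n1
                      + statistics.variance(after) / n2) ** 0.5
    return diff / standard_error

# Hypothetical assessment scores for two cohorts of the same size
before = [58, 62, 55, 60, 64, 59, 61, 57, 63, 60]
after = [66, 70, 63, 68, 72, 67, 69, 65, 71, 68]
print(f"{difference_in_standard_errors(before, after):.1f} standard errors")
```

A result well above 2 would support the conclusion that the initiative coincided with a genuine shift in performance, though it cannot by itself establish why the shift occurred.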
How can I use my Scout reports with other data?
Scout reports provide a detailed view of school performance in a number of areas. However, it is important that Scout reports are used alongside other internal and external data sources.
While Scout reports can provide you with data on a number of student, teacher, leader and school performance measures, they cannot explain how or why this level of performance has been achieved. For example, your Value Added report may demonstrate a statistically significant increase in student growth over time, but it cannot tell you why or how this has been achieved. To answer these questions, you need to look at a broader range of reports and other data sources.
Internal performance data is often very rich and can provide a lot of detail on student performance, particularly in years not covered by assessments like NAPLAN or HSC. This data could include A-E report card grades; results of formative assessment (through testing or homework) undertaken by teachers as part of their classroom activities; or assessment as part of specific interventions such as MultiLit.
In some cases, student performance data can identify particular focus areas that may be best investigated using methods such as teacher observation.
To see examples of the types of evidence that could be used in the self-assessment process, please visit the sources of evidence page on the Evidence Guide.
For guidance on how to collect and analyse data, visit the guidelines for using data.
Scout as a source of evidence
The next page summarises the Scout reports that may assist schools with self-assessment and external validation for each element of the School Excellence Framework.