Analysing HSC results

This resource outlines best practice examples for analysing your school’s HSC results.

Examples have been provided by Wyndham College and St Marys Senior High School. These schools analyse their HSC results annually to identify their successes, as well as areas for improvement. This analysis helps them communicate HSC successes to the community and inform future planning, including future curriculum offerings, staff professional learning and student learning support.

All reports can be found in the Scout HSC app.

We would like to thank the staff at Wyndham College and St Marys Senior High School for their valuable input into this resource.

Using the Average HSC Score vs SSSG/State report

Image: Average HSC score vs SSSG/State


This report allows comparison of overall school results, Key Learning Area (KLA) results or individual course results against the state average and a statistically similar school group (SSSG). You can look at a specific HSC year or select multiple years to analyse trend data.

Curriculum Head Teachers could consider:

  • What programming, assessment or teaching practices took place, which may have influenced the trends?
  • How does each gender compare against the cohort?
  • How does each equity group (Aboriginal and Torres Strait Islander students, English as an Additional Language/Dialect (EAL/D) students) compare against the cohort?


Leadership teams could consider:

  • What are the overall strengths?
  • What are the overall areas for improvement?
  • Where is curriculum differentiation needed?
  • Are assessment tasks targeting all students, or just some students?

Using the HSC Results report

Image: HSC Results report


The first tab of this report shows HSC performance across Key Learning Areas (KLAs), in the form of a box and whisker plot (or boxplot). More information about boxplots can be found in Using data with confidence on the CESE website.
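A boxplot summarises a set of marks with five numbers: minimum, lower quartile (Q1), median, upper quartile (Q3) and maximum. As a rough illustration only (the marks below are invented, not drawn from either school's data), the summary behind a boxplot can be computed with Python's standard library:

```python
import statistics

# Hypothetical HSC course marks for one cohort (illustrative only)
marks = [45, 52, 58, 61, 64, 67, 70, 73, 78, 85, 92]

# Quartiles with linear interpolation; 'inclusive' matches common
# spreadsheet behaviour. The middle value returned is the median.
q1, median, q3 = statistics.quantiles(marks, n=4, method="inclusive")

five_number_summary = (min(marks), q1, median, q3, max(marks))
iqr = q3 - q1  # the "box" spans the middle 50% of students

print(five_number_summary)  # → (45, 59.5, 67.0, 75.5, 92)
print(iqr)                  # → 16.0
```

The interquartile range is what the questions below about "the middle 50% of the cohort" refer to: half of all students fall between Q1 and Q3.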

The second tab of this report is the HSC Results in Bands report, which shows the distribution of student results in the HSC by course and band.

Curriculum Head Teachers could consider:

  • What programming, assessment or teaching practices took place, which may have influenced the trends?
  • What do the course median and mean (and the difference between them) tell us?
  • What does the performance attainment for the middle 50% of the cohort tell us? (And the top and bottom 25%?)
  • Has there been a change in HSC band achievement?
  • How does each gender compare against the cohort?
  • How does each equity group (Aboriginal and Torres Strait Islander students, English as an Additional Language/Dialect (EAL/D) students) compare against the cohort?


Leadership teams could consider:

  • What are the overall strengths?
  • What are the overall areas for improvement?
  • Where is curriculum differentiation needed?
  • Are assessment tasks targeting all students, or just some students?

Using the Student-level NAPLAN 9 Reading and Numeracy vs HSC report

Image: Student-level NAPLAN 9 Reading and Numeracy vs HSC report


This report displays each student as a dot on a scatter plot, showing their NAPLAN Year 9 reading and numeracy performance (combined average), and their combined HSC performance.

It can be used to determine whether students are maintaining, improving or regressing in their strength of performance from NAPLAN 9 reading and numeracy to the HSC. Students who performed highly in NAPLAN reading and numeracy would be expected to perform well in the HSC, so their dots would appear in the top-right area of the chart. You can drill down into KLAs and individual courses for deeper analysis of results.

When analysing this report, Learning Support staff could consider:

  • What attainment and growth has taken place for students who were on Individual Education Plans, those who applied for HSC special provisions or those identified as ‘at risk’?

Leadership teams could consider:

  • Where is curriculum differentiation needed?
  • Are assessment tasks targeting all students, or just some students?

Using the Student HSC Results report

Image: The Student HSC Results report
This report enables analysis for individual students. It can demonstrate attainment and value-add, both collectively and individually.

Relevant staff members (for example, wellbeing support) could consider:

  • patterns, strengths and areas for improvement for students on an Individual Education Plan/Special Provisions
  • patterns, strengths and areas for improvement for all EAL/D students
  • patterns, strengths and areas for improvement for all Aboriginal and Torres Strait Islander students.

Tips and tools to help analyse HSC reports

Tip 1: Use data from multiple sources

Scout reports, combined with internal and external data sources, can help inform your analysis.

Sources of data include:

  • School-based data
  • NSW Education Standards Authority (NESA) student performance packages
  • Post-school destination data through Universities Admissions Centre (UAC) reports
  • Average HSC score vs statistically similar school groups (SSSG)/State report in Scout
  • HSC Results report in Scout
  • Student-level NAPLAN 9 Reading and Numeracy vs HSC report in Scout
  • Student HSC Results report in Scout.

Tip 2: Break analysis areas up into specialised teams

At Wyndham College and St Marys Senior High School, different teams focussed on the areas most pertinent to their work.

Leadership teams focussed on data at an aggregate level, identifying whole-school areas for improvement. Curriculum Head Teachers, alongside their staff, focussed on their particular subject areas.

Careers and Transition teams focussed on post-school data, while multiple teams investigated data broken down by equity groups and groups with special learning needs.


Tip 3: Provide screenshots of datasets and narrative to explain the findings

In both schools, staff were required to provide screenshots of the datasets within Scout, with accompanying narrative to explain and discuss each dataset. Each piece of narrative included suggestions for next steps and a discussion of the implications for future practice.


Tip 4: Monitor and analyse internal assessment data using TEC

The Educator Calculator (TEC) is a useful tool for analysing your internal data and seeing how it relates to HSC outcomes. You can use it to create interquartile datasets, similar to the boxplots found in the Scout HSC Results report in the School Performance app.

To use TEC, enter the marks from assessment tasks into the Excel tabs. It will automatically calculate:

  • number of values/sum of values
  • mean, median and mode(s)
  • minimum and maximum
  • variance
  • standard deviation
  • confidence interval amount and scale
  • quartiles and interquartile range
  • percentiles
  • boxplots.
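TEC itself is an Excel workbook, but as a rough sketch of what these calculations involve (using invented marks, not real school data), most of the listed statistics can be reproduced with Python's standard `statistics` module:

```python
import statistics

# Invented assessment-task marks, for illustration only
marks = [62, 55, 71, 48, 66, 71, 59, 74, 68, 61]

count = len(marks)                     # number of values
total = sum(marks)                     # sum of values
mean = statistics.mean(marks)          # 63.5
median = statistics.median(marks)      # 64.0
modes = statistics.multimode(marks)    # [71] — every most-common mark
low, high = min(marks), max(marks)     # 48, 74
variance = statistics.variance(marks)  # sample variance
stdev = statistics.stdev(marks)        # sample standard deviation

# Quartiles and interquartile range ('inclusive' uses linear
# interpolation, matching Excel's QUARTILE.INC behaviour)
q1, q2, q3 = statistics.quantiles(marks, n=4, method="inclusive")
iqr = q3 - q1

print(mean, median, modes)   # → 63.5 64.0 [71]
print((q1, q2, q3), iqr)     # → (59.5, 64.0, 70.25) 10.75
```

Here the mean sitting slightly below the median hints at a lower tail of marks pulling the average down, which is exactly the kind of pattern the "median and mean" questions earlier in this resource are probing.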

Example

In this example, HSC assessment results were entered for a 2017 Economics course.

  • TEC 1 tab is Task 1 – Crime: multiple choice and essay format
  • TEC 2 tab is Task 2 – Family: essay format
  • TEC 3 tab is Task 3 – Consumers: essay format
  • TEC 4 tab is Task 4 – Trial HSC examination
  • TEC 5 tab is the final HSC school assessment results
  • TEC 6 tab is the actual 2017 HSC results
  • Tab 7 ‘Boxplots comparisons’ compares all results.

To access TEC and learn more, visit: https://www.cese.nsw.gov.au/effective-practices/using-data-with-confidence-main


Tip 5: Your analysis should include suggested next steps

Some areas may require additional conversations and further analysis to identify the root causes of trends. Wyndham College and St Marys Senior High School both identify clear next steps in their analyses. In the past, next steps have included:

  • professional learning for staff in the delivery of HSC courses
  • program modification
  • curriculum differentiation
  • milestones for future practice
  • curriculum offerings to meet the needs of changing cohorts
  • alignment of student attendance and performance outcomes
  • more course information provided to students
  • post-school transition planning
  • establishing consistent school-based review procedures to evaluate current assessment tools
  • whole-school approaches to assessment design that can meet the needs of all students.

Download the printable version of Analysing HSC results (PDF 635.38KB)
