Paper Item Analysis

The Paper Item Analysis report shows how the selected school has performed by NAPLAN item and compares this to NSW government schools (DoE State).

This report includes results only for students who participated in the NAPLAN test on paper (all domains, including Writing). Online results are reported in a separate report.

This report is available to school-based staff and Directors, Educational Leadership.

How will this report benefit me?

The report allows schools to analyse their NAPLAN performance on each item of the assessment (or each criterion, also known as a rubric, in the case of Writing), and compare this to the performance of DoE students on the same item. This can pinpoint strengths and gaps in the school’s teaching strategy. For example, some items or criteria may have been answered extremely well by the school, indicating a strength in the teaching of that topic or skill.

The report also allows schools to analyse the NAPLAN performance of individual students for each item descriptor of the assessment. Used in conjunction with the school-level report, it can indicate which students may need additional help with specific topics or skills.

What does the report provide?

This report has information on:

  • School – Item Analysis (paper, non-writing)
  • School – Item Analysis (paper, writing)
  • Student over time (paper, non-writing)
  • Student over time (paper, writing)

School – Item Analysis (paper, non-writing):

Use the slicers provided to filter and further analyse the data:

  • What school were they in?
  • Which year?
  • Which assessment?
  • Which domain?

This report provides information on:

School % Correct by Syllabus Area

This chart displays the percentage of correct responses by syllabus area. Hovering over the chart displays Syllabus Code, Response (correct, represented by a tick, or incorrect, represented by a cross), Paper School Response and Syllabus Outcome. The chart is sorted by Syllabus Code by default; however, you can change this to sort by Response Correctness or Syllabus Outcome.

Band Analysis – School compared to SSSG and DoE State

This chart compares the percentage of correct responses for the school and DoE State by question band: the percentage of students within the school who answered the items correctly (% Correct of Exposed) versus the percentage of students across DoE schools who answered the items correctly. Hovering over the chart displays the Question Band and the School Correct % or State Correct %.
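
If you want to reproduce this band comparison outside the report, a minimal sketch is shown below. It assumes a hypothetical item-level export with one row per student per item; the file name and column names (cohort, question_band, correct) are illustrative assumptions, not an official Scout data format.

    import pandas as pd

    # Hypothetical export: one row per student per item, with a 1/0 'correct' flag,
    # the item's question band, and a 'cohort' label such as School or DoE State.
    responses = pd.read_csv("item_responses.csv")

    pct_correct_of_exposed = (
        responses.groupby(["cohort", "question_band"])["correct"]
        .mean()              # proportion of exposed students answering correctly
        .mul(100)
        .round(1)
        .unstack("cohort")   # one column per cohort, for side-by-side comparison
    )
    print(pct_correct_of_exposed)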

Item Analysis – School compared to SSSG and DoE State

This chart compares the percentage of correct responses between the school, SSSG and DoE State for each item. Hovering over the chart displays Question Code (Item ID), School Correct %, SSSG Correct % and State Correct %. The chart is sorted by Question Code by default; however, you can change this to sort by School Correct %, SSSG Correct % or State Correct %.

Test Item Details

  • This table displays each question in the selected assessment and domain.
  • The actual question text is shown, along with the question link, teaching strategy link and stimulus link.
  • The question band is shown. The band assigned to a question indicates that students who achieved at that band or above for their overall score would have been expected to get this question right.
  • School Correct %, SSSG Correct % and DoE State Correct % are shown for each question. The school’s comparative performance at the question level is shown as the SSSG Difference and DoE State Difference (see the worked example after this list).
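
As an illustration of these columns (assuming each difference is calculated as the school figure minus the comparison figure): if 72% of the school’s students answered an item correctly and the SSSG figure is 65%, the SSSG Difference is +7 percentage points, meaning the school outperformed its statistically similar school group on that item. Similarly, an item assigned to Band 5 is one that students who achieved Band 5 or above overall would be expected to answer correctly.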

Question Details

This table displays details for an individual question, as selected in the Question List table.

The actual question text is shown, as well as a more detailed expected outcome, the continuum and syllabus.

Response Details

This table displays details for the responses to the individual question selected above.

  • For multiple choice questions, each potential response is shown with a flag showing which response is correct.
  • For text response questions, the correct response is shown.
  • The % distribution of responses is shown for the school, SSSG and State.
  • Distractor analysis is included where this has been provided by the data owner.

Student Item Map

Use the slicers provided to filter this section of the data:

  • Which scholastic year?
  • A specific group type?
  • A specific group?

This table displays students’ responses to the questions. A correct response is shown as a green circle, an incorrect response is shown as a red diamond and a question that was not attempted is shown as a yellow triangle.

School – Item Analysis (paper, writing):

Use the slicers provided to filter and further analyse the data:

  • Which school were they in?
  • Which year was it?
  • Which assessment?
  • Which criteria?

This report provides information on:

Writing Score Distribution – School compared to SSSG and State

This chart compares the distribution of writing scores for the school with that of the SSSG and DoE State. Hovering over the chart displays Response Code, School % / SSSG % / DoE State % and Total Score.

Writing Analysis – School Compared to SSSG and DoE State

This table displays the score distribution for each criterion, e.g. "Audience". The score distribution for the school is compared to that of the SSSG and DoE State.

Each criterion may have a different maximum score. All possible scores for each criterion are shown.
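
For example, in the NAPLAN writing rubric a criterion such as Audience is typically marked out of 6 while Text structure is marked out of 4, so score distributions are best compared within a criterion rather than across criteria.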

The Criterion, Criterion Description, Score, School %, SSSG %, DoE State %, Continuum, Syllabus ID, Syllabus Outcome, Stimulus link and Teaching Strategy link are shown.

Paper Writing Student Score by Criterion

Use the slicers provided to filter the data in this section and the sections below.

  • Which scholastic year?
  • A specific group type?
  • A specific group?

This chart displays the writing score achieved by each student, broken down by writing criterion. Hovering over the chart displays Student, Criteria, Score and Writing Score. The chart can be sorted by Student Name or by Writing Score.

Student – Writing Response Details

This table displays student level Writing Response Details. Selecting a student from the chart above will filter this table to show only the selected student’s responses.

The table displays Student Name, Criterion, Score, Max Score, Criterion Description, Syllabus ID, Syllabus Outcome and Teaching Strategy link.

Student over time (paper, non-writing):

Use the slicers provided to filter and further analyse the data:

  • What school were they in?
  • Which scholastic year?
  • In which calendar year?
  • In an enrolment type group?
  • In a student group type?
  • An enrolment type?
  • A student group?
  • Which domain?

This report provides information on:

Item Analysis by Assessment and Item Difficulty

This chart compares a student’s item-level performance (correct or incorrect) in the selected domain across different assessments, by item difficulty. Each dot represents an item; green dots represent correct responses and red dots represent incorrect responses. Hovering over a dot displays Question Code, Difficulty, Assessment and Response Correctness.

Item Analysis by Syllabus

This chart displays the student’s performance on questions, comparing the number of correct and incorrect items, grouped by syllabus code. Hovering over the chart displays Syllabus ID, Response Correctness and Number of Items. The chart is sorted by the number of items per Syllabus ID; however, you can change the sort order.

Number of Questions Asked by Syllabus

This table displays the number of questions a student was asked in the test for each syllabus code and syllabus outcome, as well as the total number of questions in the test. The table can be sorted by any column.

Item Analysis by Item Descriptor

This chart displays the student’s item-level performance, comparing the number of correct and incorrect items, grouped by Item Descriptor/Question Description. Hovering over the chart displays Item Descriptor/Question Description, Response Correctness and Number of Questions. The chart is sorted by Item Descriptor; however, you can change this to sort by Number of Items.

Student over time (paper, writing):

Use the slicers provided to filter and further analyse the data:

  • What school were they in?
  • Which scholastic year?
  • In which calendar year?
  • In an enrolment type group?
  • An enrolment type?
  • In a student group type?
  • A student group?
  • What is the student’s name?

This report provides information on:

Writing Score by Assessment Name and Criteria

This chart displays the student’s writing score for each assessment by writing criterion/rubric, showing the score achieved against each rubric. Hovering over the chart displays Assessment, Criteria and Score.

How can I use the report to support my school self-assessment?

This report can support schools to understand the effectiveness of teaching strategies at a school level. It assists schools with the Assessment and Reporting, Student Performance Measures, Effective Classroom Practice, and Data Skills and Use elements of the School Excellence Framework (SEF).

What should I look for?

Look for items where there is a large difference in performance between the school and the SSSG/DoE State. Review both the Item Descriptor and the Syllabus Outcomes where a student has performed significantly below the school average. These may point to gaps in the teaching strategy for that topic or skill.

Look for Writing criteria where there is a large difference in performance between the school and DoE. These may point to gaps in the teaching strategy for that criterion/writing rubric.

Where does this data come from?

NAPLAN.

How frequently is data updated?

Annually.
