Online Item Analysis

The Online Item Analysis reports show how the school has performed on each NAPLAN item and compare this performance to that of NSW government schools (DoE State).

This report includes results only for students who completed the NAPLAN tests online (all domains, including Writing). Paper-based results are covered in a separate report.

This report is available to school-based staff and network directors.

How does online tailored testing work?

NAPLAN Online uses a feature known as tailored testing, where the test adapts automatically to a student’s performance and presents questions that match the student’s achievement level. All students in a year level start with questions of the same complexity. The computer system scores the students’ answers.

The student then progresses to the next testlet, which contains questions that may be easier or more difficult than those in the prior testlet, depending on the student’s performance. There are up to three testlets per domain.

Students are not exposed to all items; they see only the items in their tailored testing pathway.
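
As a rough illustration of this branching (the actual rules and testlet contents are determined by ACARA; the testlet names and thresholds below are hypothetical), the pathway can be sketched in Python:

    # Hypothetical sketch of a tailored (adaptive) testing pathway.
    # Testlet names and thresholds are illustrative, not ACARA's actual rules.
    def next_testlet(current_testlet: str, proportion_correct: float) -> str:
        """Choose the next testlet from performance on the current one."""
        if current_testlet == "A":  # all students start at the same level
            return "B_harder" if proportion_correct >= 0.6 else "B_easier"
        if current_testlet in ("B_easier", "B_harder"):
            # Up to three testlets per domain; the third branch again
            # depends on performance on the second.
            return "C_harder" if proportion_correct >= 0.6 else "C_easier"
        raise ValueError("No further testlets in this domain")

    # Example: a student who answers 70% of the first testlet correctly
    print(next_testlet("A", 0.7))  # -> B_harder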

Refer to the following for more details: http://www.nap.edu.au/online-assessment

How will this report benefit me?

The report allows schools to analyse their NAPLAN performance on each item (or, for Writing, each criterion of the marking rubric) and to compare this to the performance of DoE students on the same item. This can pinpoint strengths and gaps in the school’s teaching strategy; for example, some items or criteria may have been answered extremely well by the school, indicating a strength in the teaching of that topic or skill.

The report also allows schools to analyse the NAPLAN performance of individual students, for each item descriptor of the assessment. Used in conjunction with the school-level report, it can indicate which students may need additional help with specific topics or skills.

What does the report provide?

This report has information on:

  • School – Item Analysis (online, non-writing)
  • School – Item Analysis (online, writing)
  • Student over time (online, non-writing)
  • Student over time (online, writing)

School – Item Analysis (online, non-writing):

Use the slicers provided to filter and further analyse the data:

  • Which school were they in?
  • Which year?
  • Which assessment?
  • Which domain? (e.g. Reading)

This report provides information on:

School % Correct by Syllabus Area

This chart displays the percentage of correct responses by syllabus area. Hovering over the chart displays Syllabus Code, Response Correctness, Count and Syllabus Outcome. The chart is sorted by Syllabus Code by default; you can change the sort to Response Correctness or Syllabus Outcome.
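
As a minimal sketch of the aggregation behind this chart (the report itself is pre-built; the column names and codes below are purely illustrative):

    # Hypothetical sketch: % of correct responses by syllabus area.
    import pandas as pd

    responses = pd.DataFrame({
        "syllabus_code": ["EN2-4A", "EN2-4A", "EN2-8B", "EN2-8B"],
        "correct":       [True,     False,    True,     True],
    })

    pct_correct = (
        responses.groupby("syllabus_code")["correct"]
                 .mean()    # proportion of correct responses
                 .mul(100)  # as a percentage
                 .rename("% correct")
    )
    print(pct_correct)  # EN2-4A: 50.0, EN2-8B: 100.0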

Band Analysis – School compared to DoE State

This chart compares the percentage of correct responses between the school and the DoE State by test band: the percentage of students within the school who answered the items correctly (% Correct of Exposed) versus the percentage of students across DoE schools who answered the items correctly (Exposed Correct DoE State %). Hovering over the chart displays Test Band, School Correct %, Highlighted, Total Participation and State Total Participation.

Item Analysis – School compared to DoE State

This chart compares the percentage of correct responses between the school and the DoE State for each item. Hovering over the chart displays Question Code (Item ID), School Correct %, State Correct %, Item Difficulty, School Participation and State Total Participation. The chart is sorted by Item by default; you can change the sort to School Correct %, State Correct %, Difficulty or Participation.

Test Item Details

This table displays each item/question in the selected assessment and domain.

The Item Descriptor shown is not the actual item/question text, but rather the outcome expected of that item.

Skillset

To assist in organising items, items have been grouped into skillsets by item descriptor. Item descriptors were used to group skillsets because not all items were released by ACARA (see the ‘Released’ definition below). Each skillset within a domain is derived from a group of similar, conceptually aligned skills. For example, in Reading, vocabulary is a distinct skillset that requires students to identify, explain or analyse vocabulary used in a text.

Note: because items with the same item descriptor may span a range of item difficulty levels, skillsets may also cover a range of content complexity.

Sub Domain

The Sub Domain is a grouping of the skillsets within a domain, e.g. Language, Literacy and Literature in Reading, or Measurement, Geometry, Statistics and Probability in Numeracy.

Item Band

The band assigned to an item indicates that students whose overall score was at or above that band would have been likely to answer the item correctly. Students who achieved below this band would have been likely to answer it incorrectly.

Item Difficulty

The Item Difficulty assigned to an item indicates that students who achieved at or above that scaled score would have been likely to answer the item correctly. Students below this scaled score would have been likely to answer it incorrectly.
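
In other words, Item Difficulty (and likewise Item Band) acts as a threshold on the student’s overall result. A minimal sketch of this interpretation, with invented values:

    # Hypothetical sketch of how Item Difficulty is read: a student at or
    # above the item's scaled-score difficulty was likely to answer it
    # correctly; below it, likely incorrect.
    def likely_correct(student_scaled_score: float, item_difficulty: float) -> bool:
        return student_scaled_score >= item_difficulty

    print(likely_correct(520.0, 496.5))  # True  - likely correct
    print(likely_correct(470.0, 496.5))  # False - likely incorrect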

The table also shows the Number of Students Exposed to the item, the percentage of students exposed out of the total students from the school who participated in the assessment, the percentage of students within the school who answered the item correctly (% Correct of Exposed), and the percentage of students across DoE schools who answered the item correctly (% Correct within DoE).
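
As a worked illustration of how these percentages relate (the counts are invented for the example):

    # Hypothetical counts for one item at one school.
    exposed = 40        # students who were presented with the item
    participated = 100  # students from the school who sat the assessment
    correct = 30        # exposed students who answered correctly

    pct_exposed = 100 * exposed / participated        # % of students exposed
    pct_correct_of_exposed = 100 * correct / exposed  # % Correct of Exposed

    print(f"Exposed: {pct_exposed:.0f}%")                        # 40%
    print(f"Correct of Exposed: {pct_correct_of_exposed:.0f}%")  # 75%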

The ‘Released’ field indicates whether the full item content has been released by ACARA. Only a limited percentage (approximately 25%) of actual questions/items are released publicly. When an item is not released, a link to an Exemplar Item (an item of similar content) is presented instead of the actual item.

Note: these Exemplar Items have the same Item Descriptor as the item presented to the student; however, the Item Difficulty may vary.

Item Type indicates the type of item selected to match the skills, knowledge and understandings being assessed in the testlet pathways.

Refer to the following for more details: https://schoolsnsw.sharepoint.com/sites/HTLSRYAG/Shared%20Documents/QRG/Item%20Types.pdf

Click on a single item in the table to view details for that item in the table below.

Test Item Syllabus Details

This table displays details for an individual item, as selected in the Test Item Details table above.

The actual question text is NOT shown; however, the Question Code, Item Descriptor, Syllabus ID and a more detailed expected syllabus outcome are shown.

Links to the Curriculum, Exemplar and Teaching Strategies are provided.

Student Item Map

Use the slicers provided to filter this section of the data:

  • What school were they in?
  • Which scholastic year?
  • In which calendar year?
  • In an enrolment type group?
  • In a student group type?
  • An enrolment type?
  • A student group?
  • Domain?

This table displays the student’s responses to the questions. A correct response is shown as a green circle, an incorrect response as a red diamond, a question not attempted as a yellow triangle, and a question not exposed to the student is left blank.
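
The legend amounts to a simple mapping from response status to symbol; a sketch, with the status labels assumed:

    # Hypothetical mapping behind the Student Item Map legend.
    ITEM_MAP_SYMBOLS = {
        "correct":       "green circle",
        "incorrect":     "red diamond",
        "not attempted": "yellow triangle",
        "not exposed":   "",  # rendered as blank
    }

    print(ITEM_MAP_SYMBOLS["incorrect"])  # red diamond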

School – Item Analysis (online, writing):

Use the slicers provided to filter and further analyse the data:

  • Which school were they in?
  • Which year?
  • Which assessment?
  • Which domain?

Test Version – Students are presented with one of two possible prompts/test versions. This slicer allows you to filter by Test Version.

Note: Links to the Stimulus and Genre corresponding to the test version are shown.

This report provides information on:

Writing Score Distribution – School compared to DoE State

This chart compares the school’s distribution of writing scores with that of the DoE State. Hovering over the chart displays Score, School %, DoE State %, School Participation and Total State Participation.

Online Writing Rubric

This table displays the score distribution for each criterion, e.g. "Audience". The score distribution for the school is compared to that of the DoE State.

Two test versions are applicable, and results are shown based on the Test Version selected. If no specific Test Version (Prompt 1 or Prompt 2) is selected, scores from both test versions are shown.

Each criterion may have a different maximum score. All possible scores for each criterion are shown.

The Criterion, Criterion Description, Score, Score Description, Syllabus ID, Syllabus Outcome, Teaching Strategy link, School % and State % are shown.

Student Writing Scores by Criterion

Use the slicers provided to filter the data from this section downwards:

  • Which scholastic year?
  • In a student group type?
  • A student group?

This chart displays the writing score achieved by each student, broken down by writing criterion. Hovering over the chart displays Student, Criterion, Score and Online Writing Outcome (i.e. maximum score). The chart can be sorted by Student Name or by Writing Score.

Student – Writing Response Details

This table displays student level Writing Response Details. Selecting a student from the chart above will filter this table to show only the selected student’s responses.

The table displays Student Name, Criterion, Score, Max Score, Criterion Description, Syllabus ID, Syllabus Outcome and Teaching Strategy link.

Student over time (online, non-writing):

Use the slicers provided to filter and further analyse the data:

  • What school were they in?
  • Which scholastic year?
  • In which calendar year?
  • In an enrolment type group?
  • An enrolment type?
  • What is the student’s name?
  • In a student group type?
  • Domain

This report provides information on:

Item Analysis by Assessment and Response Correctness

This chart compares a student’s item-level performance (correct or incorrect) in the selected domain across different assessments, plotted by item difficulty. Each dot represents an item: green dots represent correct responses and red dots represent incorrect responses. Hovering over a dot displays Item ID, Item Difficulty, Assessment and Response Correctness.

Item Analysis by Syllabus

This chart displays the student’s performance on items, comparing the number of correct and incorrect items grouped by Syllabus ID. Hovering over the chart displays Syllabus ID, Response Correctness and Number of Items. The chart is sorted by the number of items per Syllabus ID; however, you can change the sort order.

Number of Questions Presented by Syllabus

This table displays the number of items presented to a student in the online testing pathway, by syllabus code and syllabus outcome, along with the total number of questions. The table can be sorted by any column.

Item Analysis by Item Descriptor

This chart displays the student’s item-level performance, comparing the number of correct and incorrect items grouped by Item Descriptor. Hovering over the chart displays Item Descriptor, Response Correctness and Number of Items. The chart is sorted by Item Descriptor by default; you can change the sort to Number of Items.

Student over time (online, writing):

Use the slicers provided to filter and further analyse the data:

  • What school were they in?
  • Which scholastic year?
  • In which calendar year?
  • In an enrolment type group?
  • An enrolment type?
  • What is the student’s name?
  • In a student group type?
  • A student group?
  • Domain

This report provides information on:

Score by Assessment Name and Writing Rubric

This chart displays the student’s writing score for each assessment, broken down by writing rubric criterion. Hovering over the chart displays Assessment, Writing Rubric and Score.

How can I use the report to support my school self-assessment?

This report can support schools in understanding the effectiveness of teaching strategies at the school level. It assists schools with the Assessment and Reporting, Student Performance Measures, Effective Classroom Practice, and Data Skills and Use elements of the School Excellence Framework (SEF).

What should I look for?

  • Items where there is a large difference in performance between the school and the DoE State, and Item Descriptors or Syllabus Outcomes where a student has performed significantly below the school average. These may point to gaps in the teaching strategy for that topic or skill.
  • Writing criteria where there is a large difference in performance between the school and the DoE State, or where a student has scored significantly low. These may point to gaps in the teaching strategy for that criterion/writing rubric.
  • The latest three years of NAPLAN Online data are provided (from 2018 onwards for Writing). This allows teachers to analyse their current students’ responses in prior NAPLAN assessments.

Where does this data come from?

NAPLAN.

How frequently is data updated?

Annually.
