Student Item Analysis Non-Writing Online

The Student Item Analysis Non-Writing - Online report shows how a student has performed by NAPLAN item.

This report includes results only for students who participated in the NAPLAN test online. Paper results are reported in a separate report. This report includes all online NAPLAN domains except Writing.

This report is available to school-based staff and network directors.

How does online tailored testing work?

NAPLAN Online uses a feature known as tailored testing, where the test automatically adapts to the student’s performance and presents questions that match the student’s achievement level. All students in each year level start with questions of the same level of complexity. The computer system scores the student’s answers.

The student then progresses to the next testlet. The next testlet includes questions that may be easier or more difficult than those in the prior testlet, depending on the student’s performance. There are up to three testlets per domain.

The student is not exposed to all items, only to the items in their tailored testing pathway.
Refer below for more details:

http://www.nap.edu.au/docs/default-source/default-document-library/tailored-testing-faq.pdf?sfvrsn=2
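
The branching behaviour described above can be pictured as a simple decision rule: score the current testlet, then select an easier or harder testlet next. The sketch below only illustrates that idea; the testlet labels, threshold and scoring are assumptions for the example, not ACARA’s actual algorithm or published values.

```python
# A minimal sketch of the branching idea behind tailored testing.
# The testlet labels, threshold and scoring below are illustrative
# assumptions only, not ACARA's actual algorithm or values.

def next_testlet(current_testlet, proportion_correct):
    """Pick the next testlet based on performance on the current one."""
    # Hypothetical branching table: every student starts on the same
    # first testlet and then moves to an easier or harder one.
    branches = {
        "start":  {"lower": "easier", "higher": "harder"},
        "easier": {"lower": "easiest", "higher": "middle"},
        "harder": {"lower": "middle", "higher": "hardest"},
    }
    threshold = 0.6  # assumed cut-off, not a published value
    direction = "higher" if proportion_correct >= threshold else "lower"
    options = branches.get(current_testlet)
    return options[direction] if options else None  # None = end of test

# Example: 40% correct on the first testlet branches to an easier
# second testlet; 80% correct branches to a harder one.
print(next_testlet("start", 0.4))   # easier
print(next_testlet("start", 0.8))   # harder
```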

How will this report benefit me?

The report allows schools to analyse the NAPLAN performance of individual students, for each item descriptor of the assessment. Used in conjunction with the school-level report, it can indicate which students may need additional help with specific topics or skills.

What does the report provide?

This one-page report has information on:

Item List - Student

  • This table displays each item in the selected assessment and domain that the student was exposed to in their tailored testing pathway.
  • The Item Descriptor shown is not the actual item text, but the outcome expected of that item.
  • Skillset - To assist in organising items, items have been grouped into skillsets by item descriptor. Item descriptors were used to group skillsets because not all items were released by ACARA (see ‘Released’ definition below). Each skillset within a domain is derived from a group of similar, conceptually aligned skills. For example, in Reading, vocabulary is a distinct skillset that requires students to identify, explain or analyse vocabulary used in a text. Note that because items with the same item descriptor may span a range of item difficulties, skillsets may also cover a range of content complexity.
  • Response indicates whether a student’s response to an item presented in the student’s pathway is Correct or Incorrect.
  • The Item Band is shown. The band assigned to an item indicates that students who achieved at or above that band for their overall score would have been likely to get this item correct, while students who achieved below this band would have been likely to get it incorrect.
  • The Item Difficulty is shown. The Item Difficulty assigned to an item indicates that students who achieved at or above that Scaled Score would have been likely to get this item correct, while students below this Scaled Score would have been likely to get it incorrect (see the sketch after this list).
  • Testlet is based on the tailored testing pathway, with pathways labelled A to F as shown in the link below. Each pathway is given a number to reflect a parallel form of the pathway. Students are randomly assigned to parallel forms of the pathway, which contain items of equivalent difficulty. http://www.nap.edu.au/docs/default-source/default-document-library/tailored-testing-faq.pdf?sfvrsn=2
  • Released indicates whether the full item content has been released by ACARA. Only a limited percentage (approximately 25%) of actual questions/items are released publicly. When an item is not released, a link to an Exemplar Item (an item of similar content) is presented rather than the actual item. Note that these Exemplar Items have the same Item Descriptor as the item presented to the student; however, the Item Difficulty may vary.
  • Item Type is shown, indicating the type of item selected to match the skills, knowledge and understandings being assessed in the testlet pathways: HT – Hot Text, ET – Extended Text, MC – Multiple Choice, IGGM – Interactive Graphic Gap Match, IGM – Interactive Gap Match, TE – Text Entry, CO – Composite, IM – Interactive Match, HS – Hotspot, MCS – Multiple Choices.
    Refer below for more details: https://schoolsnsw.sharepoint.com/sites/HTLSRYAG/Shared%20Documents/QRG/Item%20Types.pdf
  • Click on a single item in the table to view details for that item in the Item Details table below.
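
As a rough illustration of the Item Band and Item Difficulty interpretation above, the sketch below compares a student’s overall scaled score with each item’s difficulty. The item descriptors, difficulties and the student score are made-up values for the example, not real NAPLAN data.

```python
# A minimal sketch of reading Item Difficulty against a student's overall
# scaled score, per the description in the list above. The items and the
# student score are invented for illustration; they are not NAPLAN values.

def likely_correct(student_scaled_score, item_difficulty):
    """Students at or above an item's difficulty are likely to answer it correctly."""
    return student_scaled_score >= item_difficulty

# Hypothetical items from one student's tailored testing pathway.
items = [
    {"descriptor": "Identifies the main idea in a text", "difficulty": 480},
    {"descriptor": "Analyses vocabulary used in a text", "difficulty": 560},
]

student_score = 510  # assumed overall scaled score

for item in items:
    label = "likely correct" if likely_correct(student_score, item["difficulty"]) else "likely incorrect"
    print(f'{item["descriptor"]}: {label}')
```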

Item Details

  • This table displays details for an individual Item Descriptor selected in the Item List - Student table above.
  • The Syllabus ID and Syllabus Outcome corresponding to the item are shown in the table.
  • Links to the Curriculum, Exemplar and Teaching Strategy are shown.

Use the slicers provided to select a student and further analyse the data:

  • Assessment Year
  • Executive Director Group
  • Network Name
  • School Name
  • Enrolment Type Group
  • Enrolment Type
  • Cohort Scholastic Year
  • Student Name
  • Assessment
  • Domain – Grammar & Punctuation, Numeracy, Spelling and Reading.
  • Student Band Achieved
  • Item Band
  • Gender
  • Aboriginal
  • EAL/D
  • Participation Type

How can I use the report to support my school self-assessment?

This report can help schools understand the effectiveness of teaching strategies at the school level. It will assist schools with the Student Performance Measures, Effective Classroom Practice and Data Skills and Use elements of the School Excellence Framework (SEF).

What should I look for?

  • Item Descriptors and Syllabus Outcomes where a student has performed significantly below the school average. This may indicate that the student requires additional assistance with those topics or skills. A sketch of this comparison is shown below.
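
The comparison suggested above can also be run outside the report on exported item-level results. The column names, student names, data and threshold in this sketch are assumptions made up for the example.

```python
# A minimal sketch of the comparison suggested above: for each item
# descriptor, compare one student's result with the school average.
# Column names, names and data are assumptions for illustration only.
import pandas as pd

# Hypothetical item-level results for one domain.
results = pd.DataFrame({
    "student_name": ["A. Student", "A. Student", "B. Student", "B. Student"],
    "item_descriptor": ["Analyses vocabulary", "Identifies main idea",
                        "Analyses vocabulary", "Identifies main idea"],
    "correct": [0, 1, 1, 1],
})

school_avg = results.groupby("item_descriptor")["correct"].mean()
student = (results[results["student_name"] == "A. Student"]
           .set_index("item_descriptor")["correct"])

# Flag descriptors where the student sits well below the school average;
# the 25-percentage-point gap is an arbitrary cut-off for the example.
gap = school_avg - student
print(gap[gap >= 0.25])
```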

Where does this data come from?

NAPLAN

How frequently is data updated?

Annually
