The following checklist may assist with reviewing your Inspera assessment before students access and complete it.

  • Is the content correct?
    • spelling errors
    • correct answers in MCQs or other automatically marked question types
    • instruction prompt for MCQ – change the default ‘Select one alternative’ to either ‘Select one’ or remove the prompt altogether
    • instruction prompt for MRQ – change the default to ‘Select ALL that apply’
  • Does it read clearly to someone who was not the author?
  • Are instructions (within the question or task) to students clear and unambiguous?
    • Instead of starting a question with ‘what’ or ‘why’, begin with a verb that indicates what the student needs to do to respond adequately to that question.
  • Have we given students all the instructions – is there anything else they need to know to complete the questions fairly? For example:
    • is pinpoint accuracy important in hotspot questions?
    • is it a Multiple Response Question?
    • for automatically marked SAQs, should students avoid abbreviations and ensure they use correct spelling?
    • will students get part marks on a question?
  • Are the marks allocated to each question correct?
    • preferably, marks should not appear within the question text unless this is absolutely necessary to the sense of the question.
    • do the marks add up to what you think they should? (a quick tally sketch follows this list)
  • Check that all links, images, videos, etc. display and work as intended.
  • Were the people testing the exam able to complete it in the set working duration?
  • If there are automatically marked short answer questions – please enter the answer that occurs to you most naturally, so that it can be added to (or deliberately excluded from) the list of correct options.
  • If there are manually marked short answer or essay questions – is the size of the response box right for how much you want students to write? Do you want to set a word limit? If you set one, add an instruction for students, e.g. ‘Your response is limited to 100 words.’
  • Any other observations.
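
A quick script can double-check that the per-question marks tally with the intended exam total (the marks check above). This is a minimal sketch only – the question identifiers, mark values, and expected total below are illustrative placeholders, not drawn from any real exam:

```python
# Hypothetical sanity check: confirm per-question marks add up to the intended total.
# Question identifiers, marks, and the expected total are placeholders.
question_marks = {
    "Q1": 5,
    "Q2": 10,
    "Q3 (a)": 3,
    "Q3 (b)": 7,
}
expected_total = 25

actual_total = sum(question_marks.values())
if actual_total != expected_total:
    raise SystemExit(
        f"Marks add up to {actual_total}, expected {expected_total} – "
        "recheck the allocation per question."
    )
print(f"OK: {len(question_marks)} questions, {actual_total} marks in total.")
```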

For large question banks / metadata

  • Is every question titled properly?
  • Is every question labeled properly? (a quick spot-check is sketched below)
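
For a large bank, a short script can spot-check titles and labels rather than eyeballing every question. This is a minimal sketch, assuming the question list has been exported (or copied) into a CSV with title and label columns – the file name and column names are assumptions for illustration, not an Inspera export format:

```python
# Hypothetical metadata spot-check: flag questions missing a title or label.
# Assumes a CSV listing of the bank with "title" and "label" columns;
# the file name and column names are illustrative, not an Inspera format.
import csv

with open("question_bank.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

problems = [
    (i, row) for i, row in enumerate(rows, start=1)
    if not (row.get("title") or "").strip() or not (row.get("label") or "").strip()
]

for i, row in problems:
    print(f"Row {i}: missing title or label -> {row}")
print(f"Checked {len(rows)} questions; {len(problems)} need attention.")
```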

To schedule and activate the exam

  • Are the open and close times of the WINDOW correct?
  • Is the DURATION correct?
  • Have all the right contributors been added to the exam?
  • Are the marking committees correctly set up? (the coordinator should be a Grader in Committees as standard)
  • Are the students enrolled correctly?
  • Have all students with extra time or other Approved Exam Adjustments been set up correctly?
  • Are the Design settings correct? Table of Contents, hide/show question titles, etc.
  • Are the Test Settings correct? Auto submission, submit once or multiple times, etc.
  • Are the Assessment Settings (grading parameters) correct?