Inspera analytics (Original)
Note: Users need to have the Reporter role to view the Inspera Learning Analytics.
Inspera released Learning Analytics in Beta to gain targeted experience with how the data is used across the many different test types that can be created in Inspera Assessment. There are also a few notable limitations in the current version:
- Older tests can have missing or less accurate calculations of time spent per question, so this value and values derived from it may be wrong.
- Using advanced scoring rules, such as negative marks, marks per alternative, threshold values on questions, and bands and criteria, can affect the values in ways that make them hard to interpret.
- Questions with manual scoring can be left unmarked by a marker. In these cases, values such as average score, P-value, and correlation will not be correct.
We want to investigate the best approach to resolving the issues listed above before a general release.
Question analytics
- Log in to Inspera Assessment (https://uqi.inspera.com/admin).
- Select Question from the Author drop-down menu.
All questions in your bank will be displayed.
- Navigate to the left side Default Views menu and click Analytics.
- Click on the Filters button.
- Click on the More filters button.
A list of all available Learning Analytics data for each question in a test will be displayed.
Variable | Description |
Attempted | The number of students who answered the question. |
Average time spent | The average number of seconds spent on the question among the students who have seen the question. |
Average score | The average score on the question among the students who submitted this test. |
Correct | The number of students who answered the question correctly. |
Correlation | A number between -1 and 1. The correlation is the extent to which the question score correlates with the total score. Negative or very low positive values may indicate that the question does not discriminate (differentiate) well between students of high and low ability. See https://en.wikipedia.org/wiki/Point-biserial_correlation_coefficient for additional documentation. |
Exposed | The number of students, among those who submitted this test, who opened the question. Equals the sum of "Attempted" and "Omitted". |
Max score | The maximum score on the question. |
Not exposed | The number of students who did not open the question. |
Omitted | The number of students who opened the question without answering it. |
P-value | The P-value is the normalized average score on a question, so its maximum value is 1, which happens if all the candidates answer the question correctly. Note that the P-value can differ from the average score, even if the question has a maximum score of 1, because the P-value is calculated only over the candidates who were exposed to the question. A computation sketch for the P-value and correlation is shown after this table. |
Question order | The position of the question in this test. |
Note: If you use advanced scoring rules with negative scores, it may be difficult to interpret this value.
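The P-value and correlation described in the table above can be reproduced from raw results. Below is a minimal sketch, not Inspera's implementation: the lists `question_scores` (each exposed student's score on one question) and `total_scores` (the same students' total test scores) are hypothetical names introduced for illustration.

```python
# Minimal sketch (not Inspera's implementation) of the P-value and correlation
# definitions from the table above, for a single question.
from statistics import mean, pstdev

def p_value(question_scores, max_score):
    """Normalized average score among the students exposed to the question."""
    return mean(question_scores) / max_score

def correlation(question_scores, total_scores):
    """Pearson correlation between question score and total score
    (point-biserial when the question score is 0/1)."""
    mq, mt = mean(question_scores), mean(total_scores)
    sq, st = pstdev(question_scores), pstdev(total_scores)
    cov = mean((q - mq) * (t - mt) for q, t in zip(question_scores, total_scores))
    return cov / (sq * st)

# Example: 4 exposed students on a 1-mark question
print(p_value([1, 0, 1, 1], max_score=1))          # 0.75
print(correlation([1, 0, 1, 1], [18, 9, 15, 20]))  # ~0.9: question discriminates well
```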
Question usage across tests
The stats in the table view are per test only. To filter on a particular test that you know the question has been used in, use the list filter and filter on the name of the test.
If the same question has been used across several test events, the stats will be replaced by "Multiple values".
To see all the tests the question has been used in, click Multiple values under Question No., and a list of all the test IDs will be shown.
- Copy the ID shown in parentheses and open the URL below (a short sketch of this URL pattern follows the example):
- https://<subdomain>.inspera.no/admin#deliver-test/<test ID>
- This will open the test where the question was used. It is then possible to filter on that test to get the results from that particular test.
Example: https://demo.inspera.no/admin#deliver-test/23099404
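As an illustration only (not part of Inspera), the snippet below builds the deliver-test URL from your institution's subdomain and a test ID copied from the Multiple values list; both values are placeholders taken from the example above.

```python
# Illustrative only: construct the deliver-test URL from a subdomain and test ID.
subdomain = "demo"       # replace with your institution's Inspera subdomain
test_id = 23099404       # the ID shown in parentheses in the Multiple values list
url = f"https://{subdomain}.inspera.no/admin#deliver-test/{test_id}"
print(url)               # https://demo.inspera.no/admin#deliver-test/23099404
```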
Question set analytics
To access the question set analytics, select Question Sets from the Author drop-down menu, then click either Analytics-Basics or Analytics-Advanced.
- Log in to Inspera Assessment (https://uqi.inspera.com/admin).
- Select Question Sets from the Author drop-down menu.
- Click on either the Analytics-Basics or Analytics-Advanced button.
- Click on the More filters button.
A list of all available Question Set Analytics metrics will be displayed.
Variable | Description |
Average total score | The mean score of the total scores achieved by all students who took the test. |
Average total score normalized | The mean score of the total scores achieved by all students who took the test, divided by the max score. |
Average time spent seconds | The average time that was spent on completing the test, in seconds. |
Cronbach Alpha | Cronbach's alpha is a measure of the internal consistency of the test. See https://en.wikipedia.org/wiki/Cronbach%27s_alpha for the formula; a computation sketch is included after this table. |
Highest total score achieved | The highest score that was achieved on the test. |
Lowest total score achieved | The lowest score that was achieved on the test. |
Maximum achievable total score | The highest score that can theoretically be achieved on the test (sum of all max scores for all questions included in the test). |
Number of none attempts | The number of students who did not attempt any of the questions in the question set (i.e. no answer registered for any of the questions). |
Number of questions | The total number of questions that were included in the test. |
Number of test-takers | The total number of students who took the test. |
Number of withdrawals | The total number of students who withdrew from the test. |
Percentage not reached | The percentage of students that did not complete the entire test. |
Standard deviation of the total score | The standard deviation of the total score achieved by all students who took the test. |
Total score 1st quartile | The value of the total test score associated with the 1st quartile. |
Total score median | The value of the total test score associated with the median. |
Total score 3rd quartile | The value of the total test score associated with the 3rd quartile. |
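For reference, Cronbach's alpha follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). Below is a minimal sketch, not Inspera's implementation, assuming a hypothetical list of lists `scores` where `scores[s][q]` is student s's score on question q.

```python
# Minimal sketch of Cronbach's alpha (standard formula), not Inspera's implementation.
from statistics import pvariance

def cronbach_alpha(scores):
    """scores[s][q] = student s's score on question q."""
    k = len(scores[0])                                    # number of questions
    item_vars = [pvariance([row[q] for row in scores]) for q in range(k)]
    total_var = pvariance([sum(row) for row in scores])   # variance of total scores
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Example: 4 test-takers, 3 questions
print(cronbach_alpha([
    [1, 2, 1],
    [0, 1, 0],
    [1, 2, 2],
    [0, 0, 1],
]))  # ~0.82
```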
- Academic Integrity & Assessment Security in Inspera
- Access Inspera
- Access your Inspera test for marking
- Add graders to an Inspera test
- Add late-enrolled students to Inspera assessment CSV
- Add media content to questions in Inspera
- Add one-time users to an Inspera test
- Add the Assumption and Queries question in an Inspera exam
- Adding Resources (pdf files, links) to a Question Set
- Adding staff (contributors) to an Inspera assessment
- Adding the Academic Integrity Statement to your assessment
- Additional pages required for Inspera exams
- Alternative method for downloading Final Marks from Inspera
- Answer key corrections - MCQ
- Apply Alternative Exam Arrangements (AEAs) and Time Zone Adjustments in Inspera
- Assign questions to graders in Inspera
- Complete an Inspera test as a student
- Confirm grades in Inspera
- Copy a question set from Inspera Training to Inspera Admin (Original)
- Create Inspera practice exam with Safe Exam Browser
- Create an Inspera submission link in Learn.UQ
- Create an Inspera test in Deliver
- Create bands and criteria
- Create marking committees in Inspera
- Create marking committees using CSV
- Creating Questions
- Downloading responses to Assumptions and Queries
- Edit question weight in Inspera
- Enable After-test settings in Inspera
- Enrol students in Inspera test using CSV file
- Explanations on student responses
- Export questions from Blackboard to Inspera (Original)
- Exporting a Question Set to PDF
- False Start
- Filter functionality in Inspera marking
- Flag students in Inspera test
- Getting started with Inspera
- Grading workflow for Planners and Graders
- Incident Adjustments for students who experience technical delays
- Inspera Assessment Design Settings
- Inspera Assessment Environments and their Purposes
- Inspera Assessment User Roles
- Inspera Assessment access methods for students
- Inspera Exam Requests
- Inspera Grade Workspaces
- Inspera Observed User Testing
- Inspera Question Set Version Control
- Inspera Recommended Assessment - Standard (non-exam) assessment - webpage
- Inspera Rubrics
- Inspera School-based Exams
- Inspera Test settings
- Inspera analytics
- Inspera central on-campus and off-campus exams
- Inspera marking navigation
- Inviting students to an assessment via Test Code
- Late submissions and extensions in Inspera Assessment
- Manually marked questions: Mark and feedback
- Monitor Assessment
- Name and label questions in Inspera Assessment
- Navigate Inspera
- Override scoring of questions in Inspera
- Pilot an Inspera assessment
- Question sets in Inspera Assessment
- Sections in Question Sets
- Sharing a question set in Inspera Assessment
- Student Arrives Late
- Supporting students to use Inspera Assessment
- Things to look for in review
- Transfer results from Inspera to your Learn.UQ course
- Turnitin similarity report in Inspera
- View student responses in Inspera