Quiz statistics report

From MoodleDocs

The quiz statistics report may be viewed by clicking on the quiz and then, from Quiz navigation > Results, selecting Statistics from the dropdown menu. If using a non-Boost-based theme, access Administration > Quiz administration > Results > Statistics.

This report gives a statistical (psychometric) analysis of the quiz, and the questions within it. The top section of this report gives a summary of the whole quiz. The next section gives an analysis showing all questions in a table format. There are links in this section to edit individual questions or drill down into a detailed analysis of a particular question. The last section of this report is a bar graph of the percentage of correct answers (Facility index) and the Discriminative efficiency index.

The full report (overview, and detailed analysis of all questions) can be downloaded in a variety of formats, as can the quiz structure analysis table.

Reasons to look at the statistics report

The Quiz statistics report provides something for the data scientist or mathematician as well as the everyday teacher, and can help both learn how well the quiz is serving students.

Finding 'broken' questions

If everything is working well, the students who achieve a high mark on the whole quiz are more likely to get each question correct. If a mistake was made when creating a question, this pattern may be disrupted. For example, when configuring a multiple choice question, the correct response option might be incorrectly set to earn a zero mark, while a distractor (wrong response option) is set to earn full marks. In this scenario the better students, who achieve a higher overall quiz mark, are more likely to select the correct answer, yet because the question is misconfigured they are marked wrong on it. This type of problem may show up as a small number in the 'Discrimination index' column. However, a broken question is not the only cause of a low Discrimination index, so further investigation is required. Moodle highlights low values in the Discrimination index column with a red background.

Tip: look down the 'Discrimination index' column. If any numbers there are small, then investigate that question.
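The idea behind this check can be sketched in a few lines of Python. This is an illustrative calculation only, not Moodle's exact implementation (see the Quiz statistics calculations article for the precise formulas): here the discrimination index is taken as the correlation between each student's score on the question and their score on the rest of the quiz.

```python
from statistics import mean

def discrimination_index(question_scores, total_scores):
    """Pearson correlation between question scores and rest-of-quiz scores."""
    rest = [t - q for q, t in zip(question_scores, total_scores)]
    mq, mr = mean(question_scores), mean(rest)
    cov = sum((q - mq) * (r - mr) for q, r in zip(question_scores, rest))
    var_q = sum((q - mq) ** 2 for q in question_scores)
    var_r = sum((r - mr) ** 2 for r in rest)
    return cov / (var_q * var_r) ** 0.5

# Hypothetical data: a 'broken' question, where the strong students
# (high quiz totals) score 0 on it, gives a negative index.
broken = discrimination_index([1, 1, 0, 0], [3, 4, 8, 9])  # negative
good = discrimination_index([0, 0, 1, 1], [3, 4, 8, 9])    # positive
```

With real data from a whole class, a small or negative value in the 'Discrimination index' column is the trigger to inspect that question.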

If you find a 'broken' question, there are various options. You may be able to edit the question and then regrade the quiz to fix the problem (but be very careful editing questions after they have been attempted). Or, you could go to the Edit quiz page and set the mark for that question to 0 (zero-weight it).

Ensuring that random variants are fair

There are lots of different ways that randomisation can be used when building a quiz. In this instance we want to consider the use of similar versions of the 'same' question. In formative quizzes this gives students more chances to practise, while in summative quizzes the use of question variations can reduce the opportunity for cheating by collusion. Variations can be set up by using random selection from a category such as 'Variants of question 1', by using question types such as Calculated or STACK that have internal random variants, or both.

If a quiz is being used for summative assessment it is particularly important to ensure that all variants of a question are 'fair'. It is undesirable for one variant to be much harder than the others. To investigate the question variants, examine the subsidiary rows that show all the different questions that appeared in one particular place in the quiz. Look in particular at the 'Facility index', which indicates how many people got that question right. If those numbers are very different for different variants, that is a sign the variants may not be of equal difficulty, leading to an unfair assessment.

Tip: if using variants, look down the 'Facility index' column to make sure that all the different variants of a particular question have a similar facility index.

If there is a large variation between the variants of one question, then you could zero-weight that question. Also, if you are re-using the quiz in future, consider removing the problematic variant and adding a new one. The Discrimination index should also be checked for all the different variants to ensure that none of them are broken.
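This check can be sketched with the simple percentage formula for the facility index (the data here are hypothetical, and Moodle's exact calculation is described in the Quiz statistics calculations article):

```python
def facility_index(marks, max_mark):
    """Average mark obtained on the question, as a percentage of the maximum."""
    return 100 * sum(marks) / (len(marks) * max_mark)

# Hypothetical marks for two variants of the same 1-mark question:
variant_a = facility_index([1, 1, 1, 0, 1], max_mark=1)  # 80.0
variant_b = facility_index([0, 1, 0, 0, 1], max_mark=1)  # 40.0
# A gap this large suggests the variants are not of equal difficulty.
```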

Understanding how students are responding to a particular question

When you set a particular question in a quiz, you normally do so intending to test a particular skill or element of knowledge. However, can you be sure that is what you are really evaluating?

As an example of this, an Open University (OU) 'Maths for Scientists' course had a question which was supposed to test that the student could substitute some numbers into an equation and compute the answer with their calculator. As is standard practice for OU science tests, students had to give the answer to a specific number of decimal places (which had already been taught and assessed). However, when the course team looked at the answers students had given, they found that many students had clearly computed the correct response on their calculator, but were then marked wrong because they had entered too many decimal places in the question response field (see http://oro.open.ac.uk/39669/ Jordan, 2014). In this case the question was no longer just a measure of the student's ability to substitute and compute the correct answer; it had also become an unintended test of their ability to follow a rule about the number of decimal places. This problem could have been avoided by configuring the question to accept a greater number of decimal places in the numeric response, or by making the question instructions specify the expected number of decimal places more clearly.

The statistics report enables the analysis of students' responses to a question. By clicking through to the details about a particular question, at the bottom of that page it shows all the different responses that were submitted, whether each was marked correct or incorrect, and how many students gave each response.

This can be a good place to start looking if some of the checks above highlight a potential problem.

Another way to use this information: if you are using more open-ended question types, such as Short answer or Numerical, you may notice that several students have made the same mistake. You might then go back and edit the question to add specific feedback for students who give that answer.

Tip: particularly if using open-ended questions, look at the Analysis of responses to see the range of responses that are being submitted by students.
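Conceptually, the Analysis of responses is a frequency count of each distinct submitted answer. A minimal sketch, using hypothetical responses to a Numerical question:

```python
from collections import Counter

# Hypothetical submitted responses to a Numerical question:
responses = ["3.14", "3.142", "3.14", "pi", "3.1415926"]

# Tally how many students gave each distinct response, most common first.
for answer, count in Counter(responses).most_common():
    print(f"{answer}: {count}")
```

A response that many students give, but that is marked wrong, is a good candidate for targeted feedback.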

Overall quiz statistics

The following sections describe the information shown in this report.

Quiz information

This section gives some basic information about the test as a whole. You will see:

  • Quiz name
  • Course name
  • Open and close dates (if applicable)
  • Total number of first/graded attempts
  • Average grade for first/all attempts
  • Median grade
  • Standard deviation of grades
  • Skewness and Kurtosis of the grade distribution
  • Coefficient of internal consistency (sometimes called Cronbach Alpha) - This is a measure of whether all the items in the quiz are testing basically the same thing. Thus it measures the consistency of the test, which is a lower bound for the validity. Higher numbers are better.
  • Error ratio - the variation in the grades comes from two sources. First some students are better than others at what is being tested, and second there is some random variation. We hope that the quiz grades will largely be determined by the student's ability, and that random variation will be minimised. The error ratio estimates how much of the variation is random, and so lower is better.
  • Standard error - this is derived from the error ratio, and is a measure of how much random variation there is in each test grade. So, if the Standard error is 10%, and a student scored 60%, then their real ability probably lies somewhere between 50% and 70%.
Example of quiz information section
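The last three quantities in the list above can be illustrated with the standard psychometric formulas. This is a sketch only, using the classical definitions of Cronbach's alpha and the standard error of measurement; Moodle's precise definitions are given in the Quiz statistics calculations article.

```python
from statistics import pvariance, pstdev

def cronbach_alpha(item_scores):
    """Coefficient of internal consistency.

    item_scores: one list per question, giving each student's score on it.
    """
    k = len(item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    item_variance = sum(pvariance(scores) for scores in item_scores)
    return k / (k - 1) * (1 - item_variance / pvariance(totals))

def standard_error_of_measurement(item_scores):
    """Classical SEM: grade standard deviation times sqrt(1 - alpha)."""
    totals = [sum(scores) for scores in zip(*item_scores)]
    return pstdev(totals) * (1 - cronbach_alpha(item_scores)) ** 0.5
```

For example, if every question ranks the students identically (perfect consistency), alpha is 1 and the standard error is 0; in real quizzes alpha is lower and some of the grade variation is random.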

Quiz structure analysis

This section lists all the questions in the quiz with various statistics in a table format.

  • Q# - shows the question number (position), question type icon, and preview and edit icons
  • Question name - the name is also a link to the detailed analysis of this question (See Quiz Question Statistics below).
  • Attempts - how many students attempted this question.
  • Facility Index - The average mark (as a percentage) obtained on the question. A higher value normally indicates an easier question.
  • Standard Deviation - how much variation there was in the scores for this question.
  • Random guess score - the score the student would get by guessing randomly.
  • Intended/Effective weight - Intended weight is simply what you set up when editing the quiz. If question 1 is worth 3 marks out of a total of 10 for the quiz, the intended weight is 30%. The effective weight is an attempt to estimate, from the results, how much of the actual variation was due to this question. So, ideally the effective weights should be close to the intended weights.
  • Discrimination index - this is the correlation between the score for this question and the score for the whole quiz. That is, for a good question, you hope that the students who score highly on this question are the same students who score highly on the whole quiz. Higher numbers are better.
  • Discriminative efficiency - another measure that is similar to Discrimination index.
Where random questions are used, there is one row in the table for the random question, followed by further rows, one for each real question that was selected in place of this random question.
When quiz questions are randomized for each quiz, the quiz module determines a default position.
The Quiz statistics calculations article provides further details on each of these quantities.
Example of statistics structural analysis section
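The intended-weight arithmetic from the table description above is simply the question's share of the total quiz marks:

```python
def intended_weight(question_marks, quiz_total_marks):
    """Intended weight: the question's share of the total quiz marks, as a %."""
    return 100 * question_marks / quiz_total_marks

# The example above: question 1 worth 3 marks out of 10 for the quiz.
print(intended_weight(3, 10))  # 30.0
```

The effective weight, by contrast, is estimated from the results, so there is no simple formula to show here; a large gap between the two is worth investigating.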

Quiz statistics chart

Chart example

Quiz question statistics

From Navigation > quiz name > Results > Statistics, click on any question name to see the statistics for that question on a single page. This view also shows what percentage of quiz takers selected each answer (Analysis of responses) and gives you basic information about the question.

  • Question information - The basic information about the question: the name of the quiz, the question name, the question type, the position in the quiz, and the question text itself. There are preview and edit icons on this page.
  • Question statistics - This repeats the information from the table row from the Quiz structure analysis that relates to this question.
  • Report options - You can choose whether to run the report on all attempts, or just the first attempt by each student. Some of the calculations used in the report are based on assumptions that may not apply to quizzes that allow more than one attempt.

Tip: Computing the statistics takes some time, so the report stores the computed values and re-uses them for up to 15 minutes. There is therefore a display of how recently the statistics were calculated, with a button to recalculate them immediately.

Analysis of individual question responses

This gives a frequency analysis of the different responses that were given to each part of the question. The details of the analysis depend on the question type, and not all question types support this. For example, essay question responses cannot be analysed.

Example of an individual question's response statistics

See also