
Learning Analytics Enriched Rubric


The Learning Analytics Enriched Rubric (LAe-R) is an advanced grading method used for criteria-based assessment. As a rubric, it consists of a set of criteria. For each criterion, several descriptive levels are provided. A numerical grade is assigned to each of these levels.

An enriched rubric contains criteria and related grading levels that are associated with data from the analysis of learners’ interaction and learning behavior in a Moodle course, such as the number of forum posts, the number of times learning material was accessed, assignment grades and so on.

Using learning analytics from log data that concern collaborative interactions, past grading performance and views of course resources, the LA e-Rubric can automatically calculate the score of the various levels per criterion. The total rubric score is calculated as the sum of the scores of each criterion.

You may click this link to view the Introduction to Learning Analytics Enriched Rubric video.


Version 3.0 release notes

The new version of the LAe-R plugin includes the following enhancements and characteristics:

  • Up-to-date coding according to Moodle’s latest code guidelines.
  • Code re-development for better performance of Data Mining and Learning Analytics generation.
  • The Tab key can be used to jump to the next level/criterion and even to add new criteria or levels.
  • Criteria duplication for faster rubric creation.
  • Negative points can be assigned in levels, for example as a late submission penalty.
  • Improved rubric display for editing, viewing and grading, according to Moodle’s standard themes (Clean, Boost and More), with responsive design for all devices (desktop, tablet, smartphone).
  • GDPR compliance.

The new version was developed using Moodle 3.5 and tested with the Clean, Boost and More themes.

Creating a new Learning Analytics Enriched Rubric

Selecting a Learning Analytics Enriched Rubric

There are two ways a user can choose a LA e-Rubric as an advanced grading method.

  • Make the selection during the creation of an assignment, in the Grade section of the creation form.

gradingfrom-learning-analytics-e-rubric-select1.png

  • Click Advanced grading in the settings block of the assignment and then make the selection from the 'Change active grading method to' select field.

gradingfrom-learning-analytics-e-rubric-select2.png

Editing a Learning Analytics Enriched Rubric

LA e-Rubric editor

In the Advanced grading page of the assignment, the user can

  • Define a new grading form from scratch or,
  • Create a new grading form from a template or,
  • Edit a current form definition

Either way, the grading form editor page appears where the LA e-Rubric can be created or edited.

In that form, the user provides a name for the LA e-Rubric and an optional description, adds or edits the criteria and chooses the options that meet their requirements.

Then the LA e-Rubric can be saved as a draft (for further editing), or saved and made ready for use.

Adding or editing criteria in a Learning Analytics Enriched Rubric

LA e-Rubric add or edit criterion
LA e-Rubric enrichment of criteria

In order to add or edit a criterion, the user can:

  • Add or edit the criterion description.
  • Add or edit the level description and points values.
  • Add or edit the enrichment criterion type (collaboration, grade or study).
  • Add or edit the enrichment collaboration type (simple occurrences, file submissions, forum replies, people interacted), in case collaboration is chosen as the criterion type.
  • Add or delete the corresponding course modules according to criterion type, from which data mining is conducted.
  • Add or edit the operator used for enrichment calculations between the found enrichment benchmark and the level enrichment check values (equal to, more than). This defines whether discrete values or continuous ranges are checked in the comparison operations.
  • Add or edit the checking scope of calculations according to individual student or all students.
  • Add or edit the level enrichment check values needed for setting the check points in comparison operations.

Before adding or editing the above form fields, the user should consider the following (a configuration sketch follows this list):

  • To keep a rubric criterion simple, leave enrichment fields blank. Criteria enrichment is not mandatory!
  • In case of criterion enrichment, all enrichment fields and level values must be edited.
  • A criterion type must be selected first in order for all other enrichment fields to be edited.
  • The criterion type defines the kind of course modules that will be included in the enrichment.
  • In case of collaboration check, the collaboration type field is available and mandatory.
  • The collaboration type defines what kind of checking will be made from the course modules.
  • Collaboration type posts & talks checks simple post and chat talk instances from the logs of the selected course modules.
  • Collaboration type file submissions checks the number of files uploaded ONLY in selected forum course modules.
  • Collaboration type forum replies checks users' replies to posts ONLY in selected forum course modules.
  • Collaboration type people interacted checks the number of classmates a student has interacted with, in the selected course modules.
  • One or more course modules of the selected criterion type and of the particular course should be added.
  • The criterion operator is used for calculating logical conditions associated with the levels enrichment values.
  • Related to All (students) or One (student) defines whether the calculations refer only to the student being evaluated (in which case absolute values are processed) or to all other students (in which case percentages are processed).
  • If the criterion relation is percentage-based, the arithmetic mean of all other students across the selected course modules is used as the students' benchmark.
  • Level enrichment values should be ascending (or descending) according to level ordering, otherwise logical errors may occur during evaluation.
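
To make this configuration concrete, here is a minimal sketch of how an enriched criterion could be represented. The field names and values are hypothetical illustrations chosen for this example; they are not the plugin's actual settings form or database schema.

  # Hypothetical representation of one enriched criterion (illustration only).
  enriched_criterion = {
      "description": "Active participation in course forums",
      "levels": [                      # grade points and enrichment check values,
          {"points": 0,  "check": 2},  # both kept in the same (ascending) order
          {"points": 10, "check": 5},
          {"points": 20, "check": 10},
      ],
      "enrichment_type": "collaboration",          # collaboration / grade / study
      "collaboration_type": "simple occurrences",  # only needed for collaboration
      "course_modules": ["Forum: Week 1 discussion"],  # modules mined for data
      "operator": ">=",                # 'equal to' or 'more than'
      "scope": "one student",          # absolute values vs. percentages
  }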

Checking options of a Learning Analytics Enriched Rubric

The following options can be checked while editing a LA e-Rubric:

Rubric options

LA e-Rubric options
  • Sort order for levels
    Sort the level display according to grade points, ascending or descending.
    Important: the ordering of levels is taken into account by the enrichment in order to pick the appropriate level according to the enrichment check points. For example, if the level grade values are 0 – 10 – 20 – 30, the enrichment check points should be ascending accordingly, for instance 5 – 6 – 7 – 8. Using this example, if the enrichment operator is more than (>=), the enrichment benchmark is calculated to be 9 and the enrichment check points are 5 – 6 – 8 – 7, then 7 will be picked as opposed to 8! (A sketch of this level-picking behavior follows the Rubric options list.)
  • Calculate grade based on the rubric having a minimum score of 0
    This setting only applies if the sum of the minimum number of points from each criterion is greater than 0. If ticked, the minimum achievable grade for the rubric will be greater than 0. If unticked, the minimum possible score for the rubric will be mapped to the minimum grade available for the activity (which is 0 unless a scale is used).
    If you are using a criterion without a 0-points level or with a level with negative points, then the rubric option 'When converting rubric score to points/scale assume that minimum number of points is 0' (new since Moodle 3.2) should be ticked to avoid unexpected grades.
  • Allow users to preview rubric used in the module (otherwise rubric will only become visible after grading)
    Checking this option provides students the ability to preview the LA e-Rubric before they submit their assignment or are graded.
  • Display rubric description during evaluation
  • Display rubric description to those being graded
  • Display points for each level to those being graded
  • Display points for each level during evaluation
  • Allow grader to add text remarks for each criteria
  • Show remarks to those being graded
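
To illustrate why the level ordering matters, here is a minimal sketch of one plausible reading of the level-picking behavior described under Sort order for levels. It is a simplification written in Python for illustration, not the plugin's actual PHP implementation.

  def pick_level(benchmark, check_points, operator=">="):
      """Pick the last level (in display order) whose check point satisfies
      the operator against the calculated benchmark."""
      picked = None
      for index, check in enumerate(check_points):
          satisfied = benchmark >= check if operator == ">=" else benchmark == check
          if satisfied:
              picked = index           # later levels overwrite earlier matches
      return picked

  # With properly ordered check points the level with check value 8 is picked:
  print(pick_level(9, [5, 6, 7, 8]))   # -> 3 (the level with check value 8)
  # With mis-ordered check points (5 - 6 - 8 - 7) the last level has check
  # value 7, so 7 is picked instead of 8, as described above:
  print(pick_level(9, [5, 6, 8, 7]))   # -> 3 (the level with check value 7)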

Enriched criteria options

  • Display enrichment check points for each level to those being graded
  • Display enrichment check points for each level during evaluation
  • Display enrichment of criteria to those being graded
    Un-check this option to hide enrichment of rubric criteria.
  • Display enrichment of criteria during evaluation
    Un-check this option to hide enrichment of rubric criteria.
  • Override automatic criterion evaluation in case of enrichment logical error (If enrichment logical error exists, evaluation is not possible without overriding it!)
    Check this option to enable the evaluator to pick a level according to their own judgment, in case an enrichment benchmark is not found or there is a logical error in the enrichment criteria and a level can't be picked automatically.
  • Enrichment calculations are conducted from assignment available date (if enabled)
    If an availability date is defined for the assignment, check this option to restrict the data mining for enrichment calculations to data logged from that date onwards (see the time-window sketch after this list).
  • Enrichment calculations are conducted until submission due date (if enabled)
    If a due date is defined for the assignment, check this option to restrict the data mining for enrichment calculations to data logged up to that date.
  • Display calculated enrichment benchmark to those being graded
  • Display calculated enrichment benchmark during evaluation
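
As an illustration of the two time-window options above, the sketch below filters hypothetical log records by the assignment availability and due dates before they are counted. The record format is an assumption made for this example, not the format of Moodle's log store.

  from datetime import datetime

  # Hypothetical log records: (user, module, timestamp).
  logs = [
      ("student1", "forum_week1", datetime(2024, 3, 1, 10, 0)),
      ("student1", "forum_week1", datetime(2024, 3, 9, 18, 30)),
      ("student2", "forum_week1", datetime(2024, 2, 20, 9, 0)),
  ]

  available_from = datetime(2024, 3, 1)   # 'from assignment available date'
  due_until = datetime(2024, 3, 8)        # 'until submission due date'

  # Only records inside the window count towards the enrichment benchmark.
  counted = [entry for entry in logs
             if available_from <= entry[2] <= due_until]
  print(len(counted))   # -> 1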

Saving and Previewing a Learning Analytics Enriched Rubric

The user can save this form as a draft for further checking, or save it and make it ready to be used immediately. Either way, afterwards the user can preview the LA e-Rubric form as it was created or edited.

Video tutorial

You may click this link to view the tutorial on how to Create Criteria in Learning Analytics Enriched Rubric video.

Using a Learning Analytics Enriched Rubric to evaluate students

LA e-Rubric evaluation editor
LA e-Rubric evaluation explained

The grading process is where the Learning Analytics Enriched Rubric performs its magic. Data from the log files are analyzed so that all enriched criteria can be evaluated automatically and the corresponding criterion level gets a value. The evaluator can provide optional remarks, and just click ‘save’ or ‘save and grade next’ in order to grade a student.

First, the user clicks on View/grade all submissions in the assignment view page, or in the assignment's settings box. In the grading page of the assignment the user clicks on the grade icon, or chooses Grade from the editing icon in the edit column on the left.

Inside the evaluation form, the user sees all enriched criteria with the enrichment benchmark displayed and the appropriate level chosen for each one. If the enrichment evaluation procedure succeeded, the user can see, in each criterion, the check icon on the enriched level whose value corresponds to the benchmark according to the enrichment.

Handling enrichment evaluation failure

If the enrichment evaluation failed for an enriched criterion, the evaluator can pick a level according to their own judgment ONLY IF Override automatic criterion evaluation is enabled in the LA e-Rubric options. If the enrichment evaluation fails and the evaluator can't pick a level manually, student evaluation won't be possible because not all criteria will have a level checked. In such cases it is strongly recommended to check the enrichment criteria again to avoid these errors, rather than override the enrichment evaluation procedure.

Evaluation according to student

If criterion enrichment evaluation is conducted according to the student's own values, the student's benchmark appears upon success so that the evaluator gets an exact view of the student's performance.

Evaluation according to global scope

If the enrichment evaluation is conducted according to all participating students, then upon successful findings the evaluator views two benchmarks: one for the student currently being evaluated, and one for the average score of all participating students. Again, this is done so that the evaluator gains a better view of the student's performance in reference to all participating students.

Something very important about global scope evaluation is that only students actively participating in the selected course modules of the enrichment are accounted for, which means that they may be fewer than all the students enrolled in a course. This is done for two reasons:

  1. The LA e-Rubric performs a qualitative evaluation of students relative to those participating, not to all. We want to measure true participation and collaboration results that concern active students only.
  2. Another, equally important reason can be explained with an example: let's say that 20 students attend a course and we want to evaluate them according to how much they collaborated during the past week. Let's also say that 5 of them were sick or could not attend that week. It wouldn't be right for the 5 missing students to bring down the whole week's average.

The above estimates apply only when checking collaboration! When checking grades and study behavior, all enrolled students are accounted for in the process. The sketch below illustrates the difference.
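
Here is a minimal sketch of that distinction, under the assumption of a simple per-student activity count; the data and structure are invented for the example and do not reflect the plugin's actual queries.

  # Hypothetical activity counts in the selected course modules.
  # Enrolled but inactive students have a count of 0.
  activity = {"anna": 8, "bob": 4, "chris": 0, "dora": 0}

  # Collaboration checks: average over actively participating students only.
  active = [count for count in activity.values() if count > 0]
  collaboration_benchmark = sum(active) / len(active)     # (8 + 4) / 2 = 6.0

  # Grade and study checks: all enrolled students are taken into account.
  all_counts = list(activity.values())
  overall_benchmark = sum(all_counts) / len(all_counts)   # 12 / 4 = 3.0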

Video tutorial

You may click this link to view the tutorial on how to perform Student Evaluation in Learning Analytics Enriched Rubric video.

How students view the Learning Analytics Enriched Rubric

Preview of a Learning Analytics Enriched Rubric

LA e-Rubric preview

If the corresponding option is enabled, students can preview the LA e-Rubric before they are graded. This is an excellent way to let students know how they will be evaluated and to give them a better view of their evaluation criteria.

In order for the students to preview the LA e-Rubric they just click Submissions grading in the submenu of their assignment on the left.

View grading results produced by a Learning Analytics Enriched Rubric

LA e-Rubric view evaluation results
LA e-Rubric evaluation results explained

After being graded, students can view how their evaluation occurred, and they can also view their own benchmarks for the LA e-Rubric criteria that affected their evaluation outcome.

The LA e-Rubric elements displayed to students are defined in the LA e-Rubric options.

Students view their completed LA e-Rubric when they visit their corresponding assignment page.

Video tutorial

You may click this link to view the tutorial on how students View Evaluation Results in Learning Analytics Enriched Rubric video.

Backup & restore, template sharing and importing a Learning Analytics Enriched Rubric

Procedures concerning backup, restore, import or template sharing are carried out in the same way as for all other advanced grading methods in Moodle.

However, regarding the LA e-Rubric there are some restrictions.

The LA e-Rubric uses specific course modules that belong to the Moodle course in which the assignment is created. Thus, when a LA e-Rubric is restored, imported or shared into another course, those particular course modules won't exist. The structure of the entire LA e-Rubric stays intact, but the user has to replace the missing course modules with similar ones from the new course.

When an entire course is restored, the expected scenario is that most course modules have been given a new id, so this restriction may still be in effect. Again, the structure of the entire LA e-Rubric stays intact, but the user has to replace the missing course modules with the same ones from the restored course, in order to update the course module ids.

During the sharing procedure of a LA e-Rubric, the user gets an informational message concerning this restriction.

If the user imports, restores or uses a LA e-Rubric from another course, another message appears informing the user about the course modules missing from the enriched criteria and advising them to make the appropriate changes so that the LA e-Rubric becomes operational.

The images below display all these messages.

LA e-Rubric missing course modules error
LA e-Rubric form sharing warning
LA e-Rubric edit with errors warning

Grade calculation and Data mining for enrichment in a Learning Analytics Enriched Rubric

Grade calculation

Grade calculation is done the same way as in simple rubrics.

The rubric normalized score (i.e. basically a percentage grade) is calculated as

  $$ s = \frac{\sum_{i=1}^{N} (g_i - m_i)}{\sum_{i=1}^{N} (M_i - m_i)} $$

where $g_i$ is the number of points given to the i-th criterion, $m_i$ is the minimal possible number of points for the i-th criterion, $M_i$ is the maximal possible number of points for the i-th criterion and $N$ is the number of criteria in the rubric.

If you are using a criterion without a 0-points level or with a level with negative points, then the rubric option 'Calculate grade based on the rubric having a minimum score of 0' should be ticked to avoid unexpected grades. Then, the rubric total score is calculated as

  $$ s = \frac{\sum_{i=1}^{N} g_i}{\sum_{i=1}^{N} M_i} $$

where $g_i$ is the number of points given to the i-th criterion, $M_i$ is the maximal possible number of points for the i-th criterion and $N$ is the number of criteria in the rubric.
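
A minimal sketch of the two calculations (written in Python for illustration; the plugin itself is implemented in PHP) may make the difference concrete:

  def normalized_score(points, minimums, maximums):
      """First method: the minimum possible score is subtracted out."""
      return (sum(points) - sum(minimums)) / (sum(maximums) - sum(minimums))

  def min_zero_score(points, maximums):
      """Second method: the rubric is assumed to have a minimum score of 0."""
      return sum(points) / sum(maximums)

  # Example 1 below: one criterion with levels worth 1-4 points, 2 points given.
  print(normalized_score([2], [1], [4]))   # -> 0.333... (33.33%)
  print(min_zero_score([2], [4]))          # -> 0.5 (50%)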

Let's examine a simple rubric with 1 criterion and 4 levels as below:

Example 1: simple rubric with 1 criterion containing 4 levels
              level 1    level 2    level 3    level 4
criterion 1   1 point    2 points   3 points   4 points

The example above has the second level, worth 2 points, checked. According to the above calculation methods, the first method resolves the normalized score as:

  $$ s = \frac{2 - 1}{4 - 1} = \frac{1}{3} \approx 33.33\% $$

According to the second method the normalized score is:

  $$ s = \frac{2}{4} = 50\% $$

So, for someone who built the above rubric presuming that a student who receives 2 points on a 4-point scale should be graded with a score of 50%, the second method meets their assumption.

For a second example, we add another level with 0 points to the previous rubric, like so:

Example 2: simple rubric with 1 criterion containing 5 levels with 0 points minimum score
              level 1    level 2    level 3    level 4    level 5
criterion 1   0 points   1 point    2 points   3 points   4 points

Assuming again that the level worth 2 points is checked, according to the first method the normalized score is:

  $$ s = \frac{2 - 0}{4 - 0} = 50\% $$

According to the second method the normalized score is:

  $$ s = \frac{2}{4} = 50\% $$

So, if the rubric's lowest possible score is zero (0), both methods calculate the same score result.

A more complex rubric example where both methods calculate the same score result (because the lowest possible score is zero) is the one below:

Example 3: complex rubric with 4 criteria containing 5 levels with 0 points lowest possible score
                                         level 1     level 2     level 3     level 4     level 5
criterion 1 (simple criterion)           1 point     2 points    3 points    4 points    5 points
criterion 2 (double weight criterion)    2 points    4 points    6 points    8 points    10 points
criterion 3 (penalty points criterion)   -3 points   -2 points   -1 point    0 points    1 point
criterion 4 (simple criterion)           0 points    1 point     2 points    3 points    4 points

In this rubric the sum of the per-criterion minimums is 1 + 2 + (−3) + 0 = 0 and the sum of the per-criterion maximums is 5 + 10 + 1 + 4 = 20. According to the first method the normalized score is:

  $$ s = \frac{\sum_{i=1}^{4} g_i - 0}{20 - 0} = \frac{\sum_{i=1}^{4} g_i}{20} $$

According to the second method the normalized score is:

  $$ s = \frac{\sum_{i=1}^{4} g_i}{20} $$

so both methods indeed produce the same result, whichever levels are checked.

Learning Analytics for enriching the grading method

The data acquired during log file analysis are distinguished according to analysis indicators, as presented in the cases below (a counting sketch follows the list).

  • For simple occurrences in collaboration, Moodle log data concerning forum posts added and chat talks are counted.
  • For file submissions in collaboration, the number of files attached to forum post messages is counted.
  • For forum replies in collaboration, forum reply post messages are counted (not including replies a user has made to their own posts).
  • For people interacted, forum post and chat message data are measured.
  • For checking study behavior, the number of students' views of the selected course resources is taken into account.
  • For checking grades, Moodle grading scores on selected assignments are processed.
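
As an illustration, the sketch below counts two of these indicators (simple occurrences and people interacted) from hypothetical log records. The record format is an assumption made for this example, not the plugin's internal data structures.

  # Hypothetical log records: (actor, action, module, counterpart).
  logs = [
      ("anna", "forum_post",  "forum_week1", None),
      ("anna", "chat_talk",   "chat_room",   None),
      ("anna", "forum_reply", "forum_week1", "bob"),
      ("anna", "forum_reply", "forum_week1", "chris"),
  ]

  # Simple occurrences: forum posts and chat talks made by the student.
  occurrences = sum(1 for actor, action, _, _ in logs
                    if actor == "anna" and action in ("forum_post", "chat_talk"))

  # People interacted: distinct classmates the student exchanged messages with.
  people = len({counterpart for actor, _, _, counterpart in logs
                if actor == "anna" and counterpart is not None})

  print(occurrences, people)   # -> 2 2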

General advice, tips and instructions

  • Create rubrics with an odd number of levels (3 or 5), so students can grasp the point of low - mid - high assessment values.
  • Use double weight criteria to signify important lesson objectives.
  • Use penalty points (e.g. -1) in levels in moderation, or not at all (use positive reinforcement rather than negative punishment).
  • Use criteria with 2 levels and a definite description (e.g. YES - NO) to clarify simple yet important lesson objectives (e.g. the student always submits assignments on time).
  • Rubric criteria don't have to have the same level descriptions or the same number of levels for each criterion. Moodle rubrics are flexible and adjustable. Use that to meet your assessment needs!
  • An enriched rubric can have simple criteria. They don't all have to be enriched. The difference is that levels of simple criteria must be picked manually by the evaluator during assessment.
  • Use the keyboard's tab key or the new copy criterion button gradingform-erubric-copy-icon.png to create rubrics quickly.
  • First create all course resources and activities and then generate a LA e-Rubric.
  • Create enrichment criteria carefully and thoroughly to avoid logical errors.
  • Don’t delete course resources or activities used in a LA e-Rubric.
  • Log data are needed for evaluation, so don't purge or empty the Moodle data logs until student grading is complete.
  • If there is a request for deletion of user assessment data according to GDPR regulations, it is best to delete these data after the course is completed and student grades have been published.

GDPR Compliance

This plugin is now GDPR compliant. Users with appropriate access can view, download or delete all user assessment data stored by this tool. Even if assessment data are deleted, students' grades will not be affected in the gradebook, but the graded rubric will be blank.

Future improvements

Future improvements may be made in order to:

  • Visualize Learning Analytics with graphs and charts during evaluation for each criterion.
  • Export evaluation outcomes to various formats.
  • Import LA e-Rubrics from known rubric creation tools.
  • Provide default rubric templates for faster rubric creation.

See also

Moodle Docs

Wikipedia

Academic documents

YouTube channel