Moodle Automatic Regression Test and Inspection Nit-picker

Recently, thanks to Jordan and Eloy, we have accumulated a number of useful automated tests of the Moodle code at http://moodle.org/cvs. They are:

  • Unmerged files list,
  • Code syntax check, and
  • Unit test and code coverage results.

This article proposes another addition to this suite that I think will give us a lot of bang for very little buck - especially as I am hoping that Jordan can implement this, so it is no effort for me;-)

Separate automated tests from CVS admin

However, first I would like to propose one other change. This growing list of automated tests is starting to make the CVS admin interface a bit cluttered. I suggest separating CVS admin from the automated tests. The title of this article is my proposed name for our automated test suite ;-).

Now on to the main proposal

Automated regression tests

The idea is that you set up your site by doing a fresh install, then restoring a copy of the Moodle features demo course, and then turning on debugging and performance logging.

Then there is a simple script that takes a list of URLs like

  • course/view.php?id=2
  • mod/forum/view.php?id=12
  • ...

(basically as complete a list of URLs as we can think of) and, having logged in with a test user account, it requests each page and captures the result.
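For illustration, here is a minimal sketch of that fetch loop in PHP, using curl with a cookie jar to hold the test user's session. The base URL, the login details and the URL list are placeholders for this example, not part of any existing script.

<?php
// Minimal sketch: log in once with the test account, then fetch each page
// and capture the response body. The base URL, credentials and URL list
// here are placeholders only.
$base = 'http://test.example.com/moodle';
$urls = array('course/view.php?id=2', 'mod/forum/view.php?id=12');

$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/martin-cookies.txt');
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/martin-cookies.txt');

// Log in by posting to the standard Moodle login form.
curl_setopt($ch, CURLOPT_URL, $base . '/login/index.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'username=testuser&password=testpass');
curl_exec($ch);

// Fetch each page and keep the body for the checks described below.
$responses = array();
foreach ($urls as $url) {
    curl_setopt($ch, CURLOPT_URL, $base . '/' . $url);
    curl_setopt($ch, CURLOPT_HTTPGET, true);
    $responses[$url] = curl_exec($ch);
}
curl_close($ch);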

For each of those pages, it just does some very simple tests:

  1. Try to parse the response as XML. Is it well-formed?
  2. Check the PHP logs to make sure there are no errors, warnings or notices generated by this page.
  3. Feed the response to an HTML validator.
  4. Extract the performance info (number of DB queries, load time, etc.) for the page.

All this data gets stored somewhere, either in the database or in files on disk. Then it is displayed alongside the other automatic test results.

In the first instance, the first two tests are the most useful. The others could be added later, and I am sure that in time we could think of yet more tests.
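To make tests 1, 2 and 4 concrete, here is a rough sketch of the per-page checks, assuming the response body has already been captured as above. The PHP log path and the regular expression for Moodle's performance footer are only guesses for illustration; the real wording varies between versions.

<?php
// Sketch of the per-page checks (tests 1, 2 and 4 above).
function check_page($response, $phplogpath = '/var/log/php_errors.log') {
    // 1. Well-formedness: try to parse the response as XML.
    $dom = new DOMDocument();
    $wellformed = @$dom->loadXML($response);

    // 2. Scan the PHP log for warnings, notices or errors.
    //    The log location is site-specific; this path is only an example.
    $log = is_readable($phplogpath) ? file_get_contents($phplogpath) : '';
    $logclean = !preg_match('/PHP (Warning|Notice|Fatal error)/', $log);

    // 4. Extract the performance info that Moodle prints in its footer when
    //    performance logging is turned on.
    $dbqueries = null;
    if (preg_match('/DB queries:?\s*(\d+)/i', $response, $matches)) {
        $dbqueries = (int) $matches[1];
    }

    return array('wellformed' => (bool) $wellformed,
                 'phplogclean' => $logclean,
                 'dbqueries' => $dbqueries);
}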

What this proposal is not

Note that this is not the same as the kind of sophisticated testing you can do with a tool like Selenium or JMeter. We are not testing that Moodle does the right thing, just that it does something without obvious errors. However, this will still catch the majority of developer screw-ups with minimal effort, so it is worth doing.

Also, we can only test pages fetched with GET requests, not the result of submitting forms. However, as I just said, that does not stop this being a useful and easy-to-implement idea. Nor does it preclude us implementing more sophisticated types of automated testing in future.

Possible future enhancements

A. I think the list of URLs to check should be stored within each plugin. So, I propose the list is stored in files like:

  • admin/simpletest/urllist.txt
  • mod/quiz/simpletest/urllist.txt
  • grade/report/grader/simpletest/urllist.txt
  • ... and so on ...
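For illustration, gathering all of those per-plugin lists could be a simple recursive scan for simpletest/urllist.txt files. A sketch (the function name is made up for this example):

<?php
// Sketch: find every simpletest/urllist.txt under the Moodle root and
// merge the lines into one big list of URLs to test.
function collect_url_lists($moodleroot) {
    $lines = array();
    $iterator = new RecursiveIteratorIterator(
            new RecursiveDirectoryIterator($moodleroot));
    foreach ($iterator as $file) {
        if ($file->getFilename() === 'urllist.txt' &&
                basename($file->getPath()) === 'simpletest') {
            $lines = array_merge($lines, file($file->getPathname(),
                    FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
        }
    }
    return $lines;
}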

B. Some pages look different based on your role. Therefore, I suggest that the script has a number of different accounts it can log in as (say admin, teacher and student), and the format for the urllist.txt file looks like:

A T S index.php
A     admin/index.php
A T   course/edit.php?id=2

or something like that.
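Assuming single-letter role codes like the A/T/S shown above, parsing one line of that format could be as simple as this sketch:

<?php
// Sketch: turn a line like "A T   course/edit.php?id=2" into the URL plus
// the roles it should be fetched as. The role codes are the ones suggested above.
function parse_urllist_line($line) {
    $tokens = preg_split('/\s+/', trim($line));
    $url = array_pop($tokens);                         // last token is the URL
    $roles = array_values(array_intersect($tokens, array('A', 'T', 'S')));
    return array('url' => $url, 'roles' => $roles);
}

// For example, this prints array('url' => 'course/edit.php?id=2', 'roles' => array('A', 'T')).
print_r(parse_urllist_line('A T   course/edit.php?id=2'));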

C. As well as having this run on a central server, it would be nice if developers could also use it on their own machines (for example, a developer wanting to test a custom plugin). Therefore, we cannot assume that the ids in the URLs are what you would get by doing a clean install and then restoring the Moodle features demo course, so we need some sort of variable substitution in the URLs in urllist.txt.
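One way to do that would be simple placeholder substitution, with each site supplying its own id values. The {COURSEID}-style placeholder name below is purely hypothetical:

<?php
// Sketch: replace {PLACEHOLDER} tokens in a URL with site-specific values.
// The placeholder names and the values passed in are hypothetical examples.
function substitute_url($url, array $values) {
    $replacements = array();
    foreach ($values as $name => $value) {
        $replacements['{' . $name . '}'] = $value;
    }
    return strtr($url, $replacements);
}

// e.g. course/view.php?id={COURSEID} becomes course/view.php?id=7
echo substitute_url('course/view.php?id={COURSEID}', array('COURSEID' => 7));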

D. Obviously, pretty graphs of the historical performance data are essential. So is the ability to sort and browse the results, with links to cvs.moodle.org, and to download tables of data as spreadsheets.

Summary page

I think we should also make a one-page summary of all the automated tests. For example, for the unit tests we would have the green or red bar. For Unmerged files and Syntax check we would have either a green 'All OK' or a red 'N files have problems' message. This summary page would link through to the detailed results.

Being silly, on the summary page I think we should have a photo of Martin, either grinning and giving a thumbs-up signal if all is well, or frowning (or Grrrr!ing perhaps?) if there are problems.

See also