Testing for reproducibility is an important principle behind Madagascar’s design. It works on several levels.
- Inside a project directory (with an SConstruct file that contains `from rsf.proj import *`), run `scons lock` to create Result files (figures) and to copy them to a separate location (specified by the $RSFFIGS environment variable, or $RSFROOT/share/madagascar/figs by default). Papers included with Madagascar (under $RSFSRC/book) have their result figures saved in a repository.
To come back later and test whether the results are still reproducible, run `scons test`.
This command performs an intelligent comparison of the figures using Joe Dellinger's sfvplotdiff program and reports an error if the figures differ. In the case of an error, you can run `scons <figure>.flip` to flip between the new and the old version of the figure on the screen and compare them visually. Based on that comparison, you can either "lock" the new version with `scons lock` or debug and fix the error that caused the difference.
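The per-project cycle described above can be sketched as a short shell session (a sketch assuming the standard Madagascar scons targets; the figure name `spike` is a hypothetical example):

```shell
# Build the Result figures and store locked copies (e.g., under $RSFFIGS)
scons lock

# Later: rebuild the figures and compare each against its locked copy
# (the comparison uses sfvplotdiff; a difference is reported as an error)
scons test

# On a failure, flip between the new and locked versions of one figure
scons spike.flip

# If the new figure is correct, lock it as the new reference
scons lock
```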
- To test all projects where a particular program, say sfspike, is used, run `scons sfspike.test` from the $RSFSRC/book directory.
This is useful for regression testing of changes in programs that may cause reproducibility failures. You can also run `scons test` to test all projects and all Madagascar programs. By default, testing is limited to projects that use only publicly available data and less than 1 Mb of disk space. This behavior can be changed by giving the all=y or size= parameters to scons test.
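A book-wide regression pass might look like the following sketch (the program name is taken from the example above; the size value is illustrative, and the targets assume the conventions described in this section):

```shell
cd $RSFSRC/book

# Test every project that uses sfspike
scons sfspike.test

# Test all projects, lifting the default public-data and 1 Mb limits
scons test all=y size=10
```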
- A collection of scripts developed by Jim Jennings, explained on the Automatic Testing page, performs fine-grained testing with extended diagnostics.