CASA Guides Test

CASA has a set of tutorials known as CASA Guides. These guides contain actual Python code intended to be executed manually in the CASA interactive console. In order to make sure that the guide code works with the latest CASA version, the code snippets need to be tested automatically (verification and validation), e.g.: EVLA_3-bit_Tutorial_G192.


The approach

The approach is basically a two-step process: extraction and merge.

The extraction consists of parsing the guide website and writing just the code snippets and their identifiers to a generated Python script. The identifier is a Python commented-out line with an "In CASA: " prefix followed by a key-phrase, e.g.: "# In CASA: listobs on the initial data set". This identifier is a non-unique set of words, defined in the website guide code snippets, used to determine which test should be generated for the code snippet, e.g.:
# In CASA: listobs on the initial data set
listobs('G192_6s.ms', listfile='G192_listobs.txt')

The merge process parses the generated Python script, reading the key-phrases in order to generate the test methods, classes and suites from an already defined template, mapping each key-phrase to a test class method. The latter is achieved by using a specific template with conditionals on the key-phrases for cases where the guide code needs to be patched (e.g.: setting the interactive argument from True to False); an example is guide_EVLA3BitTutorialG192.template.
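To give an idea of how the key-phrases drive the generation, the following is a minimal sketch (not the project's actual parser) that splits an extracted script into (key-phrase, code) pairs and maps each key-phrase to a sequential test method name:

import re

def split_snippets(script_text):
    """Return a list of (key_phrase, code) tuples from an extracted guide script."""
    snippets = []
    phrase, lines = None, []
    for line in script_text.splitlines():
        match = re.match(r"#\s*In CASA:\s*(.+)", line)
        if match:
            if phrase is not None:
                snippets.append((phrase, "\n".join(lines)))
            phrase, lines = match.group(1).strip(), []
        elif phrase is not None:
            lines.append(line)
    if phrase is not None:
        snippets.append((phrase, "\n".join(lines)))
    return snippets

def method_name(index, key_phrase):
    """Map a key-phrase to a test method name, e.g.: test_00_listobs_on_the_initial_data_set."""
    slug = re.sub(r"\W+", "_", key_phrase.strip().lower()).strip("_")
    return "test_%02d_%s" % (index, slug)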

By default, the merge process generates a regression module and a helper module.

Generated modules

An example is used as reference; check the source code at casa-testing.

Regression module

This module is the executable pyunit class generated from the guide template (guide_EVLA3BitTutorialG192.template). The module defines the helper, post and patch module names; the post and helper modules must exist, while the patch module is only needed if it is defined in the template, e.g.: guide_EVLA3BitTutorialG192.template.

module_helper = "guides_helper_EVLA3BitTutorialG192"
module_post   = "guides_post_EVLA3BitTutorialG192"
module_patch  = "guides_patch_EVLA3BitTutorialG192"

The helper and post modules must exist because they are used for verification (helper) and validation (post); they are executed in a top-down order.

@injectMod(module_helper)
@injectMod(module_post)
def test_00_splitting_fields_for_analysis(self):
    """test 00 EVLA_3-bit_Tutorial_G192 "splitting fields for analysis"
    """
    pass

As an example, the generated module regression_EVLA3BitTutorialG192.py (which has just a few test methods for learning purposes; a fully generated one will contain the complete list of generated methods) was based on guide_EVLA3BitTutorialG192.template. Bear in mind that the method names mapped to the key-phrases have a test_## prefix; this is done so the methods are executed in a sequential, ordered and predictable way.

Helper module

This is a plain module generated with the actual guide code; it tests the code in a functional way as a verification, e.g.: the following snippet belongs to guides_helper_EVLA3BitTutorialG192.py:

@injectEnv
def test_00_splitting_fields_for_analysis():
    """ "splitting fields for analysis"
    """
    casalog.origin("test_00_splitting_fields_for_analysis")
    casalog.post("starting")
    split('TVER0004.sb14459364.eb14492359.56295.26287841435.ms', outputvis='G192_6s.ms', \
    datacolumn='all', field='3,6,7,10', keepflags=False, spw='2~65')

The @injectEnv is an in-house developed Python method decorator which acts as a dependency injection: it injects the needed Python libraries defined at runtime by getting the dictionary variables from the stack frame of IConsole.

Non-generated modules (needed)

Post module

This module is developed by the tester. By default, the method to execute is mapped by the same name as the regression module method, e.g.: test_00_splitting_fields_for_analysis exists in both the regression and helper modules. This module is used to execute the validation in an automated, unattended way; the validation depends on the test executed during verification, e.g.: asserting that a file exists.

def test_17_spectral_information():
    """post method for "spectral information"
    """
    measet = "%s/G192_flagged_6s.ms/SOURCE" % os.getcwd()
    checksum_ref = ""
    setjy_common(measet, checksum_ref)

The example above checks that the checksum of the generated file content matches an already validated, pre-computed checksum.
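For illustration, the following is a hedged sketch of what a checksum-based validation helper like the setjy_common call above might do; the actual helper is project code and may differ (e.g. it may hash a whole measurement set sub-table directory rather than a single file):

import hashlib
import os

def checksum_path(path):
    """Compute an MD5 checksum over a file, or over all files under a directory."""
    digest = hashlib.md5()
    if os.path.isdir(path):
        for root, _, files in sorted(os.walk(path)):
            for name in sorted(files):
                with open(os.path.join(root, name), "rb") as handle:
                    digest.update(handle.read())
    else:
        with open(path, "rb") as handle:
            digest.update(handle.read())
    return digest.hexdigest()

def assert_checksum(path, checksum_ref):
    """Validate that the path exists and that its checksum matches the reference."""
    assert os.path.exists(path), "%s does not exist" % path
    assert checksum_path(path) == checksum_ref, "checksum mismatch for %s" % path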

Patch module

This module is developed by the tester; the only methods implemented are the ones mapped to the patch module (defined in the template). The intention of this module is to modify code that isn't possible to test in an automated way, e.g.: when non-reproducible human interaction is needed.

The test_58_basebands_mfs_taylor_cleaning method uses the argument interactive=True; this method was generated (parsed from the website) into the helper module.

@injectEnv
def test_58_basebands_mfs_taylor_cleaning():
    """ "basebands mfs taylor cleaning"
    """
    casalog.origin("test_58_basebands_mfs_taylor_cleaning")
    casalog.post("starting")
    # Removing any previous cleaning information
    # This assumes you want to start this clean from scratch
    # If you want to continue this from a previous clean run,
    # the rm -rf system command should be skipped
    os.system('rm -rf imgG192_6s_spw0-63_mfs2*')
    clean(vis='G192_split_6s.ms', spw='0~63:5~122', \
       imagename='imgG192_6s_spw0-63_mfs2', \
       mode='mfs', nterms=2, niter=10000, gain=0.1, \
       threshold='0.0mJy', psfmode='clark', imsize=[1280], \
       cell=['0.015arcsec'], \
       weighting='briggs', robust=0.5, interactive=True)
    #
    mystat = imstat('imgG192_6s_spw0-63_mfs2.residual.tt0')
    print 'Residual standard deviation = '+str(mystat['sigma'][0])+' Jy'
    myfit = imfit('imgG192_6s_spw0-63_mfs2.image.tt0', region='G192.crtf')
    print 'Source flux = '+str(myfit['results']['component0']['flux']['value'][0])+'+/-'+str(myfit['results']['component0']['flux']['error'][0])+' Jy'

The patched version was implemented by hand in the patch module, with interactive set to False.

@injectEnv
def test_58_basebands_mfs_taylor_cleaning():
    """ "basebands mfs taylor cleaning" patched to interactive=False
    """
    casalog.origin("test_58_basebands_mfs_taylor_cleaning")
    casalog.post("starting")
    # Removing any previous cleaning information
    # This assumes you want to start this clean from scratch
    # If you want to continue this from a previous clean run,
    # the rm -rf system command should be skipped
    os.system('rm -rf imgG192_6s_spw0-63_mfs2*')
    clean(vis='G192_split_6s.ms', spw='0~63:5~122', \
       imagename='imgG192_6s_spw0-63_mfs2', \
       mode='mfs', nterms=2, niter=10000, gain=0.1, \
       threshold='0.0mJy', psfmode='clark', imsize=[1280], \
       cell=['0.015arcsec'], \
       weighting='briggs', robust=0.5, interactive=False)
    #
    mystat = imstat('imgG192_6s_spw0-63_mfs2.residual.tt0')
    print 'Residual standard deviation = '+str(mystat['sigma'][0])+' Jy'
    myfit = imfit('imgG192_6s_spw0-63_mfs2.image.tt0', region='G192.crtf')
    print 'Source flux = '+str(myfit['results']['component0']['flux']['value'][0])+'+/-'+str(myfit['results']['component0']['flux']['error'][0])+' Jy'

The regression will execute the patched method as verification, as a replacement for the method defined in the helper module; the method decorators would be:

@injectMod(module_patch)
@injectMod(module_post)
def test_58_basebands_mfs_taylor_cleaning(self): pass

rather than:

@injectMod(module_helper) 
@injectMod(module_post)
def test_58_basebands_mfs_taylor_cleaning(self): pass

Bear in mind that this definition is done in the template and should be decided based on a prior analysis.

Execution workflow

[Figure: framework-commented.png, execution workflow diagram]

The proposed framework

The framework was implemented with running any CASA test in mind (guides, regression and unit). The framework uses in-house decorators to manage the CASA libraries to be inserted at runtime into the test methods, a sort of dependency injection.

Templates, parsing and code generation

Airspeed is used as the engine for template-based code generation; currently guides.template is used to generate RegressionBase > unittest.TestCase pyunit test classes, in which each key-phrase and its Python code within the guide becomes a test case method of the generated class. A small rendering sketch is shown after the list below.

The parser will generate two python modules:
  • regression_{"guide"}.py : The pyunit class, which is a specialization of (inherits from) RegressionBase > unittest.TestCase, defining the methods and the Python decorators that manage the testing and the post-testing.
  • guides_helper_{"guide"}.py: A module containing one method with the CASA code for each key-phrase found, intended to be invoked by the regression class.
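As mentioned above, a minimal, hypothetical rendering sketch using the Airspeed (Velocity-style) engine could look like the following; it is not the project's actual guides.template, just the same idea on a smaller scale:

import airspeed

class KeyPhrase(object):
    """Hypothetical holder for a parsed key-phrase."""
    def __init__(self, id, name, phrase, patched):
        self.id, self.name, self.phrase, self.patched = id, name, phrase, patched

TEMPLATE = '''
#foreach($test in $tests)
#if($test.patched)
  @injectMod(module_patch)
#else
  @injectMod(module_helper)
#end
  @injectMod(module_post)
  def test_${test.id}_${test.name}(self):
      """test ${test.id} $guide "${test.phrase}"
      """
      pass
#end
'''

tests = [
    KeyPhrase("00", "splitting_fields_for_analysis", "splitting fields for analysis", False),
    KeyPhrase("17", "spectral_information", "spectral information", True),
]

print(airspeed.Template(TEMPLATE).merge({"guide": "EVLA_3-bit_Tutorial_G192", "tests": tests}))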

An example of a generated regression class follows; this is the EVLA G192 one with just one method:

"""
This is a generated module
all modified changes will be lost in the next code generation
"""

import sys

assert sys.version >= '2' and sys.version_info.minor >= 7, "Python 2.7 or greater is supported"

import os
import unittest

from testc.regression.helper import RegressionHelper
from testc.regression.helper import RegressionBase
from testc.regression.helper import regressionLogger
from testc.regression.helper import injectMod

__test__ = True
__all__ = ["Test_EVLA3BitTutorialG192"]

module_helper = "guides_helper_EVLA3BitTutorialG192"
module_post   = "guides_post_EVLA3BitTutorialG192"
module_patch  = "guides_patch_EVLA3BitTutorialG192"

class Test_EVLA3BitTutorialG192(RegressionBase):
  """Testing class for EVLA_3-bit_Tutorial_G192 casa guide

  This is an autogenerated class for EVLA_3-bit_Tutorial_G192 guide testing purposes,
  all the modified code will be re-written in the next code generation.

  The class will test the following phrases:

  test_00 "splitting fields for analysis"

  In order to skip a test, append the pyunit decorator: @unittest.skip("reason?")
  """

  @classmethod
  def setUpClass(cls):
    pass

  def setUp(self):
    pass

  def tearDown(self):
    pass

  @classmethod
  def tearDownClass(cls):
    pass

  @injectMod(module_helper) 
  @injectMod(module_post)
  def test_00_splitting_fields_for_analysis(self):
    """test 00 EVLA_3-bit_Tutorial_G192 "splitting fields for analysis"
    """
    pass

The generated methods, named test_<auto-incremental-id>_<keyword-phrase>, use the in-house developed injectMod decorator to execute the needed modules before executing the method itself; this design keeps things simple and modular. The decorator execution is top-down; if you want to skip a test you must add the unittest.skip decorator at the top of the decorator list.
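For example, skipping a generated test only requires putting unittest.skip on top of the decorator list, so that the injected helper and post methods are never executed (the reason string below is just a hypothetical placeholder):

@unittest.skip("data set not available in this environment")
@injectMod(module_helper)
@injectMod(module_post)
def test_00_splitting_fields_for_analysis(self):
    """test 00 EVLA_3-bit_Tutorial_G192 "splitting fields for analysis"
    """
    pass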

Visit the Velocity user guide documentation to learn about the template directives, macros, etc.

Not all that glitters is gold

Strictly regarding the guides, it is difficult to control some behaviors related to a task, e.g.: if a cleaning task is set with interactive=True, the regression test will run forever. The solution is to execute a method from a patch module instead of the corresponding method of the helper module; this can be controlled at parsing time by defining a specific template in the config.json which implements if/elseif/else flow control to choose which module to use, a kind of monkey patch, e.g.:
# not patched!
@injectMod(module_helper) 
@injectMod(module_post)
def test_16_bandpass_calibrator_gain_amplitudes_scaling(self):
  """test 16 EVLA_3-bit_Tutorial_G192 "bandpass calibrator gain amplitudes scaling"
  """
  pass

# patched!
@injectMod(module_patch)
@injectMod(module_post)
def test_17_spectral_information(self):
  """test 17 EVLA_3-bit_Tutorial_G192 "spectral information"
  """
  pass

Decorators

Think of them as a stack of function wrappers, which are executed (popped) in a top-down order, with the decorated function executed at the end. These decorators act like dependency injection.

injectMod

from testc.regression.helper import injectMod

Executes another method from a module M before executing the method itself. The decorator takes as an argument the module M to inject; this module M should be located in either testc.regression or testc.guide, checking first whether the module M is located at testc.regression.

By default it will execute the method of the module ```M``` which has the same name as the method using the decorator, e.g.: before executing someMethod(), module1.someMethod() will be executed first.
@injectMod("module1")
def someMethod(): pass

As the following example shows, setting the method parameter to False means that no method will be executed; only the module will be executed (imported). This is intended for modules that run code at import time:
@injectMod("module1", method = False)
def someMethod(): pass

It is also possible to specify the method name, e.g.: before executing someMethod(), module1.someOtherMethod() will be executed first.
@injectMod("module1", method = "someOtherMethod")
def someMethod(): pass
  • Take into account that there is no recursion restriction, which means: don't specify the module and method of the very method using the decorator.
  • The arguments of the module method to execute should be the same as those of the method using the decorator.

This decorator is used in the regression test classes, in order to separate the CASA code from the post tests and execute them in order.
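To make this behavior concrete, the following is a minimal sketch of how an injectMod-like decorator could work; the real implementation lives in testc.regression.helper and may differ (argument forwarding in particular is simplified here):

import functools
import importlib

def injectMod(module_name, method=True):
    """Sketch: run a function from module_name before the decorated method."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # look for the module under testc.regression first, then testc.guide
            mod = None
            for package in ("testc.regression", "testc.guide"):
                try:
                    mod = importlib.import_module("%s.%s" % (package, module_name))
                    break
                except ImportError:
                    continue
            # method=True -> same name as the decorated method,
            # method="name" -> that name, method=False -> import only
            if mod is not None and method:
                name = func.__name__ if method is True else method
                injected = getattr(mod, name, None)
                if injected is not None:
                    injected()  # the real decorator forwards arguments as needed
            return func(*args, **kwargs)
        return wrapper
    return decorator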

injectEnv

from testc.regression.helper import injectEnv

This decorator is used in the generated helper modules. Its purpose is to inject the needed CASA Python globals, which allows executing CASA code without explicitly defining the imports needed to run CASA Python routines, classes, methods, etc.
@injectEnv
def test_17_spectral_information():
  """ "spectral information"
  """
  setjy(vis='G192_flagged_6s.ms', field='3', scalebychan=True, \
        fluxdensity=[29.8756, 0, 0, 0], spix=-0.598929, \
        reffreq='32.4488GHz')

In the previous example, CASA globals, methods, routines, etc... are injected into the method.
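As a rough idea of the mechanism (the actual implementation in testc.regression.helper may differ), an injectEnv-like decorator could copy the interactive session globals into the decorated function's module, assuming the CASA/IPython session frame exposes a "casa" global:

import functools
import sys

def injectEnv(func):
    """Sketch: make the CASA session globals visible to the decorated function."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # walk up the call stack looking for the CASA/IPython session frame
        caller = sys._getframe(1)
        while caller is not None:
            if "casa" in caller.f_globals:  # heuristic for the CASA session frame
                # copy the session globals (casalog, split, setjy, ...) into
                # the module globals of the decorated function
                func.__globals__.update(caller.f_globals)
                break
            caller = caller.f_back
        return func(*args, **kwargs)
    return wrapper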

How to

First, clone this repository: git clone git@github.com:atejeda/casa-testing.git, and for simplicity, export the following envvars:
export CASA_TESTING=$PWD/casa-testing
export PYTHONPATH=$CASA_TESTING:$PYTHONPATH
export PATH=$CASA_TESTING/testc/guide:$PATH
export EXTRACTED=$CASA_TESTING/guides/extracted

Configuration file

The CASA guides to work with must be specified in a guides.conf configuration file, which is basically a JSON file.

{ 
    "base_uri": "http://casaguides.nrao.edu/index.php?title=",

    "guides": [
      { 
          "enable": 1, 
          "uri": "EVLA_3-bit_Tutorial_G192", 
          "guide": "EVLA3BitTutorialG192.py", 
          "template": "guide_EVLA3BitTutorialG192.template", 
          "template_helper" : "helper.template" 
      }
    ]
}

The structure is defined as:
  • enable: ```1``` to enable or ```0``` to disable the extraction and merge
  • uri: URI, relative to the ```base_uri```
  • guide: The file name of the extracted script; the ```.py``` extension is required.
  • template: The template used to generate the regression unit test class and module.
  • template_helper: The template for the helper, which will generate the code snippets from the parsed script.
One guide is defined by the curly brackets; you can add more guides within the square brackets (comma separated). Using a configuration file allows more flexibility to manage or group the tests in an easy way.
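As a small sketch of how a tool could consume this file (the actual casaGuideExtract/casaGuideMerge implementations may differ), iterating over only the enabled guides could look like this:

import json

with open("guides.conf") as handle:
    config = json.load(handle)

for guide in config["guides"]:
    if not guide["enable"]:
        continue
    url = config["base_uri"] + guide["uri"]
    print("extracting %s into %s using template %s" % (url, guide["guide"], guide["template"]))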

Extraction

casaGuideExtract -c $CASA_TESTING/guides/guides.conf -o $EXTRACTED
Usage: casaGuideExtract [options]

Options:
  -h, --help            show this help message and exit
  -b, --benchmark       produce benchmark test script
  -n, --noninteractive  make script non-interactive (non-benchmark mode only)
  -p, --plotmsoff       turn off all plotms commands
  -c CONFIG, --config=CONFIG
                        Get the guides specified in a json file
  -o OUTPUT, --output=OUTPUT
                        output dir for files

For the record, this script was inherited and slightly modified.

Merge

casaGuideMerge -c $CASA_TESTING/guides/guides.conf -e $EXTRACTED -o $PARSED
Usage: casaGuideMerge [options]

Options:
  -h, --help            show this help message and exit
  -c CONFIG, --config=CONFIG
                        The configuration file to use
  -e EXTRACTED, --extracted=EXTRACTED
                        Where the extracted scripts are
  -o OUTPUT, --output=OUTPUT
                        Where the generated code will be

Execute

Make sure that testc is installed in the lib directory of your CASA deployment. The following code snippet can be executed within a CASA environment:
import sys

assert sys.version >= '2' and sys.version_info.minor >= 7, "Python 2.7 or greater is supported"
assert globals().has_key("IPython"), "IPython environment is needed for this module (%s)" % __file__
assert globals().has_key("casa"), "CASA environment is needed for this module (%s)" % __file__

import os

# to use psutil for the nose psutil plugin
sys.path.append("/usr/lib/python2.6/site-packages/psutil-2.1.3-py2.6-linux-x86_64.egg")
# to use xcoverage in nose xcoverage plugin
sys.path.append("/usr/lib64/python2.6/site-packages/coverage")

from testc.regression.helper import RegressionRunner

# configure regression tests to execute
regressions = []

# configure guides tests to execute
guides = []
guides.append("regression_EVLA3BitTutorialG192Eg")

# test the regression
for test in regressions:
  RegressionRunner.execute(test)

# test the guides
for test in guides:
  RegressionRunner.execute(test, guide = True)

The only difference from the regression tests, from an execution point of view, is that guide = True must be specified for RegressionRunner.execute in order to find the regression test class in the guide package. An automated way to locate the module could easily be implemented, but it was done this way in order to avoid collisions with the module file name.

In order to know how to use RegressionRunner and RegressionBase helper methods, refer to the regression documentation.

Dependencies

This code snippet belongs to the regression script to be executed by Jenkins.
$WORKSPACE/casa-testing/lib/python/site-packages/install
export PYTHONPATH=$WORKSPACE/casa-testing/lib/python/site-packages:$PYTHONPATH
export PYTHONPATH=$WORKSPACE/casa-testing/lib/python/site-packages/airspeed-0.4.2dev_20131111-py2.6.egg:$PYTHONPATH
export PYTHONPATH=$WORKSPACE/casa-testing/lib/python/site-packages/coverage-3.7.1-py2.6-linux-x86_64.egg:$PYTHONPATH
export PYTHONPATH=$WORKSPACE/casa-testing/lib/python/site-packages/psutil-2.1.3-py2.6-linux-x86_64.egg:$PYTHONPATH
  • Install the needed dependencies: airspeed, coverage, psutil
  • Update the python path

The install script is located at lib/python/site-packages/install, which is the same location as the library installation.

Resources

[1] https://github.com/atejeda/casa-testing

-- AlexisTejeda - 2015-06-01