Consolidate CASA System Tests

Consolidate all CASA tests that require a complete CASA build to run into one location in change control.

Currently, tests that require a complete CASA build to run are scattered between the code and gcwrap build components. The data used by these tests is kept in a separate change control repository and frequently falls out of sync with the test code.

By keeping test code and required data together, we expect developers will have less trouble keeping tests and data in sync.

By moving tests that require a complete CASA build out of code and gcwrap and into their own directory, developers who do not run these tests avoid the change control overhead for those tests and their data. Developers who run some of these tests will be able to check out code and data for just the tests they are interested in. Developers and test automation that run all tests will be able to get them from a single change control repository that should always be consistent.

Status: See CAS-5590.

Team

Customers
  • CASA Developers

Collaborators

Plan

  1. Start with tests developed or maintained by the new Software Test team.
  2. Document new CASA system test guidelines.
  3. Review existing CASA developer tests that require a complete CASA build to run and decide which should be kept.
  4. Move the keeper tests and data to the new location.
  5. Remove the non-keeper tests.
  6. Remove unused data from https://svn.cv.nrao.edu/svn/casa-data/trunk/regression/

  • Related:
    • Choose a code coverage measurement tool.
    • Choose a memory error checking tool.
    • Note: Valgrind looks like it can do both, and is already used by ALMA.
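
For reference, a memory check of a single system test might be invoked roughly as in the hedged Python sketch below. The test script name (run_test.py) is a hypothetical placeholder, and the casa launcher options should be checked against the installed CASA version; this only illustrates the kind of command a test wrapper could issue, not a decided procedure.

    import subprocess

    # Hedged sketch: run one system test under Valgrind's memcheck tool.
    # "run_test.py" is a hypothetical script name; verify the casa launcher
    # options against the installed CASA version.
    subprocess.run(
        [
            "valgrind",
            "--tool=memcheck",          # memory error checking
            "--leak-check=full",        # report individual leaks
            "--trace-children=yes",     # follow the launcher into the CASA process
            "--log-file=memcheck.log",  # keep Valgrind output separate from test output
            "casa", "--nologger", "--nogui", "-c", "run_test.py",
        ],
        check=False,  # do not abort the wrapper if the test or Valgrind reports errors
    )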

Deliverables

  • Guidelines for writing new CASA System Tests

Requirements

  • Each test must be contained in its own directory, along with all required data and supporting tools or scripts.
  • Each test directory must include a description of the CASA feature being tested, and the "main" test script must also describe that feature.
  • We must accumulate test run data to report on:
    • whether the run used a developer build or a user package (suggested by Justo).
    • the name of each test
    • where in CASA does each test fail?
      • This may require analysis by testers after test runs.
    • how often does each test fail?
    • which tests fail most frequently?
    • are test run times changing?
    • is test run memory use changing?
    • how much of CASA do tests cover?
    • which parts of CASA break most frequently?
    • which developers break CASA most frequently?
      • Must be extracted from other reports and Subversion logs.
    • I/O statistics
      • Amount of filesystem data read
      • Amount of filesystem data written
      • Histogram of read/write sizes (e.g., from iostat, the ratio of MB/s to read requests per second)
    • Number of files open in the process
  • Reports must allow selecting time windows (e.g., six months, one month); a reporting sketch follows this list.
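
As a non-binding illustration of the reporting items above, the Python sketch below assumes the accumulated flat files are CSV with a header row containing at least timestamp, test_name, and result columns (an assumed layout, not a decided format) and shows how a report could be restricted to a time window and rank tests by failure count.

    import csv
    from collections import Counter
    from datetime import datetime, timedelta

    # Hedged sketch: count failures per test within a time window.
    # The CSV layout (header row with timestamp, test_name, and result
    # columns) is an assumption for illustration only.
    def failures_in_window(results_file, days=30):
        cutoff = datetime.utcnow() - timedelta(days=days)
        failures = Counter()
        with open(results_file, newline="") as f:
            for row in csv.DictReader(f):
                run_time = datetime.strptime(row["timestamp"], "%Y-%m-%dT%H:%M:%S")
                if run_time >= cutoff and row["result"] != "pass":
                    failures[row["test_name"]] += 1
        # Most frequently failing tests first.
        return failures.most_common()

    # Example: failure counts for the last six months.
    # print(failures_in_window("results/runs.csv", days=180))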

Design

  • TBD

Implementation

  • Tests will be kept in change control at https://svn.cv.nrao.edu/svn/casa/trunk/system-test
    • This directory will be branched and tagged following the same conventions as all other CASA source code in https://svn.cv.nrao.edu/svn/casa/trunk/
    • Tests will be grouped as:
      • asap - tests that exercise ASAP integration with CASA.
      • general - tests that exercise CASA as used by "typical" users.
        • we may choose to subdivide this further in the future.
      • pipeline - tests that exercise the ALMA/EVLA pipeline.
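
For illustration only, a checkout of this area might be organized along the following lines; the test name and file names shown are hypothetical placeholders, not decided conventions:

    system-test/
      asap/
      general/
        some-imaging-test/     (hypothetical test name)
          README               (describes the CASA feature being tested)
          run_test.py          (the "main" test script)
          data/                (all data the test requires)
          results/             (accumulated flat-file test run records)
      pipeline/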

Example:

Note: this is not an ideal example, but it is the first test we will implement.

End example

  • We must record data on each test run for each test, including:
    • name of the test
    • revision of each build component under test (casacore, code, gcwrap, asap, pipeline)
    • type of build [release | debug | instrumented | etc.]
    • test host
    • test result (pass/fail - number of sub-test failures if applicable)
    • test run time (real, user, and sys)
  • Test run data will be accumulated in flat files and recorded in Subversion in a "results" subdirectory of the test directory (a sketch of a possible record format follows this list).
  • We will write scripts to read the flat files, pull additional data from Subversion, and report as required. See Requirements.
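
To make the flat-file idea concrete, here is a minimal, hedged Python sketch of how a test wrapper might append one run record to a results file. The field order, the CSV format, and the file name runs.csv are assumptions for illustration and would be replaced by whatever format is actually decided.

    import csv
    import os
    from datetime import datetime

    # Hedged sketch: append one test-run record to a flat file in the test's
    # "results" subdirectory. The fields follow the list above; the CSV layout
    # and the file name "runs.csv" are assumptions, not a decided format.
    def record_run(results_dir, test_name, revisions, build_type, host,
                   result, real_s, user_s, sys_s):
        with open(os.path.join(results_dir, "runs.csv"), "a", newline="") as f:
            csv.writer(f).writerow([
                datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%S"),
                test_name,
                revisions.get("casacore", ""),   # revision of each component under test
                revisions.get("code", ""),
                revisions.get("gcwrap", ""),
                revisions.get("asap", ""),
                revisions.get("pipeline", ""),
                build_type,                      # release | debug | instrumented | ...
                host,
                result,                          # pass/fail, sub-test failure count if applicable
                real_s, user_s, sys_s,           # run times in seconds
            ])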

Tasks

Project tasks are tracked with Jira sub-tasks under this project's master task. See CAS-5590.

-- ScottRankin - 2013-09-10