CASA User Testing Home
Welcome to the CASA user testing pages. These pages are intended to become the repository for all high-level information related to CASA user testing.
Basic Process From The User Tester's Point Of View
- Science user testing is the last step in validating fixes put into the software by the CASA developers. Some fixes and features will not require a science tester and may only need verification testing; but when a new feature or bug fix does require a science assessment, someone from Science needs to approve that the implementation makes sense. This is called "validation testing".
- The JIRA ticket system is used to track issues in the CASA software from their inception (as a requested fix or feature) to their ultimate validation testing, and if the testing is successful, at that point the tickets can be Resolved and Closed.
- Once an item is ready for user testing, the CASA Testing Lead will assign the CASA JIRA ticket directly to an NA tester (or testers) to indicate that the item is now ready for direct consideration by Science. The assignment comes with a testing plan and report, which includes a description of the issue and a testing plan already filled out by the Testing Lead, plus a reporting section to be filled out by the tester. The report is included in the next section for reference. The Testing Lead will also change the ticket status from "Ready To Validate" to "Under Validation". Tarballs of the CASA builds to use for testing are linked in the ticket.
- For items to be tested in other regions (e.g. EA, EU), the ticket will be assigned to the local testing lead (usually Kana Sugimoto for EA and Dirk Petry for EU) for testing or reassignment to their chosen local tester. Items that are to be tested at the JAO are handled on a case-by-case basis.
- While testing is ongoing, any necessary correspondence with the developer(s) should happen via the JIRA ticket.
- THERE ARE TWO BASIC TYPES OF TICKETS. Not all tickets require the same level of testing effort.
- About half of the assigned tickets test a specific parameter or error message; for these, validation testing represents our final approval that the developer's fix has sufficiently addressed the problem.
- The other half require more creativity in testing the general idea touched upon in the ticket. For a more open-ended issue, the developer has likely already tested the new feature or reported bug on a small representative dataset, so it is immensely useful for someone else to try it out on their own data and see whether it behaves as expected. These testing tickets are more about "stretching the legs" of the new feature (and breaking it if possible, so that it can be fixed and further improved) before it is released to our users.
- The final testing results should be posted to the ticket, along with sufficient details of any particular datasets used, as per the report. Any impediments to testing encountered should also be noted there.
- When sufficient testing has been done, a clear comment to this effect will be posted to the ticket (this may be done by the Testing Lead, the developer, or the Subsystem Scientist), and the Testing Lead should then change the status of the ticket to Scheduled or Resolved, as appropriate.
- For major items requiring input from both ALMA and VLA testers, the Subsystem Scientists should ideally both provide comments to the ticket if they believe the Resolved status is not justified.
- If the item is switched back to Scheduled, then it will be returned to the list of items requiring work. If flagged as "Resolved", then work on the item will end.
- Where possible, the CASA Testing Lead will try to ensure that any testing ticket is ultimately reassigned back to the developer for bookkeeping purposes after the item has been flagged as Resolved. (Please nag her if it's unclear which developer should have the ultimate assignment and/or if she does this incorrectly.)
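The ticket status flow described above can be sketched as a small state map. This is a hypothetical illustration of the workflow only; the actual JIRA configuration may define additional states and transitions.

```python
# Hypothetical sketch of the validation-ticket status flow described above.
# The real JIRA workflow may allow other states and transitions.
ALLOWED_TRANSITIONS = {
    "Ready To Validate": ["Under Validation"],      # Testing Lead assigns a tester
    "Under Validation": ["Scheduled", "Resolved"],  # after sufficient testing
    "Scheduled": ["Ready To Validate"],             # returned to the list of items requiring work
    "Resolved": [],                                 # work on the item ends
}

def move_ticket(status, new_status):
    """Return the new status if the transition is allowed, else raise."""
    if new_status not in ALLOWED_TRANSITIONS.get(status, []):
        raise ValueError(f"Cannot move ticket from {status!r} to {new_status!r}")
    return new_status

print(move_ticket("Ready To Validate", "Under Validation"))  # Under Validation
```

For example, attempting `move_ticket("Resolved", "Under Validation")` raises an error, mirroring the rule that work ends once an item is flagged as Resolved.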
When filling out the report, please keep the following in mind:
- Test functionality: does the new feature do what it promises? Is it relatively easy to understand? Are there possible data sets one can think of where the procedure may not work?
- If the tester does not have a suitable dataset to test with, they should ask Jen [or Crystal (ALMA), Amy Kimball (VLA), or Juergen] to provide one.
- Are any new keywords introduced that are not consistent with other CASA tasks?
- Check the output on the terminal: is there anything strange or unusual?
- Check the output in the logger. Check for accuracy, completeness and whether or not it is understandable.
- Check the inline/online documentation (help 'task', doc(task)) for accuracy, completeness, and whether or not it is understandable. Note any changes or suggestions.
- Let Juergen know if anything should be added to the CASAguides page, e.g. to the Hints, Tips, and Tricks section.
- Suggest any improvements that could be considered for future releases.
- Suggest how the features could be integrated into automated tests that verify CASA's integrity.
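For the inline-documentation check above, CASA tasks are ordinary Python callables inside the CASA shell, so the presence of inline help can be sanity-checked programmatically via the docstring. This is a minimal sketch using a stand-in function; in a real CASA session you would pass an actual task name instead.

```python
def has_usable_doc(task):
    """Return True if a task exposes a non-empty inline docstring
    (the text that `help task` prints in the CASA shell)."""
    doc = getattr(task, "__doc__", None)
    return bool(doc and doc.strip())

# Stand-in for a real CASA task; inside a CASA session you would pass
# the actual task object instead of this hypothetical placeholder.
def fake_task():
    """Example inline help text for a hypothetical task."""

print(has_usable_doc(fake_task))    # True
print(has_usable_doc(lambda x: x))  # False: no docstring at all
```

A check like this only confirms that inline help exists; accuracy, completeness, and readability still require reading the text itself, as the checklist above asks.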
The testing report looks like this:
- CASA Issue: (ticket #, title)
- Description of Issue: (1-2 sentences)
- Additional Instructions:
- Expected turnaround time:
The CASA JIRA system is available here.
- Testing Results
  - Revision number used:
  - Details regarding the data set(s) used for the test (mfs/line, number of channels, EVLA/ALMA, etc.):
  - Further details of testing:
  - Is the issue resolved (if not, please describe the changes still required, as well as further suggestions for user clarity)?
  - Are updates to the inline and/or online documentation necessary (if so, please provide suggestions)?
Useful filters for JIRA from a testing standpoint can be found on JDM's shared dashboard.
Nominal Key Dates And Release Schedule
General Development Cycle:
- "Prerelease" builds of the master branch contain all feature and bug fix branches that have been merged to date. Do a "casa-prerelease -ls" to see the full list of available prereleases; generally you'll want to start the most recent one with "casa-prerelease -r 5.X.X-XX".
- Public releases are done twice a year, following a ~1-month period of user testing of "Prerelease" packages and bug fixing.
- Code Freeze Dates: March 15th, September 15th. The prerelease packages start to be generated just after this.
- Public Release Dates: April 15th, October 15th
Current snapshot of user testing and assignments.
- 2015-05-26 <- NOTES ONLY: MARK AWAY ON TRAVEL
- 2015-04-28 <- MEETING 1 WEEK LATER THAN USUAL (DUE TO ALMA PROPOSAL DEADLINE)
- 2014-12-16 <- MEETING CANCELED
- 2014-12-02 <- NOTES ONLY. ACTUAL MEETING CANCELED
CASA 4.2.2 (and 4.3.0)
- 2014-04-15 <- INTERIM NOTES ONLY. ACTUAL MEETING CANCELED
- 2014-04-01 <- NOTES ONLY. ACTUAL MEETING CANCELED
- 2014-01-14 <- POSTPONED FROM PREVIOUS WEEK.
- 2014-01-07 <- NOTES ONLY. ACTUAL MEETING POSTPONED BY A WEEK
- 2013-08-20 <- Mark in Chile: Jeff to Chair
- 2013-08-13 <- Mark in Chile: Jeff to Chair. DEFERRED FROM 2013-08-06
- 2013-08-06 <- Mark in Chile: Jeff to Chair. DEFERRED UNTIL 2013-08-13
- 2013-05-21 <- Meeting canceled, but please review the agenda.
- 2013-05-07 <- Mark on travel: Jeff to Chair