IRM Phase B NA Testing Procedure (pragmatic version)


The deployment procedures for ACS, ALMASW, and patch installation are described in STEMaintenanceDocumentation.
  • STE: Standard Test Environment; despite the name, it refers to the ALMA environment and is also used in production
    • Note: when ssh'ing to the Charlottesville STE, use: ssh -X -l almaop to ensure GUIs open on the local machine
  • GNS: General Network Services, in charge of the network services within the STE; managed through a DHCP service, it provides a "static" configuration and environment for the whole ALMA STE
  • GAS: General ALMA Services, where the ALMA software actually runs; each ALMA component and container is deployed on these machines
  • ARCH: Archive, where the ALMA Oracle DB is deployed
  • Support: acts as an LDAP service and data repository
  • TMCDB: Telescope Monitoring Data Base, in charge of keeping the ALMASW configuration.



Within the ALMASW context, container logs are generated as plain ASCII; these logs contain basic information about what is going on inside the container. There are also the XML (or system) logs, which contain all the logs generated by the system across all containers and components.
  • ASCII (container) logs are located in /alma/logs/{gas0x}/; the pattern follows the deployment configuration, e.g. if the simulated antenna DV01 was deployed at gas01, the log will be located at /alma/logs/gas01/CONTROL/DV01
  • XML are located at /mnt/gas01/data1/AcsLogs-8.1
    • the location is defined in, e.g., /alma/ACS-2014.6/acsdata/config/ by the property archive.log.dir, or simply run: cat $ACSDATA/config/ | grep archive.log.dir
    • these logs tend to be huge in number of entries and file size, so using a text editor might be inefficient unless you know what you are looking for; otherwise, as any STE user, use jlog (a GUI application for reading the logs).
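As an illustration, the log locations above can be checked from the shell; container_log_dir is a hypothetical helper (not an ALMA tool), and gas01/DV01 are just the example values from this page:

```shell
#!/usr/bin/env bash
# container_log_dir builds the ASCII container-log path from a
# GAS host and an antenna name, following the pattern above.
container_log_dir() {
    local gas_host=$1 antenna=$2
    echo "/alma/logs/${gas_host}/CONTROL/${antenna}"
}

# Example from this page: simulated antenna DV01 deployed on gas01.
container_log_dir gas01 DV01   # -> /alma/logs/gas01/CONTROL/DV01

# Where the XML logs go is defined by the archive.log.dir property:
# grep -r archive.log.dir "$ACSDATA/config/"
```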

Testing Procedure

System setup

  1. Backup the TMCDB configuration by using tmcdb-explorer
  2. Deploy ACS
  3. Deploy ALMASW
  4. Check for software configuration changes, often done in the TMCDB explorer; these can vary depending on what is needed
  5. Update the database schema, check

Start the system

Bring up the system; all observatory operations are done as almaop:
  1. Restart the system to kill all remaining ACS processes (you may see some harmless errors): FullSystemRestart -df
  2. Execute the runOMC tool, which is the GUI used to start the system: runOMC
  3. Start ACS; this step can take a while depending on the STE resources. You can do this from the OMC GUI.
  4. Once ACS has started, the containers on the left side of the OMC should be in a green state; then start the ALMASW.
    1. This step can take a while; the critical online subsystems are: Archive, Control, Corr and TelCal
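The startup sequence above can be sketched as a script. FullSystemRestart and runOMC are the commands from this page (left commented out, since they need a live STE); wait_for is a hypothetical polling helper for the "can take a while" steps, and the acsStartContainer process name is an assumption:

```shell
#!/usr/bin/env bash
# wait_for <attempts> <delay-seconds> <command...>: retry a check until
# it succeeds or the attempts run out (hypothetical helper, not an ALMA tool).
wait_for() {
    local tries=$1 delay=$2 i
    shift 2
    for ((i = 0; i < tries; i++)); do
        "$@" && return 0
        sleep "$delay"
    done
    return 1
}

# FullSystemRestart -df    # 1. kill remaining ACS processes (harmless errors possible)
# runOMC                   # 2. start the OMC GUI, then start ACS and the ALMASW from it
# Example of polling until a container process shows up:
# wait_for 60 10 pgrep -f acsStartContainer
```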

Create an array

These steps are specific to simulated environments; other environments, such as the AOS, require additional steps and procedures among other tasks:
  1. Create an array: View > Create Array
  2. Select the antennas to use
  3. Select the photonic reference
  4. Select the BL correlator; this is optional for total power observations.
  5. Wait for the array creation; it may take a while
  6. The scheduler should assign an SB automatically; if not, grab one in READY status.

Observation sources

Bear in mind that the observations are simulated as being executed in Chile. Running as almaop, the tool below plots the visible sources for a period of time; it requires the system to be up:
Usage: [options]

  -h, --help            show this help message and exit
  -d Date, --date=Date  Date (YYYY-MM-DD).  default: today
  -s Source(s), --source=Source(s)
                        Source name(s): planets or ones defined in
                        sourceCatalog. concatenate with commas for multiple
                        items.  default: all planets and the sun (if -S/-Q are
                        not given) or none (otherwise)
  -S radecSource(s), --radecsource=radecSource(s)
                        Source coordinate(s): concatenate with commas for
                        multiple items.  each item is a set of RA and Dec
                        separated by a slash (like HH:MM/dd:mm).  default:
  -Q, --brightquasars   Plot "standard" set of bright quasars.  default: False
  -x Xaxis, --xaxis=Xaxis
                        X axis ("UTC", "LST", "CLT", "CLST").  default: UTC
  -y Yaxis, --yaxis=Yaxis
                        Y axis ("AZ", "EL", or "AZEL").  default: EL
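For example, a typical invocation built from the options above might look like this; plotSources is only a placeholder name, since this page does not give the actual script name:

```shell
#!/usr/bin/env bash
# plotSources is a placeholder name for the plotting tool described above;
# the options are taken from its help output.
tool=plotSources
args=(-d 2015-04-17 -s Venus,Mars -Q -x LST -y AZEL)
echo "$tool ${args[*]}"   # -> plotSources -d 2015-04-17 -s Venus,Mars -Q -x LST -y AZEL
```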

Running an observation

Once the array is created, the usual regression test consists of running total power (TP) and SFI observations. These are the observations done during a normal regression test; TDM or FDM can be set for SFI observations. As almaop:
  • -b 3 # this will lock the front end of the antennas used to band 3.
  • mountAxisToAutonomous [-a {list of antennas, comma separated}] # this is optional for simulated environments
  • [-a {list of antennas, comma separated}]
  • -b 3 -s Venus # TP, usually band 3 and a planet as a source
  • -b 3 -o 1924-292 # SFI, band 3 and a quasar as a source
  • -b 3 -s 1924-292 # SFI, band 3 and a quasar as a source


The usual regression test result relies on the observation scripts completing successfully (no visible errors). In this context, "functional" means there is no real data to reduce, because no real or raw data is produced by the CORR simulation.

Other results depend on the test at hand, e.g. reviewing a log to verify that a simulated device was really initialized; this must be checked case by case.

TelCal Standalone

TelCal, or Telescope Calibration, is a subsystem of the ALMASW in charge of performing logical calibrations for the observations. TelCal has two modes, online and offline; the latter is often called TelCalSA, where SA stands for Standalone.

The standalone version is a set of TelCal C++ libraries plus a set of Python scripts packaged in a tarball; these Python scripts work on top of CASA. Since CASA (the installation) was removed as a dependency of the ALMASW and casacore was integrated as a third-party library, TelCal no longer builds its standalone version, because it needs a full installation of CASA, which is not configured on the ALMASW build farm servers.

Build TelCal Standalone


Now only one (complete) version of CASA is needed for TELCAL standalone on build machines, since the CASA for online is built in the ICD subsystem.
So install a (complete) version of CASA (40.0.21723-002 should work) in any directory (/alma/ACS-XYZ/casa is also fine), and make sure the CASAPATH variable in .bash_profile.acs has the right value (currently CASAPATH="$ALMASW_INSTDIR/casa $(uname -s | tr [:upper:] [:lower:]) local $(uname -n)").

This setup must be done in an ALMA environment with an existing and valid deployment (at least having the binaries in place), because the TELCAL build relies on other modules, specifically ICD. The branch to use must be the same branch deployed in the STE or the configured environment.
  1. First, check out TelCal from svn: svn co{branch | trunk}/TELCAL
  2. Export CASAPATH, as described in the ticket, pointing to a full CASA installation (typically just an uncompressed tarball)
    1. grab one of the stable releases from (any >= to 40)
      1. preferably:
  3. Be sure that the TELCAL_ONLINE environment variable is unset; otherwise the standalone version will not be built
  4. Follow the Software/STEMaintenanceDocumentation#Patches procedure until point #4 (inclusive).
  5. Within TELCAL: make build MAKE_PARS=-j{number of cores} # this will take some time
  6. Grab the tarball file generated in $INTROOT/lib/ and install it.
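Under the assumptions above (full CASA unpacked, TELCAL checked out on the right branch), the build steps reduce to something like this sketch; the build commands are commented out since they need a real checkout and INTROOT, the -j8 value is only an example, and the /alma/ACS-2014.6 default is the install directory mentioned earlier on this page:

```shell
#!/usr/bin/env bash
# CASAPATH as given in the ticket quoted above; ALMASW_INSTDIR defaults
# to the example install dir from this page if not already set.
export ALMASW_INSTDIR=${ALMASW_INSTDIR:-/alma/ACS-2014.6}
export CASAPATH="$ALMASW_INSTDIR/casa $(uname -s | tr '[:upper:]' '[:lower:]') local $(uname -n)"
unset TELCAL_ONLINE            # if set, the online (not standalone) version is built

# cd TELCAL
# make build MAKE_PARS=-j8       # will take some time
# ls "$INTROOT"/lib/*.tar.bz2    # the standalone tarball to install

echo "$CASAPATH" | awk '{print NF}'   # CASAPATH must have four fields
```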

Install TelCal Standalone

Once the tarball is packaged, this code snippet may help with the installation. CASA is the CASA tarball location without the .tar.gz extension, TELCALSA is the TelCal tarball location without the .tar.bz2 extension, and CASAROOT is where CASA is installed; it must be the same version used to build TelCal standalone.

#!/usr/bin/env bash
# unpack CASA, then reinstall the telcalsa directory inside it
tar xzf "$CASA".tar.gz
rm -rf "$CASAROOT/telcalsa"
mkdir "$CASAROOT/telcalsa"
tar xjf "$TELCALSA".tar.bz2 -C "$CASAROOT/telcalsa"
./ -v --check-paths
./ -v --casa-dir "$CASAROOT"

Test execution

Prepend casapy-{version}.../bin (the one where TelCal was installed) to the PATH, and add casapy-{version}.../lib to the LD_LIBRARY_PATH as well.
  • Run casapy-telcal
  • List the tasks: tasklist; the tc_* ones should be there. If not, load them, e.g.: execfile("../casapy-40.0.22208-001/telcalsa/telcalsa-20141017-2014-04-b/tasks/")
  • Let's use the pointing task: inp tc_pointing # or invoke help for more info
  • Execute the task: tc_pointing
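The PATH/LD_LIBRARY_PATH setup above can be scripted as follows; the casapy version directory is only the example used elsewhere on this page, and the $HOME default is an assumption:

```shell
#!/usr/bin/env bash
# CASAROOT: the CASA installation where TelCal standalone was installed
# (the version string here is just the example from this page).
CASAROOT=${CASAROOT:-$HOME/casapy-40.0.22208-001}
export PATH="$CASAROOT/bin:$PATH"
export LD_LIBRARY_PATH="$CASAROOT/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

# casapy-telcal            # then check with tasklist that the tc_* tasks exist
echo "${PATH%%:*}"         # first PATH entry should be $CASAROOT/bin
```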

Basically, the Python tasks are wrappers around the C++ classes, the same ones used by the online software. The result of a test depends on what is being tested and the expected outcome; often this kind of test just compares values or inspects the plots generated by the task.


-- AlexisTejeda - 2015-04-17
Topic revision: r9 - 2015-07-14, AkeemWells