EJM: Anyone who is interested in using radio data as part of their research.
From Amanda Kepley:
We should better define who the users are to better focus the discussion. A quick definition and some thoughts below:
- An SRDP user is a graduate student in astronomy or physics or a Ph.D.-level astronomer or physicist. They are generally familiar with astronomy jargon (RA, Dec, etc.) as well as the astrophysical processes underlying radio astronomy (synchrotron emission), but may not be familiar with terminology specific to radio astronomy (e.g., primary beam, visibility, etc.). I anticipate three major groups of users:
- [CASUAL USER] Some users may want to use SRDP to get an image quickly without having to know much about radio astronomy. These users may use archival data at other wavelengths extensively, so connections with other data archives (e.g., MAST, NED) would be good here.
- [FUTURE POWER USER] Others might start at a basic level and then get more sophisticated (i.e., grad students, astronomers who have experience with data at other wavelengths and want to take advantage of the capabilities of ALMA/VLA, etc.). These users would typically start out with fairly basic questions, but would increase in sophistication as they gain experience with SRDP.
- [CURRENT RADIO ASTRONOMERS] Although SRDP is not aimed at the current radio astronomer population, per se, I think that interfacing with and maintaining this user base is crucial. Novice users will ask these users for advice on how to process their images. If they are able to answer, "yes, this SRDP thing will get you what you want, and it's good," then that's key. Even experienced users will likely tend to do things the easiest way. If you're teaching a bunch of classes, writing grants, and supervising students, of course you're going to use pre-made data products if you trust them. [Think pies from Whole Foods vs. making your own.]
List for elaboration
- Recalibration (re-execute the pipeline on a given data set)
- Combination (in same version)
- Standard Calibration:
- Standard Imaging:
- Optimized Imaging:
- Time Critical Observation:
- Large Projects Data
- Data Decimation / averaging / selection (pre or post calibration)
- Image selection and sub-selection
- PI Projects -- will want calibrated data and imaging products requested per their science goals.
- Archival Research -- calibrated data and uniformly reduced imaging products for analysis
To me, the 0th-order goal is having properly calibrated data available and ready for imaging/analysis, since the specific science products will vary based on the needs of the PI or archival user. That said, a uniform set of imaging products to assess the calibration/data quality is highly useful. In both cases, a tool that allows the user to access/image the data as they require in an intuitive way is desirable.
From Amanda Kepley:
- DDT or time-domain projects that need a quick turnaround on an image
- excluding pulsar-based projects, but this could include LIGO follow-up, SNe, etc.
- small PI-based projects (<n hours or <m data rate)
- large PI-based projects (e.g., CHILES or perhaps VLASS, although the latter is more of a service organization)
- The one-off archival user
- only interested in a handful of objects for say a proposal or to complement some other observations at another wavelength
- The large archival user
- interested in mining the archive for data and potentially re-processing large amounts of data
- The re-processing case could be resource intensive; should we have archival proposals?
- How should the calibration strategy be controlled so that we can process it? Or should we leave it open and only process what is doable?
- Where do we start? VLA or ALMA? Focus on one or the other first? Parallel?
- Do we change the VLA observing process to be similar to ALMA's, where projects continue until the science goals/targets are achieved, at least for grade A projects and filler projects (which have no actual end point)? This eliminates having incomplete programs taking up space in the archive (sure, this is controversial).
- Openly support all archival programs in a similar way? Or provide a proposal process for larger archival programs that require a significant amount of computing resources?
From Amanda Kepley:
- For large projects, should the project team play more of a role in the QA assessment and project setup of their data than, say, a small PI-driven project?
- I say yes. Large projects have a lot of responsibility to get the data right. Need to interface with them to make sure the data products they are getting meet observatory specifications.
- Should PIs be given the option of whether or not they want to use SRDP? This is related to Claire's email about the data products.
- Opt in? [Carrot]
- Could be as easy as having a box on the proposal: "Your project setup is compatible with SRDP. Would you like to have SRDP products produced for this project?" This would require using a standard observatory procedure to ensure that the data can be calibrated automatically by the pipeline using best practices.
- I.e., we are asking the PIs if they want to make life harder or easier for themselves. I think many people would choose the easy path if they have it.
- Need to make sure that the instrumental setup is okay for SRDP.
- Enforce limits? [Stick]
- 75% of projects need to be SRDP-able? If not, you have to justify it.
- I'm thinking that the opt-in approach is the one that might be most successful with the VLA user base. People can still do hard things, but can also make life easier for themselves.
- If someone is doing a large-archival project is that the same as a large PI-based project?
- I.e., do you include them in the data processing/QA?
- Although software is a major component of SRDP, I think that user education is of equal importance. This is a new way of working with radio astronomy data, and we need to communicate effectively with our users for them to make the most of it. What is the most effective way(s) to educate the astronomical community about SRDPs? Workshops (in person or online)? Online tutorials? Helpdesk? In-person visits? All of the above?