Imperative for large focal plane arrays (high data rate).
More than just a handy quick-look tool.
Defaults should be sensible to avoid the need to reprocess the data.
Defaults may be context specific (vary with use case).
Reprocessing is painful.
Not a data processing environment (i.e. not interactive like GBTIDL or AIPS).
Likely to be useful in processing data from other receivers with more leisurely data rates.
Other telescopes routinely provide roughly calibrated data to their users - most institutions consider this the starting point of data reduction.
Goals of the Prototype Pipeline (KFPA Critical Design Review - Jan. 2009)
Support KFPA commissioning
Explore new processing tools/techniques not yet widely available in Green Bank (vector calibration, statistical data flagging and editing, visualization, parallel processing); see the flagging sketch after this list.
Prototype an automated pipeline - add the necessary metadata to capture user intent
Prototype tools necessary to support larger focal plane arrays (e.g. parallel computing; see the parallelism sketch after this list)
Based on prototype tools, estimate costs associated with delivering a pipeline and necessary computing hardware to handle the expected data rates for a larger focal plane array.
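As an illustration of the statistical flagging and editing listed above, a minimal sketch (not code from the actual pipeline) that flags spectral channels deviating from the median bandpass by more than a threshold in median-absolute-deviation units; the function name and threshold are placeholders:

    import numpy as np


    def flag_outlier_channels(spectrum, threshold=5.0):
        """Return a boolean mask that is True for channels to flag."""
        residual = spectrum - np.median(spectrum)
        # 1.4826 converts the MAD into an equivalent Gaussian sigma.
        sigma = 1.4826 * np.median(np.abs(residual))
        return np.abs(residual) > threshold * sigma


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        spec = rng.normal(1.0, 0.05, 4096)   # simulated flat bandpass
        spec[1024] += 3.0                    # narrow interference-like spike
        mask = flag_outlier_channels(spec)
        print("flagged channels:", np.flatnonzero(mask))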
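A minimal sketch of the per-feed parallelism implied by the parallel-computing goal: each of the KFPA's seven beams can be calibrated independently, so the work maps naturally onto a process pool. calibrate_feed() is a hypothetical stand-in for the real per-beam calibration step, not an existing pipeline routine:

    from multiprocessing import Pool

    import numpy as np

    NUM_FEEDS = 7  # the KFPA has seven beams


    def calibrate_feed(feed_id):
        """Stand-in per-feed calibration: returns a (feed, Tsys-like) pair."""
        spectrum = np.random.default_rng(feed_id).normal(1.0, 0.05, 4096)
        tsys = 25.0 * float(np.mean(spectrum))  # illustrative value, not a real Tsys
        return feed_id, tsys


    if __name__ == "__main__":
        with Pool(processes=NUM_FEEDS) as pool:
            for feed, tsys in pool.map(calibrate_feed, range(NUM_FEEDS)):
                print("feed %d: Tsys ~ %.1f K" % (feed, tsys))

Distributing feeds across processes (or nodes) in this way is the kind of scaling the cost estimate for a larger focal plane array has to account for.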
complicated calibration schemes (e.g. "basketweaving" or its FPA equivalent)
cross-correlation
continuum
"The initial observing modes should be kept simple yet effective - even at the cost of reduced flexibility for observers." - GBT KFPA Critical Design Review Final Report.