We have at last gotten around to configuring the sideband flippers
so that the correlator produces net upper sideband data regardless
of the sideband of the data presented to the station board. We were
surprised to find that at X band the phase rocking due to the
residual delay error was magnified, rather than removed, by phaserr.
Until now we had set the sideband flippers as if the data were
always net upper sideband, never touched them again, and accounted
for net LSB in the phase calculation.
The reason is simple: in the default (USB) flipper state the phase
change is introduced before the filter, flipped in the odd FIRs,
and flipped again in the odd flippers, so phaserr has the right sign
to shift the average phase to zero in each subband. Similarly for
the even subbands, where neither the FIR nor the flipper changes the
phase. In the LSB configuration, the phase is flipped once by the
odd FIRs, or once by the even flippers, so every subband now has
its sense of phase reversed with respect to the phase error
introduced before the filter.
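The flip bookkeeping above can be sketched as a parity count. This
is only a model of the description in this memo (which flip happens
in which stage), not anything read out of the FPGA netlist:

```python
def net_phase_sign(subband_parity: str, sideband: str) -> int:
    """Net sign a pre-filter phase perturbation carries into a subband.

    Assumed model (from the memo, not the hardware design files):
      USB state: odd subbands flip in the odd FIRs and flip back in
                 the odd flippers; even subbands flip in neither.
      LSB state: odd subbands flip once (odd FIRs); even subbands
                 flip once (even flippers).
    """
    flips = 0
    if subband_parity == "odd":
        flips += 1          # odd FIRs always flip the phase
        if sideband == "USB":
            flips += 1      # odd flippers flip it back in the USB state
    else:                   # even subband
        if sideband == "LSB":
            flips += 1      # even flippers flip the phase in the LSB state
    return +1 if flips % 2 == 0 else -1
```

An even number of flips (every subband in the USB state) leaves the
correction sign matching phaserr; an odd number (every subband in the
LSB state) reverses it, which is why the rocking is magnified rather
than removed.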
I see three ways to deal with this.
1. Brent or Dave tells me about a bit in the filter FPGA that I
have overlooked and that can be set to account for this.
2. Make D1_DEPE in the filter FPGA a signed fraction and arrange
for software to provide the negative of the computed correction
factor (properly scaled) if the station board input is net LSB.
3. Ignore the issue, use the flippers as we have in the past,
account for the reversed fringe frequency once more in the phase
calculation, and convey the net sideband to the CBE (which currently
does not know it) so everything can be corrected at that stage.
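For the second solution, the software side could look roughly like
the sketch below: pack the computed correction into a signed
fixed-point word, negated when the station board input is net LSB.
The register width and binary-point placement here are assumptions
for illustration; the memo only proposes that D1_DEPE become a
signed fraction:

```python
def d1_depe_word(correction: float, net_lsb: bool, width: int = 16) -> int:
    """Pack a phase-error correction factor for a signed-fraction register.

    Hypothetical scaling: a two's-complement word of `width` bits with
    the binary point after the sign bit (range [-1, 1)). The actual
    D1_DEPE format would be whatever the filter FPGA implements.
    """
    if net_lsb:
        correction = -correction          # solution 2: negate for net-LSB input
    scale = 1 << (width - 1)              # value of the sign bit
    word = int(round(correction * scale))
    word = max(-scale, min(scale - 1, word))   # clamp to representable range
    return word & ((1 << width) - 1)      # two's-complement bit pattern
```

With this convention, +0.5 packs to 0x4000 and its net-LSB negation
to 0xC000; the negation costs nothing in the FPGA because software
supplies the already-negated word.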
I hope that the first solution is possible but, failing that,
suggest the FPGA change described in the second solution.