ALMA Nutator Specifications and Requirements

Last Update: JeffMangum - 20 Aug 2007



See for extensive discussion on requirements. Additional information can be found below...

-- AlWootten - 13 Jul 2005

Solar Observations

On 2007/08/16 Baltasar Vila Vilaro asked the following two questions, which resulted from the Nutator PDR:

  1. Currently they are calculating the heating of the Nutator during solar observations using the specified values for the solar power reflected by the main-dish panels with the Sun CENTERED on the beam. Some additional structural calculations have been made for a small radial temperature gradient of a couple of degrees. The action item is whether there is a need to cope with HIGHER radial temperature gradients across the Nutator mirror whenever observations are made NEAR the Sun.
  2. Whenever the Nutator is not used for an extended period of time, it is possible to STOW it (fix its position with specific pins and power it down). The current spec for the stow accuracy is as stringent as that for pointing, which is QUITE hard to meet. The Manufacturer wants to know whether it would be possible to relax that spec somewhat and instead measure the actual OFFSETS with sensors, returning the values to the main computers for inclusion in the pointing. (The actual spec is negotiable.)

JeffMangum responded to each point:

  1. How exactly was the final specification for solar observations described? The approved version of the specs said the following regarding solar observations:
    • ALMA-35.03.--.00-002-A-SPE, Version of 2006-05-09: In the "Environmental Conditions" list, it reads: "The Nutator System shall operate with no degradation when experiencing any combination of the following environmental conditions unless otherwise stated in this section"...a list follows (temperatures, temperature gradients, wind, dust, and solar flux). For Solar Flux the number is: up to 6200 W/m2. In the "Solar Observation Conditions" subsection (5.3.2) it reads: "The ALMA Antennas will be used regularly for solar observations that can last an entire day. Thus, the Subreflector shall be capable of sustaining a continuous maximum heat flux of 6200 W/m2 with no loss in performance". But it also notes (as already agreed by the Sci IPT) that "the Nutator system is not required to satisfy the switching requirements but must meet all the other requirements". As far as I am concerned this means that the nutator should meet spec during any solar observation, whether observing the center of the Sun or off-center.
  2. I guess the question is whether the position sensors (jacks for hexapod, linear sensors of some type for translation stages) can be used during stow to monitor the absolute position? If powering the system off results in a reset of these sensors, then I would have a problem with not being able to know where the nutator was positioned during stow. It is also not clear to me if the nutator will maintain its position during a stow-state. If it does, then perhaps a measurement of the nutator position just after stow but before power-off could be used as a reliable position indication.

-- JeffMangum - 17 Aug 2007

ALMA Nutator Scientific Requirements by AlWootten summarizes much of the discussion below.

Minutes of 25 May Nutator Telecon

Nutator-Computing ICD DAR #150

Scientific Requirements

-- JeffMangum - 14 Apr 2005

Much of the following is distilled from the general discussion section below...

  • Maximum Beam Throw ± 3.0'
    • Driven by 40 GHz == 7.5mm ==> FWHM = 2.15', so you need at least ±2.15' throw, maybe even a bit more. NOTE: This requirement may be a bit soft, as BS at 40 GHz may not be necessary (see Questions and Answers below).
    • The prototype nutator had two modes:
      • Up to +- 1.5 arcmin in 10 ms, 10.4 Hz repetition rate
      • Up to +- 5 arc min in 20 ms, 5.2 Hz repetition rate
  • 2 Hz motion
    • The transitions are triggered by the 48 ms system pulses. This sets the fastest repetition at 96 ms = 1 / 10.4 Hz. Unless the pulse rate changes, that's it.
  • Dynamic balance
  • 10ms transition time
  • Minimal effective surface degradation
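The throw requirement above follows from simple beam geometry. A short Python sketch, assuming the FWHM ≈ lambda/D convention that the 2.15' figure implies, and a 12 m dish:

```python
import math

C = 299792458.0        # speed of light [m/s]
DISH_DIAMETER_M = 12.0

def fwhm_arcmin(freq_hz, diameter_m=DISH_DIAMETER_M):
    """Beam FWHM approximated as lambda/D, converted to arcminutes."""
    wavelength_m = C / freq_hz
    return (wavelength_m / diameter_m) * (180.0 / math.pi) * 60.0

# 40 GHz (7.5 mm) on a 12 m dish gives roughly 2.15 arcmin,
# hence the requirement for at least a +/-2.15 arcmin throw.
print(round(fwhm_arcmin(40e9), 2))
```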


See James Lamb's discussion in ALMA Memo 246. James' recommendation from this Memo is as follows:

"It is clearly very important to decide on the nutation frequency, duty-cycle, and beamthrow to finalize the decision on optics. We recommend switching close to the center of mass at a switching rate of 5 Hz or less and 90% duty-cycle, and statically but not dynamically balancing the mirror."


May 2000 discussion

Guilloteau comments on the nutator for the ACA antennas (at this time, only n small antennas comprised the ACA).

Nutator Monitor and Control

BrianGlendenning points out that, at one time the proposed monitor and control interface was to be as follows:

  1. A program was to be defined with 1ms time resolution. The "program" would basically be a series of moves/dwells, and would run a set number of cycles or forever until stopped.
  2. The program would be strobed "active" by a 48ms pulse so it could be synchronized with data taking (at the time up to 2ms for TP detectors, 1ms for correlator auto-correlations).
  3. In the event of an error, a "sticky" flag would be raised that could be noticed for blanking/flagging purposes.

This model would have allowed complicated patterns at any rate the hardware was capable of, at the cost of having a local nutator 1kHz clock that was synchronized "well enough" with the 48ms one. I assume this is no great feat since they could be resynchronized every 48ms so surely stability wouldn't be much of an issue.

For some reason this model was not adopted (even though until a few months ago I assumed it was), and at this point I think we should leave the interface alone unless we are really forced to change it. But if we have to change it I think we could do much better.
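The proposed move/dwell programming model can be illustrated with a small sketch. This is a hypothetical representation, not the actual monitor and control interface; the Step structure, the expand_program helper, and the padding behavior are all illustrative assumptions:

```python
from dataclasses import dataclass

STROBE_MS = 48  # period of the system pulses that strobe the program active

@dataclass
class Step:
    position_arcmin: float  # commanded nutator position (illustrative)
    dwell_ms: int           # hold time, in 1 ms ticks

def expand_program(steps, cycles):
    """Expand a move/dwell program into a per-millisecond position
    timeline, padding each cycle out to a 48 ms strobe boundary."""
    timeline = []
    for _ in range(cycles):
        for step in steps:
            timeline.extend([step.position_arcmin] * step.dwell_ms)
        remainder = len(timeline) % STROBE_MS
        if remainder:
            timeline.extend([timeline[-1]] * (STROBE_MS - remainder))
    return timeline

# A simple +/- switch: dwell 38 ms per beam with 10 ms transitions,
# giving a 96 ms cycle that lines up with every other 48 ms pulse.
program = [Step(+1.5, 38), Step(0.0, 10), Step(-1.5, 38), Step(0.0, 10)]
timeline = expand_program(program, cycles=1)
print(len(timeline))  # 96
```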

Performance of the Prototype Nutator

In general, the prototype nutator did not perform well on the ALMA prototype antennas. Its biggest problem was that it would fail (go into its limits) in windy conditions (winds greater than a few m/s).

In the following I list the problems, some of which are not design failures, but limitations of this particular design:

  • Went into its limits when wind speeds were greater than a few m/s.
  • Was not weather proof. Water leaked into its electronics, causing connector failures.
  • Could only be stowed by-hand with a stow pin.

-- JeffMangum - 27 Apr 2005

Nutation Reference Point


The following discussion of aberrations is distilled from ALMA Memo 246 (James Lamb)

Nutation of a secondary mirror not only tilts the wavefront to give the desired beam throw, it leads to two undesirable aberrations:
  • Coma, which produces path errors proportional to (theta_s), and gain degradation proportional to (theta_s)^2
  • Astigmatism, which produces path errors proportional to (theta_s)^2, and gain degradation proportional to (theta_s)^4.

In general, the aberrations are dominated by coma, except when the center of rotation is near the prime focus, where the astigmatic term dominates. The ray tracing calculations presented by James Lamb in ALMA Memo 246 (Figure 3) indicate that the effective surface error degrades by approximately:
  • 45 microns (from 20 to 65 microns) for a beam throw on the sky of +-3 arcmin
  • 25 microns (from 5 to 30 microns) for a beam throw on the sky of +-1.5 arcmin
  • 15 microns (from 1 to 16 microns) for a beam throw on the sky of +-0.75 arcmin
Note that:
  • This error is large in spatial scale, stable, and common to all antennas, so should be calibratable for extended source measurements.
  • For point source measurements, this additional error represents a loss in gain.
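The point-source gain loss from this extra effective surface error can be estimated with the Ruze formula. The 345 GHz example below is illustrative (not taken from the memo); it shows that the ±3 arcmin case (20 to 65 microns) would cost more than half the point-source gain at that frequency:

```python
import math

C = 299792458.0

def ruze_gain_ratio(total_err_um, baseline_err_um, freq_hz):
    """Fraction of point-source gain retained when the effective rms
    surface error grows from baseline_err_um to total_err_um (Ruze)."""
    wavelength_um = (C / freq_hz) * 1e6
    k = (4.0 * math.pi / wavelength_um) ** 2
    return math.exp(-k * (total_err_um ** 2 - baseline_err_um ** 2))

# +/-3 arcmin throw: effective error grows from 20 to 65 microns.
# At 345 GHz this retains only about 45% of the point-source gain.
print(ruze_gain_ratio(65.0, 20.0, 345e9))
```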

Darrel Emerson comment, 2005-05-27

On the permissible image degradation
Looking at Fig 3 in James' memo: you can draw horizontal lines
on the plot corresponding to the number of microns, for a given frequency,
that degrade the antenna gain by 1%.  If nothing else, that means that
a coma lobe can't be greater than 1%.
   Just using Ruze, assuming a 25 micron dish surface error from other
causes, I calculate that the allowable surface degradation from nutator
switching for a 1% degradation in gain is:

     Frequency  Microns
      30        79.8
     100        24.1
     300        8.4
     950        4.2

   So, you can draw horizontal lines on James' Fig 3 at
4.2, 8.4, 24.1 and 79.8 microns to correspond to the 1% degradation
at 950, 300, 100 and 30 GHz.  Clearly the higher frequencies are
more difficult.
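Darrel's table can be approximately reproduced by inverting the Ruze formula: an extra error epsilon, added in quadrature to the 25 micron dish error, degrades the gain by 1% when epsilon = (lambda / 4 pi) * sqrt(-ln 0.99). The sketch below matches the table closely at 30 and 100 GHz; the 300 and 950 GHz table entries come out somewhat larger than this simple calculation gives, consistent with the by-eye nature of the estimates:

```python
import math

C = 299792458.0
DISH_ERROR_UM = 25.0  # assumed surface error from other causes

def allowable_nutator_error_um(freq_hz, gain_loss=0.01):
    """Extra rms error, added in quadrature to the dish error, that
    degrades the gain by gain_loss (here 1%) via the Ruze formula.
    Note: the quadrature sum makes this independent of DISH_ERROR_UM."""
    wavelength_um = (C / freq_hz) * 1e6
    return (wavelength_um / (4.0 * math.pi)) * math.sqrt(-math.log(1.0 - gain_loss))

for f_ghz in (30, 100, 300, 950):
    print(f_ghz, round(allowable_nutator_error_um(f_ghz * 1e9), 1))
```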

   If +/-1' nutation is required at 950 GHz, you can draw a line
on Fig 3 that's vertically 66.6% of the 1.5 arc min curve James drew.
(For small angles the effective rms surface error on the vertical
axis seems to be linearly proportional to the beam throw, which
probably isn't surprising.)
The intercept of this new +/-1' line with the horizontal line
we just drew at 4.2 microns defines how far the center of rotation
can be from the prime focus for 950 GHz.  Just by eye, I get about
62 mm for that.
   Looking at figs 5 and 6 in James' memo, that's already a significant
easing of engineering requirements compared to rotating about the
prime focus.  However, I don't know if it's enough.  This is the
point where we need to talk to mechanical engineers.

   Note that if +/-1 arc min at 300 GHz rather than at 950 GHz were
the requirement, the rotation center could be about 170 mm from the
prime focus, which from the engineering point of view has achieved about
90% of the relaxation you'd get by rotating about the center of mass.
(Just by-eye estimates, no calculations here.)
   Alternatively, if 2% degradation were acceptable at 950 GHz,
then the rotation center could be about 100 mm from the prime focus
for a +/-1' beam throw, which would help engineers a bit too.

   Most of these numbers are by-eye estimates from James' graphs,
so it wouldn't be hard to be a bit more precise.  However, I think
the numbers are about right, and probably sufficient to start a dialog
with mechanical engineers.  Any thoughts?


The prototype nutator nutates about the secondary mirror center of mass. Nutation about the prime focus incurs some mechanical disadvantages (see ALMA Memo 246, James Lamb):

  • The moment of inertia (I) decreases as zc^2, where zc is the distance of the center of rotation from the prime focus. See Figure 5 (bottom) in ALMA Memo 246.
  • The ratio between the beam throw on the sky (theta_sky) and the angular throw of the subreflector (theta_s) decreases as zc. See Figure 5 (top) in ALMA Memo 246.
  • For a fixed switching frequency and blanking time, the peak and average power required to drive the nutator increases as I. From Figure 6 in ALMA Memo 246, the peak power required is approximately 10 times larger for rotation about the prime focus than for rotation about the center of mass.

Questions and Answers

Beam Throw Versus Frequency / Efficiency Tradeoff

  1. Since the larger beam throws are necessary only at the lower frequencies, where the degraded antenna efficiency is less of a problem, is the degraded performance due to a center-of-mass chopping secondary acceptable? -- JeffMangum - 14 Apr 2005
    • Al Wootten correctly pointed out that at high frequencies, while you might not need to have a big throw (because your beam is smaller), you still might want to have a big throw (because the sources are large) -- that will improve your sensitivity for large sources, as you aren't differencing so many noisy quantities. So I think the first question is a bit off the mark. -- MarkHoldaway - 27 Apr 2005
      • I am not sure that what Al says is correct. Can't we use EKH to restore BS measurements of a largish source? -- JeffMangum - 28 Apr 2005
  2. Can fast position switching at the lower observing frequencies give us the same performance as a beam switched measurement, thus avoiding the problems encountered with a center-of-mass chopping secondary for large throws? -- JeffMangum - 14 Apr 2005
    • To answer this question, I would say "DO OTF" rather than fast position switching or BS for the lowest frequencies. Or, fast position switching for spectral line. For the smallest sources (ie, one beam width), I think you take a hit in sensitivity for OTF, as you spend a lot of time off-source. For somewhat larger sources, OTF becomes more efficient. I don't think we should expect every single observing mode at every frequency to be absolutely optimal. The SZ people may disagree, however. Again, in continuum, it is the 1/f noise that will usually be the limit, not the atmosphere. -- MarkHoldaway - 27 Apr 2005

More Miscellaneous Questions

  1. Will there be multiple modes of operation, and if so, how many and what are they? The prototype Nutator can do +/0/- as well as +/- switching. Is this necessary?
  2. What will be the duty cycle?
  3. What are the expected number of cycles in lifetime of Nutator?
  4. Do we still need space for calibration equipment in the nutator?
  5. What pointing accuracy is required (For stowed and when switching)?
  6. What surface accuracy is required of the subreflector?

General Discussion

On 2005/01/15 Al popped the question...

Hi Mark and Darrel and Simon and Jeff,

We need to begin setting specs for the nutator.

Are the current amplitude and throw suitable? The Construction Project Book specifies:
Throw ± 1.5', 10 Hz motion, dynamic balance, 10 ms transition time.

Is a single direction throw suitable?

We all agree it needs to nutate about the focal point I think.

Does anyone know where more complete previous specs are as the basis 
for discussion?  Did the nutator meet these specs?

Clear skies,

Faster than photons fly Mark responded...


1) The beam throw isn't big enough for 7mm:

40 GHz == 7.5mm ==> FWHM = 2.15'
SO, you need at least ±2.15' throw, maybe even a bit more,
unless we are cutting & running on Band 1.

2) I don't think 10Hz is fast enough.

My take on rate:  for freq = 650, 850 GHz, 10 Hz
switching could well be insufficient, or at least
non-optimal.  Simon's nutator had an 80 or 90% duty cycle
at 10 Hz -- I would have though that it could then go for
20 Hz at 60 or 80% duty, but I think he had a reason
why it couldn't (a reason better than "software").

IF we can get a nutator that goes at like 10 Hz and 90% duty
cycle, AND we can reprogram it to be 20 Hz and 80% duty

3) I think constant EL throws are fine.


Simon then responded with some very useful links...

Al, Mark,

The prototype nutator has several modes:

1) up to +- 1.5 arcmin in 10 ms, 10.4 Hz repetition rate,
2) up to +- 5 arc min in 20 ms, 5.2 Hz repetition, and
3) three way nutation, left-center-right-center, with these transition times.

The transitions are triggered by the 48 ms system pulses. This sets 
the fastest 
repetition at 96 ms = 1 / 10.4 Hz. Unless the pulse rate changes, 
that's it.

The prototype nutates about the secondary mirror center of mass. 
Nutation about the prime focus incurs an extreme mechanical 
disadvantage. For the same nutation parameters, James Lamb 
(memo 246) estimated the power requirement is ten times larger 
for nutation about prime focus! The size (mass) of the mechanism 
will likely scale with the power requirement. Two cautionary tales: 
The SEST nutator moved about the prime focus and was abandoned. 
The APEX nutator moves about the prime focus and is late.

Two axis nutation (tip-tilt) would be much more complicated.


Via Marc Rafal, Nicholas Emerson provides the following summary of the pivot position issue...

From: Nick Emerson [] 
Sent: Tuesday, April 12, 2005 5:31 PM
To: Marc D. Rafal
Subject: nutator pivot position

I have been looking into the nutator pivot location and why it is where 
it is.  I went over most of these on the phone with you, but here they 
are summarized.

The current nutator design has the subreflector rotating about its 
center of inertia.  Mechanically this is the most efficient location, 
and results in the smallest design.  However this also results in a 
larger effective rms surface error when switching off of boresite.  The 
ideal location (to reduce surface error) for the pivot point is the 
prime focus, which is 215mm further back than the current pivot point.  
This leads to lots of other problems, some described below.  I am using 
equations and estimates from James Lamb's MMA Memo 246 for some of 
these, and also input from Simon and Darrel.

- The motors are currently used to counterbalance the movement of the 
subreflector.  If the center of rotation was about the prime focus, then 
the motors would have to be moved back beyond the prime focus.  I don't 
think there is room for this in the current design, without pushing back 
the whole apex structure, which would compromise the close packing 

- If the pivot is at the prime focus, the subreflector has to rotate 
about 66% further than it does now.  To maintain the <10ms transition 
time you would probably need larger motors, which probably means more 
weight.  Larger motors are hard to find without more complications.

- As the rotation center moves away from the center of inertia of the 
secondary, the moment of inertia of the mirror+additional counterbalance 
needed rises.  At the prime focus it is on the order of 3 times larger 
than when at the secondary mirror center of inertia.  Again, this 
probably means larger motors, and more power needed (J. Lamb estimated 
about 10 times more power needed for prime focus rotation)

Simon mentioned two nutators that have been designed to rotate about the 
prime focus.  One at IRAM for SEST, which was expensive and never 
completed.  Another for Apex which is apparently late, expensive, and 
has a worse switching performance than ours.  I don't have the 
specifications of theirs though. 

The surface degradation with the rotation at the secondary center of 
inertia may not be quite as bad (as in the plot in Memo 246) if you 
consider that at higher frequencies the beamwidth is smaller so the 
switching distance needed will be lower, so the effect of rotating the 
secondary will have less of an impact.  At lower frequencies, although 
you have to switch further, the surface error doesn't have as much of an 
effect on efficiency.

A more detailed analysis needs to be done to compare the pros and cons 
of the two cases, and perhaps to find a compromise between them.


-- NickEmerson - 18 Apr 2005

Via Nick, Mark adds his perspective on the issue


I don't claim to have the total view on nutators,
but I may see some issues with a valuable perspective.

OK, we've got this conflict about the nutator design:
about the center of mass, or about the prime focus?

Which antennas need nutators, and when?
This is important, because of the close packing issue. The four ACA 12m
dishes will have nutators, but they won't be close-packed, so we could put
any design on them that we want to, except that the contract has already
been signed, and this change in the ACA 12m design will cost something (and
also, they already have one built that would need to be retrofitted). (If we
put prime-focus-rotating nutators on the four ACA 12 m antennas, we MAY need
to shift the locations of those four antennas, moving them out by a meter.
Very low impact.)

We also need nutators on the antennas at the OSF as we shake them down.
We'll probably need two there?  Could get by with one?  I forget what the
integration/commissioning plan requires. Anyway, most of that work, being at
90 and 230 GHz, could be done JUST FINE with a center-of-mass nutator.

Will any other antennas in the ALMA-64 need nutators?

What Errors Do The Nutators Make?
Slow nutators (ie, prime-focus-rotating nutator?) result in imperfect
cancellation of atmospheric emission, which means HIGHER NOISE, especially
at the highest frequencies (but at some point OTF will step in, so not all
is lost  --  of course, you may argue that if OTF is stepping in, WHY have
this discussion at all?   Safety net.)

Center-of-Mass nutators will degrade efficiency at high frequency because
they add surface errors.   HOWEVER, the required beam throw distance will 
decrease with increasing frequency.  At 40 GHz, we need to go over 2 arcmin,
which gives ???? 35 micron errors -- which is NOT SO BAD at 40 GHz. At 400
GHz, we need only go 0.2 arcmin -- the surface errors and loss of
sensitivity will be much smaller.   QUESTION:  how does this sensitivity
loss compare with the sensitivity loss due to slower nutation in the
Prime-focus-nutator case?

Center-of-Mass nutators will have a systematically incorrect beam. As in the
preceding paragraph, this won't be a problem at low frequencies, and it
won't be a huge problem at high frequencies because the throws will be
smaller and smaller.   I think the most important thing is that this error
is symmetric, ie, that the beam integral is the same for ON and OFF --
which I think should be the case?  HOWEVER, if beam shape error IS a
problem, as James Lamb notes you can in principle correct for it as it will
be a reproducible error, common to all antennas measuring total power.
Basically, we would need to deconvolve the total power data for this odd
beam-shape.  We would have a lookup table with every COMMON beam-throw and
frequency combination, with the ON and OFF 2-d beam shapes tabulated, and
then perform a deconvolution before using the total power data in imaging.
With the ACA, I believe (but don't know for certain) that we are not
planning to deconvolve the total power  --  if we were doing "homogeneous
array" mosaicing, basically without the 7m dishes, we would have to
deconvolve the total power data anyway.  This current problem would make the
deconvolution more complicated, though, as the 2-D beam would rotate on the
sky with time. This is complicated, but not insurmountable. It would give
us something to do in the period 1-5 years after ALMA commissioning.

Basically, my estimate on this is that a non-prime-focus nutator would
make some errors, but would also permit pretty good imaging, and that we
would eventually make software/imaging improvements to make it be as good
as a prime-focus nutator (I am assuming that increased noise from a slower
prime-focus nutator will be about equal to the increased noise from a
center-of-mass nutator). I suggest we come up with a compromise design
which is as close to the prime-focus nutator as the mechanical engineers
are happy going.


Via Nick, Al responds to Mark's comments

I'm not so sure about this, as I think the assumption that one can get
away with small throws at high frequencies isn't correct.  But we need to
put numbers to these things and get real specs and reqs out.  Sources
get bigger in general at higher frequencies, and the big sources are the
ones for which we need the TP data.  OTF while chopping using the
Emerson-KH algorithm can recover some of this perhaps.

Perhaps the DRSP could guide us on high freq source sizes but I think it
is somewhat skewed to SCUBA sources and small objects by the direction the
buffalo herd is currently embarked upon.


Via Nick, Mark responds
OK -- good point, Al. However, as we go to larger sources, we will
cross over into the regime where OTF is more favorable than beam switching.
I should look at that again with more current numbers on beam throw,
switching rates, etc, but my old work indicated that the cross over point
was at fairly small source sizes.

I request some guidance: how many antennas will indeed be getting nutators?


Al responds to the "How many Nutators Question"
Hi Mark

I think only the JP 4x12m get new nutators.  I don't know if the current
nutators can be used on production antennas for testing.  There should be one spare,
so the total complement should be five.


Summary of APEX Nutator performance goals from Dr. Konrad Pausch from Vertex


the chopper we are building for the APEX telescope is ready and we will
perform in-plant acceptance testing in the next couple of days. We have a
target specification which we hope to achieve. The chopper fits into the
ALMA prototype headpart. It has a CFRP subreflector with the ALMA optics.
Unfortunately the chopper spec. is written in German language. Its key
requirements are translated as follows.

Amplitude on sky 0 to ± 300 arcsec for frequencies < 2Hz. This means for
the APEX/ALMA optics with 96m total focal length  and a distance of 5.882
meters between subreflector vertex and main reflector focus ±0.68 degrees
tilt angle of the subreflector. The frequency is adjustable from 0 to 2 Hz
w/o steps. The end positions on the sky shall be held for 43% of the chop
time each with 1 arcsec (3sigma rms) accuracy (i.e. app. 87% efficiency at
2 Hz).
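The quoted numbers are consistent with the small-angle relation theta_tilt = theta_sky * F / (2 d). This relation is inferred here from the values above rather than taken from the APEX specification:

```python
FOCAL_LENGTH_M = 96.0      # total effective focal length quoted above
VERTEX_TO_FOCUS_M = 5.882  # subreflector vertex to main reflector focus

def subreflector_tilt_deg(sky_throw_arcsec):
    """Subreflector tilt for a given on-sky throw, via the small-angle
    relation theta_tilt = theta_sky * F / (2 * d)."""
    tilt_arcsec = sky_throw_arcsec * FOCAL_LENGTH_M / (2.0 * VERTEX_TO_FOCUS_M)
    return tilt_arcsec / 3600.0

# +/-300 arcsec on the sky -> roughly +/-0.68 degrees of tilt,
# matching the figure quoted for the APEX/ALMA optics.
print(round(subreflector_tilt_deg(300.0), 2))
```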

Position readouts every 8 ms via the ABM CAN bus. Every single chop motion is 
initiated via a blank/sync pulse from the ABM.  The subreflector mass is 
balanced by a separately motorized ballast (linear motor). There is a
motorized position lock in the center. Chop axis is perpendicular to the
telescope el axis. No visibility through the center axis is possible.
Environmental conditions as for ALMA.

I hope this information is useful for you. Maybe next week I can tell you
what numbers we finally achieved.

Best Regards.

Dr. Konrad Pausch 

Darrel Emerson comment, 2005-04-19

On the maximum useful chopping distance

This is oversimplified, but there's a frequency-independent upper limit to how far it's useful to chop. If the beam switching is used to reject the effects of variations in the atmosphere, this relies on the same atmosphere being in both near-field beams. Roughly, the beam in the near field is a cylinder of diameter equal to that of the dish, i.e. 12 meters. If you assume all the atmosphere is in a layer 2 km above the antenna, then if the beam throw is as much as arctan(12/2000), or about 20', there would be no overlap in the near-field beams and so no atmospheric cancellation at all. If you want to retain, say, 90% efficiency in cancelling the atmospheric variations, you'd only want to step about a tenth of that, so a total throw of 2', or +/-1'. This is independent of observing wavelength. It's a graceful degradation, so +/-2' would still give 80% atmospheric cancellation in this simple model; that's consistent with practical experience on telescopes of this size.

This ignores lots of things, including the atmospheric structure function - the large scale structure in the atmosphere does cancel even if the near field volumes don't overlap. However, it is the relatively rapid, fine scale (10-meter scale size or so) atmospheric variations that we're trying to cancel by beam chopping. Also, as you get away from the zenith, the effective distance to the annoying atmospheric layer increases, so the beam throw at which good atmospheric cancellation still occurs gets smaller. Further, the "tube" near-field antenna beam concept above hasn't taken account of the antenna illumination taper, so the max beam throw for good atmospheric cancellation might be reduced a little more.

If beam chopping is performed as a way of minimizing effects of atmospheric fluctuations, rather than avoiding receiver 1/f fluctuations, a beam throw of as much as 6', i.e. +/-3', may be of marginal use at any frequency.

Cheers, Darrel.
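Darrel's overlap argument can be checked numerically. A minimal sketch, using his simple 1-D "cylinder overlap" model with the assumed 2 km layer height:

```python
import math

DISH_DIAMETER_M = 12.0
LAYER_HEIGHT_M = 2000.0  # assumed height of the turbulent layer

def cancellation_efficiency(total_throw_arcmin):
    """Fractional overlap of the two near-field 'cylinder' beams at the
    turbulent layer height, in the simple 1-D model above."""
    separation_m = LAYER_HEIGHT_M * math.radians(total_throw_arcmin / 60.0)
    return max(0.0, 1.0 - separation_m / DISH_DIAMETER_M)

print(round(cancellation_efficiency(2.0), 2))    # ~90% for a 2' total throw
print(round(cancellation_efficiency(20.63), 2))  # no overlap near arctan(12/2000)
```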

MORE On the maximum useful chopping distance; and on switching speed

OK, I've just run a small mess of beam switching simulations, which are based on a frozen 2-D water vapor distribution at 500m elevation above the site, which is appropriate for night-time conditions (daytime may be higher). The frozen screen blows over with a 12 m/s velocity (what we think to be representative of winds aloft). The water vapor screen is consistent with typical phase structure functions (root structure function exponent of 0.6), and we scale the amplitude of the water vapor fluctuations to represent different conditions (ie, good sub-mm conditions would be like the 10% phase stability conditions). I divvied up the various observing conditions to the different frequencies the same way that I did for my earlier total-power memo (ie, mainly giving the good phase stability conditions to the highest frequencies, and throwing away the worst 10% of time).

I can then simulate total power observing through this atmosphere with various beam switching strategies. Beam overlap is accounted for in the ON-OFF subtraction, but beam overlap is not the entire story -- even if the beams do not overlap, the phase structure function indicates that nearby beams will have very similar water vapor columns. I also have realistic estimates of the thermal noise. The goal in total power observations: for the residual sky brightness fluctuations (converted to Jy) to be less than the thermal noise. (BTW: I don't do any accounting for optics wrt the distance of the beam throw or the throw center.)

Two quick notable results:

1) Contrary to Darrel's simple computation, for 10 Hz beam switching, I find fairly mild increase in the total noise (atmosphere + thermal) as the beam throw increases. For example, at 680 GHz (ie, very good phase conditions, but not the best, which are reserved for Band 10), the total RMS of all ON-OFF at 10 Hz is dominated by thermal noise until you get to throws of 10 arcmin:
Beam Throw     Total RMS
[arcmin]       [Jy]

0.1              0.054
0.215            0.052
0.464            0.055
1.00             0.050
2.15             0.054
4.64             0.054
10.0             0.075     
For small throws, we are seeing just thermal noise (0.050 - 0.055 Jy), and we are getting a similar contribution from the atmosphere at 10 arcmin throws to increase the total noise to 0.075. Similar behavior is seen in other bands. Note, however, that this noise is for wide bandwidth continuum observations, which will actually be dominated by 1/f noise (which I did not add for this work). Spectral line observations won't be bothered by 1/f, and their narrow bandwidth will have higher thermal noise, and so the atmosphere would not begin to dominate the same way at 10 arcmin throws.

2) Nick Emerson told me that the APEX prime focus nutator nutates at 2 Hz. My initial impression was that this was a bad thing (2 Hz, not prime focus). So I did some 2 Hz simulations as well. With NO atmosphere, just thermal noise, the RMS level for the same 680 GHz beam switched observations is about 0.02 Jy (ie, about a factor of sqrt(5) less than the 10 Hz noise values), but when we include the atmospheric fluctuations, that increases to between 0.04 and 0.06 Jy for the different beam throws (OK, I admit this result isn't clean yet, as there is too much fluctuation in those values), with no clear dependence on the throw distance (ie, at 2 Hz, we are dominated by the atmosphere blowing past in each integration, rather than the throw distance).

What is happening at 2 Hz is: we have longer switching cycles, the thermal noise is smaller, but the atmospheric fluctuations are larger, and they end up dominating. AGAIN, the same two warnings apply: these are wide bandwidth continuum results, and 1/f noise will actually dominate continuum observations; and spectral line observations will not be bothered by 1/f noise, and will have a much narrower bandwidth and hence won't be bothered by the atmosphere either (which is only a factor of 2-3 worse than thermal noise for our continuum case).

SO (and contrary to what I set out to prove): I think the 2 Hz nutator speed is actually not a problem. Continuum observations will be dominated by 1/f noise, and not the atmosphere, and spectral line observations will be dominated by thermal noise.

-- MarkHoldaway - 27 Apr 2005
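The sqrt(5) thermal-noise comparison between 2 Hz and 10 Hz switching in Mark's note follows from the RMS of a single ON-OFF difference scaling as the square root of the switching rate. A one-line sketch, where the ~0.05 Jy reference at 10 Hz is taken from the 680 GHz table above:

```python
import math

def thermal_rms_jy(rate_hz, ref_rate_hz=10.0, ref_rms_jy=0.05):
    """Thermal RMS of a single ON-OFF difference scales as sqrt(rate):
    slower switching integrates longer per half-cycle."""
    return ref_rms_jy * math.sqrt(rate_hz / ref_rate_hz)

# 2 Hz switching: a factor of sqrt(5) below the ~0.05 Jy seen at 10 Hz.
print(round(thermal_rms_jy(2.0), 3))
```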

The 10.6m antenna of the CSO has employed a chopping secondary for some time. The pivot point of the secondary is the prime focus; the torque exerted on the secondary support structure when the mirror chops is negligible, owing to the deployment of a moving counterweight system.

Continuum observing

The amplitude of the chop (also known as the chop throw), in arc seconds, can be set up to a maximum permissible value of 540". Normally observers use values between 90" and 120". One continuum default is to run the chopper at a frequency of 1.123 Hz with a chop throw of 90". Upon starting the chopper, the observer may need to tune it up so that a reasonable duty cycle (>80%) can be achieved. This entails setting the four parameters, P, D, I, and G, on the chopper control box in the sidecab to values specified in a table posted on the sidecab wall.

Heterodyne observing

The second parameter is the chopping frequency, in Hz. For heterodyne observing, there appears to be little or nothing gained by chopping faster than about 1/3 Hz. The observer should keep the chop frequency as low as possible, since the duty cycle drops dramatically as the frequency increases. The observer should also choose a chop frequency that does not have an integral harmonic at 60 Hz, in order to minimize 60-cycle pick-up (and cold head noise). The default is 0.321 Hz.


The default chopping frequency for continuum measurements is 7.8125 Hz for both SCUBA and the heterodyne receivers. This gives reasonable atmospheric cancellation and good mechanical performance for at least the shorter chop throws (1-2 arcmin). `Beamswitched' spectral line observations use a lower chop frequency, usually 1 Hz. Chop throws larger than about 3 arcmin are not recommended.

-- AlWootten - 27 Apr 2005

Topic revision: 2010-02-09, ToddHunter