PST Review Cookbook

PST Review Tasks

We have added functionality to the Reviews Setup page to include the VLA configuration and a date/time for the deadline that is used to set the proposal IDs properly. For this to work we have to perform several tasks in the right order. Below is an example for 15A and 15B. Some dates:

  • 07-Jul-2014: 15A CfP and PST release date.
  • 01-Aug-2014: 15A proposal deadline.
  • 05-Jan-2015: 15B CfP and PST release date.
  • 02-Feb-2015: 15B proposal deadline.

Time | Task | Comment
07-Jul-2014 | Close 14B Review Cycle | This can be done before this date, whenever the review process is complete.
07-Jul-2014 | Create 15A Review Cycle | This step will only exist for 15A.
01-Aug-2014 | Create 15B Review Cycle | This can be done before this date. Best not to wait until the day of the deadline!
05-Jan-2015 | Close 15A Review Cycle | This can be done before this date, whenever the review process is complete.
02-Feb-2015 | Create 16A Review Cycle | This can be done before this date. Best not to wait until the day of the deadline!

PST Review Cycles

Most of the PST functionality for the review process is contained within the Review page. PST administrators will need to use the Proposal List page, however, to assign or alter the assignment of reviewers as well as to change any content for an individual review. The PST review process is divided into different cycles, which are controlled at Reviews-->Reviews Setup-->Review Cycles. Because the VLA configurations are specified on the Review Cycles page, each new cycle should be created after the PST release but before the Call for Proposals is sent out.

  • Start Cycle: To start a new cycle click the "Add Cycle" button in the top right-hand corner. You will be prompted to add a title (e.g., 2011 Aug), a proposal deadline (e.g., 12A - 8/1/2011), a review deadline date (e.g., 9/20/2011), and which cycle to copy reviewers from (e.g., 2011 Feb). Beginning with 14B - 2/3/2014, the VLA array configurations that will be available for the specified semester are indicated with a set of checkboxes. Select all VLA array configurations that will be available. Currently, there is no "Any" configuration in this list; for 15A, we would like to add this (for the UI).
    • Tasks to perform before moving to the next phase: N.B., we need to move to the Reviewers Assigned stage before the scientific categories can be vetted. Since it takes time to set up the technical reviewers, in practice we only set up the science reviewers before moving to the next stage.
      1. Check that the following is set up properly:
        • Categories: The science categories should not change very often, but the technical categories might need to be updated every cycle.
          • Scientific categories at Reviews-->Reviews Setup-->Science Categories
          • Technical categories at Reviews-->Reviews Setup-->Technical Categories
        • Pools: The reviewer pools include all the potential reviewers. Each pool is a list of usernames that can be used for the specified review. The pool is not connected to any specific category.
          • Science Pool at Reviews-->Reviews Setup-->Science Pool
          • Technical Pool at Reviews-->Reviews Setup-->Technical Pool
          • TAC Pool at Reviews-->Reviews Setup-->TAC Pool
        • Reviewers: The reviewers are associated with the correct categories. For each cycle we need to associate the users with the specified category. Make sure each scientific and TAC reviewer is associated with a valid institution.
          • Scientific reviewers at Reviews-->Reviews Setup-->Science Reviewers
          • Technical reviewers at Reviews-->Reviews Setup-->Technical Reviewers
          • TAC reviewers at Reviews-->Reviews Setup-->TAC Reviewers

  • Assign Reviewers: The scientific reviewers are assigned to individual proposals automatically by clicking on "Assign Reviewers" under the Action column. Currently, the technical reviewers are assigned manually.
    • Tasks to perform before moving to the next phase:
      1. Process some VLA/VLBA technical reviews automatically (typically those < 20 hr). Procedure: go to the Proposal List page and filter on "VLA + VLBA"; select "All" proposals; and click on the Assign Default Technical Reviews button. You have to do this one page at a time. A dialog will pop up to allow you to choose the hour limit. When you click OK, it assigns the default category (mjc) and a default review text, and closes the review. Also, for the GBT, automatically process the technical reviews for proposals that only use MUSTANG2.
      2. Assign all technical reviewers (now done via file import).
      3. Change the proposer-suggested scientific category, if necessary, to the so called NRAO-vetted science category.
      4. Run script to reset any erroneous review flags.

  • Accept Reviews: Once the reviewers have been assigned, the next step is to start accepting individual reviews. Clicking on "Accept Reviews" under the Action column populates the My Reviews page (Reviews-->My Reviews) for each reviewer. The process starts with the reviewer checking for any conflicts of interest and accepting these conflicts by clicking on the "Accept Conflict Status" button. Then the appropriate proposals are available to the reviewer and reviews can be entered into the PST. A review has several states: not touched, saved, conflict, completed, or closed.
    • Tasks to perform before moving to the next phase:
      1. To change the conflict status go to Proposal-->Proposal List-->Review and edit the review. This cannot be performed in the UI if all reviews for a given reviewer have been completed; in that case the database has to be edited directly. A proposal should not be both conflicted and completed, so do not check both boxes.
      2. If there is a proposal that a reviewer is not comfortable reviewing then wait until the reviewer has completed all remaining proposals and then close out the remaining reviews. This is done by going to Review-->Reviews Summary, clicking the radio button for the reviewer in question, and then clicking the Mark Closed button. Reviews that have a closed status will not be considered for review.
      3. If there are any reviews that do not have either a conflict, complete, or closed status and yet all reviews are deemed done then use the Mark Closed button as described above. This has to be done per reviewer. N.B., make sure that indeed all reviews are thought to be done since closing a review out will remove it from further consideration.
      4. If a reviewer has finished reviews but has not checked the complete checkbox, and prefers that NRAO do this for them, the Mark Complete button can be used. This is done by going to Review-->Reviews Summary, clicking the radio button for the reviewer in question, and then clicking the Mark Complete button. This procedure will only work once, however, so make sure that the reviewer has entered in all information.
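The review states and the "done" rule used by the tasks above can be sketched as follows. This is a hypothetical illustration of the bookkeeping, not the PST's actual data model; the names `ReviewState` and `is_done` are invented for this sketch.

```python
from enum import Enum

class ReviewState(Enum):
    """The review states named in the Accept Reviews step (sketch)."""
    NOT_TOUCHED = "not touched"
    SAVED = "saved"
    CONFLICT = "conflict"
    COMPLETED = "completed"
    CLOSED = "closed"

def is_done(state):
    """A review needs no further attention once it is in conflict,
    completed, or closed; not-touched and saved reviews are still open.
    Note a review should be conflicted or completed, never both."""
    return state in (ReviewState.CONFLICT,
                     ReviewState.COMPLETED,
                     ReviewState.CLOSED)
```

The Mark Closed and Mark Complete buttons described above simply move a review into the CLOSED or COMPLETED state for an entire reviewer at once.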

  • Normalize Reviews: Once all of the individual reviews for a given SRP have been completed, the next step is to prepare for the SRP teleconference. The reviews are now normalized per SRP. On the Reviews Setup page (Reviews-->Reviews Setup) there is a separate link under the Action column for each SRP, for example "Normalize Reviews - Active Galactic Nuclei". Clicking on one of these links will normalize the scores and populate the Panel Reviews page (Reviews-->Panel Review) only for the SRP in question. This page provides a summary of all proposals for a given science category for the SRP teleconference. Here consensus comments are created, including any minor adjustments to the scores. The chair clicks "Finalize Reviews" to complete the process.
    • Tasks to perform before moving to the next phase:
      1. Make sure all reviews for each SRP are finalized. This can be performed either by the SRP chair or a PST admin.

  • Linearize SRP Scores: After the SRP teleconferences are complete and the SRPs have finalized their scores, we need to prepare for the TAC meeting. This includes supporting documentation that is sent to the TAC members. Currently this requires that data be exported from the PST into the PHT and GBSE. Clicking on "Linearize SRP Scores" under the Action column will generate the normalized linear-rank scores. (The GBT XML listed under the Downloads column is no longer used; it included data for older GBT software maintained by Carl Bignell.) The TAC meeting summary page will not be produced here since we want to release this at a later time. Also, the Panel Reviews page will still exist after this step.
    • Tasks to perform before moving to the next phase:
      1. Edit the list of holdover proposals. These are proposals that were submitted in previous semesters that need to be reconsidered by the TAC in the current semester. This often happens with multi-configuration VLA proposals. These proposals will show up on the TAC Meeting page with a darker background than normal proposals. This list cannot be edited using the UI, so one of the software folks has to manually input this info.
      2. Import linear-rank scores into the PHT.
      3. Import linear-rank scores into the GBSE.
      4. Add any TAC members. The SRP Chairs are by default TAC members and will be able to view the TAC Meeting page. But sometimes we want to include people who are not SRP chairs to be on the TAC (e.g., SRP chair cannot attend the TAC meeting). To include them you have to add the user in question to the TAC Pool and include them as a TAC Reviewer for the semester in question.

  • Accept TAC Reviews: Here we want to prepare for the TAC meeting. Clicking "Accept TAC Reviews" under the Action column will populate the "TAC Meeting" page (Reviews-->TAC Meeting). A summary of all proposals will be listed. Here the TAC will (optionally) enter comments for proposers and will assign priorities to each proposal. In practice, assigning priorities will happen using external tools and then be imported back into the PST.

  • Close Cycle: After the TAC meeting and the Director's review the cycle can be closed by clicking on "Close Cycle". No changes can be made to the reviews after this step.

Below is a table that summarizes the status and action for each stage of the review process.

Stage | Status | Action
1 | New | Assign Reviewers
2 | Reviewers Assigned | Accept Reviews
3 | Accepting Reviews | Normalize Reviews
4 | Reviews Normalized | Linearize SRP Scores
5 | SRP Scores Linearized | Accept TAC Reviews
6 | Accepting TAC Reviews | Close Cycle
7 | Closed |
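The table above is a simple linear state machine: each status has exactly one action that advances the cycle. A minimal sketch (the `STAGES` list and `next_action` helper are invented here for illustration, not part of the PST):

```python
# Status -> action pairs from the stage table; Closed is terminal.
STAGES = [
    ("New", "Assign Reviewers"),
    ("Reviewers Assigned", "Accept Reviews"),
    ("Accepting Reviews", "Normalize Reviews"),
    ("Reviews Normalized", "Linearize SRP Scores"),
    ("SRP Scores Linearized", "Accept TAC Reviews"),
    ("Accepting TAC Reviews", "Close Cycle"),
    ("Closed", None),
]

def next_action(status):
    """Return the action that advances a cycle with this status,
    or None if the cycle is closed."""
    for name, action in STAGES:
        if name == status:
            return action
    raise ValueError(f"unknown status: {status}")
```

For example, `next_action("Accepting Reviews")` returns "Normalize Reviews", matching stage 3 of the table.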

PST Scoring Guide

Background

Each science review panel (SRP) normally has five (5) reviewers. This does not include the SRP chair who will only review proposals that have conflicts of interest. There are eight (8) SRPs, each covering a scientific subset of proposal subjects. The chair of each SRP is a member of the time allocation committee (TAC).

Scoring Steps

  1. Raw Scores: Each reviewer reviews the proposals submitted to his/her panel and assigns a score. This is a number between 0.1 and 9.9, with a lower score indicating a better proposal. Since there are professional conflicts, the reviewer does not review a proposal on which s/he is conflicted, and this is indicated by a score of 0.0. Also, if a reviewer does not feel competent to evaluate a proposal they will not enter any review, and this is also indicated by a score of 0.0. Each reviewer will typically score 30-50 proposals.
  2. Normalized Scores: No two reviewers treat scores in the same way. To impose some uniformity, normalized scores are derived from the "raw" scores. Although the raw score distributions are often not Gaussian, there is some attempt to make the normalized scores Gaussian. For each reviewer, the mean and standard deviation of the (non-zero) raw scores are derived. The normalized score for each proposal is then: \hat{s} = a*s + b, where a = 2/\sigma(s) and b = 5 - a*\bar{s}. Here \bar{s} and \sigma(s) are the mean and standard deviation of the raw scores (s), respectively. If the parent distribution is Gaussian, this transformation yields a distribution with a standard deviation of 2 and a mean of 5.
  3. SRP Scores: Once all of the scores are normalized, the mean and standard deviation are calculated for each proposal. The mean normalized score is copied to a separate field called the SRP score. The value of the SRP score can be manually altered by the SRP chair during the SRP teleconference.
  4. Linearized Scores: When all SRPs have completed their reviews, the SRP scores are linearized by ordering them between 0 and 10. Specifically, the proposals are ranked (R=1, 2, 3, etc.) and then the linearized score is just: R*10/N, where N is the number of proposals in the SRP. This linearized score is used for the TAC meeting.
  5. Scheduling Priority: The TAC then assigns what are called scheduling priorities. Initially NRAO sets the scheduling priorities. (Add what is currently done here.) The table below describes these priorities which are slightly telescope dependent.
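The normalization (step 2) and linearization (step 4) formulas above can be sketched in a few lines. This is an illustrative reimplementation of the stated formulas, not the PST's actual code; the function names are invented here.

```python
import statistics

def normalize(raw_scores):
    """Normalize one reviewer's raw scores to mean 5, standard
    deviation 2: s_hat = a*s + b with a = 2/sigma, b = 5 - a*mean.
    Scores of 0.0 mark conflicts or skipped reviews and are excluded
    from the statistics (and left at 0.0)."""
    scored = [s for s in raw_scores if s > 0.0]
    mean = statistics.mean(scored)
    sigma = statistics.stdev(scored)
    a = 2.0 / sigma
    b = 5.0 - a * mean
    return [a * s + b if s > 0.0 else 0.0 for s in raw_scores]

def linearize(srp_scores):
    """Order N SRP scores and map rank R = 1..N to R*10/N.
    Lower scores are better, so rank 1 goes to the lowest score."""
    n = len(srp_scores)
    order = sorted(range(n), key=lambda i: srp_scores[i])
    linear = [0.0] * n
    for rank, i in enumerate(order, start=1):
        linear[i] = rank * 10.0 / n
    return linear
```

For example, raw scores [3.0, 5.0, 7.0] already have mean 5 and standard deviation 2, so `normalize` leaves them unchanged, while `linearize([3.0, 1.0])` ranks the 1.0 first and returns [10.0, 5.0].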

Priority | VLA/VLBA | GBT
A | The observations will almost certainly be scheduled. | Highest priority. They will be considered for scheduling for up to two semesters (one year). We expect that these projects will receive most, if not all, of their granted observing time.
B | The observations will be scheduled on a best-effort basis. | Next highest priority. They will be considered for scheduling for only one semester. We expect that these projects can receive a significant portion of their granted observing time. We recommend that the project team consider visiting Green Bank to help increase their chances of completing their project.
C | The observations will be scheduled as filler. | Lowest priority. They will be considered for scheduling for only one semester. We recommend that the project team consider visiting Green Bank to help increase their chances of receiving observing time on the GBT.
N* | The observations will not be scheduled because they were explicitly rejected by the TAC. | These projects will not be scheduled because they were explicitly rejected by the TAC.
N | The observations will not be scheduled because they could not fit in the time available. | These projects will not be scheduled due to the lack of available observing time.
H | Not assigned because the proposal is being held for consideration at a future TAC meeting. | These projects will be held over for re-consideration during the next proposal cycle.

Scoring Analysis

Add memo.

-- JoanWrobel - 2013-03-16
Topic attachments
Attachment | Size | Date | Who | Comment
pstScoresFred.pdf | 7 MB | 2013-06-17 | DanaBalser | Analysis document via Fred Schwab
Topic revision: r28 - 2018-08-01, DanaBalser