Peer Annotation Instructions

The goal of annotating a peer summary is to identify all spans of text that match an SCU in the associated pyramid. Unlike in the DUC 2005 peer annotation, text that does not match any SCU should not be selected or annotated; we will use the modified pyramid score, which does not require unseen SCUs (peer content that does not appear in the pyramid) to be identified.
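
The score itself is computed outside the annotation interface, but it helps to know what the annotation feeds into. Below is a minimal sketch of the modified pyramid score, assuming it is the sum of the weights of the matched SCUs divided by the largest weight sum achievable with the average number of SCUs expressed in the model summaries; the function and variable names are invented for illustration and are not part of DUCView.

    def max_weight(pyramid_weights, num_scus):
        """Largest total weight a summary expressing num_scus SCUs could obtain,
        taking SCUs from the highest-weight tiers downward."""
        return sum(sorted(pyramid_weights, reverse=True)[:num_scus])

    def modified_pyramid_score(matched_weights, pyramid_weights, avg_model_scus):
        """Sum of the weights of the SCUs matched in the peer, normalized by the
        maximum weight achievable with the (rounded) average number of SCUs in
        the model summaries.  Because the normalizer does not depend on how much
        peer text was annotated, unmatched peer content need not be marked."""
        return sum(matched_weights) / max_weight(pyramid_weights, round(avg_model_scus))

    # Hypothetical numbers: a pyramid whose SCUs have these weights, and a peer
    # that matched three SCUs of weights 4, 2, and 1.
    pyramid = [4, 4, 3, 2, 2, 1, 1, 1]
    matched = [4, 2, 1]
    print(modified_pyramid_score(matched, pyramid, avg_model_scus=5))  # 7/15, about 0.47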

If you have not already done so, download the DUCView interface and familiarize yourself with its functionality by reading the section Getting Started with DUCView. Then load your pre-existing pyramid and read in your new peer summary. You can try out a sample using the D633-CDEH.pyr sample pyramid and the D633.M.250.G.10 sample peer summary.

It is easier to annotate the first peer if you first read through the pyramid to familiarize yourself with the topic and with some of the SCUs. Notice the three panes in the DUCView interface; if you see only two, move one of the vertical separators to expose the hidden pane. As described in the section Getting Started with DUCView, the left pane shows the peer summary. The center pane shows the pyramid SCU labels and weights; the number in parentheses to the left of each SCU label is the weight, that is, the number of model summaries that contributed to the SCU. The right pane shows the model summaries; if you select an SCU in the center pane, its contributors from the model summaries are highlighted in yellow in the right pane. An SCU of weight 4 will have four contributors, one in each model summary. If a single model summary has two or more spans of text highlighted in yellow, these discontinuous spans form a single contributor.

Load a peer summary. Go through the process of matching peer text against SCU labels until you can find no more matches. Save the annotation file. Repeat until you have completed all peers. Note that the annotation goes more quickly the more familiar you become with the pyramid.

  1. It is recommended that you always start with the "Edit > Autoannotate" option. This will automatically annotate text that is an identical match to text in the pyramid or in a previously annotated peer.
  2. Try to evaluate each clause, or each chunk of information, in the peer to see if it matches the meaning of an SCU. The match does not have to be exact, but should convey much the same information. (See Example 4; for a more difficult case, see Example 6.)
  3. A common type of surface mismatch involves tense or modality: because the summaries derive from clusters of news articles that may span many days or weeks, different summaries can refer to the same event as being in the past or in the future. Such differences alone should not prevent a match.
  4. If a peer summary repeats the same information, and that information matches an SCU X, then annotate as many matches to SCU X as there are repetitions of the information in the peer.
  5. A single text selection should not match more than one SCU. However, you can make partly overlapping text selections, especially to clarify why a given SCU is matched. For example, with a conjunction such as "a Welsh parliament with law-making and financial powers," you can select "a Welsh parliament with law-making ... powers" (the ellipsis indicates two discontinuous selections) and "a Welsh parliament with ... financial powers" (likewise) if there are two corresponding SCUs. (See Example 2 and Example 3.)
  6. If a span of text seems to match multiple SCUs that differ in weight, be especially careful in your choice, as it will affect the score (see the brief worked example after this list). The weight of an SCU is the number in parentheses next to the label, shown in the center "pyramid" pane of the DUCView annotation interface. If you are not sure what differentiates the two SCUs, go beyond reading the labels and read the highlighted text in the far right pane, which shows the text selections used to create the SCU from the model summaries.
  7. If an SCU and some information expressed in the peer do not have the same specificity, they can still match, but refrain from matching when the peer expresses something much more vague or abstract than the SCU you are considering. For example, consider "Elected Welsh assembly gains favour" in a peer; the sentence does not indicate with whom an elected assembly gains favour, or why. As a result, it is too general to match anything in the D633-CDEH pyramid; the closest candidate is an SCU labelled Elected Welsh assembly would give Wales more autonomy, but the peer sentence does not mention or imply the role of autonomy. (See Example 5.)
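
As a brief worked example for item 6, with made-up numbers: if a peer matches SCUs of weights 4, 2, and 1, the numerator of the score sketched above is 7; mistakenly choosing a weight-1 SCU where a weight-4 SCU was the correct match would lower it to 4, a substantial drop from a single annotation decision.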
