Assessment of Different Workflow Strategies for Annotating Discourse Relations: A Case Study with HDRB.

CICLing (1), 2013

Abstract
In this paper we present our experiments with different annotation workflows for annotating discourse relations in the Hindi Discourse Relation Bank (HDRB). In view of the growing interest in developing discourse data-banks based on the PDTB framework, and the complexities associated with discourse annotation, it is important to study and analyze the approaches and practices followed in the annotation process. The ultimate goal is to find an optimal balance between accurate description of discourse relations and maximal inter-rater reliability. We address the question of how the choice of annotation workflow for discourse affects the consistency, and hence the quality, of annotation. We conduct multiple annotation experiments using different workflow strategies and evaluate their impact on inter-annotator agreement. Our results show that the choice of annotation workflow has a significant effect on the annotation load and on annotators' comprehension of discourse relations, as reflected in the inter-annotator agreement results.
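The abstract evaluates workflows by inter-annotator agreement but does not state the metric here. Purely as an illustration, the sketch below computes Cohen's kappa, a standard chance-corrected agreement measure for two annotators; the label names and annotation lists are hypothetical and not taken from HDRB.

```python
from collections import Counter

def cohens_kappa(ann_a, ann_b):
    """Chance-corrected agreement between two parallel label sequences."""
    assert len(ann_a) == len(ann_b) and ann_a, "need equal-length, non-empty annotations"
    n = len(ann_a)
    # Observed agreement: fraction of items labeled identically.
    p_o = sum(x == y for x, y in zip(ann_a, ann_b)) / n
    # Expected agreement by chance, from each annotator's label distribution.
    counts_a, counts_b = Counter(ann_a), Counter(ann_b)
    p_e = sum(counts_a[l] * counts_b[l] for l in counts_a.keys() | counts_b.keys()) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical discourse-relation labels from two annotators.
a = ["Contrast", "Cause", "Cause", "Contrast"]
b = ["Contrast", "Cause", "Contrast", "Contrast"]
print(cohens_kappa(a, b))  # 0.5: 75% raw agreement, 50% expected by chance
```

Kappa of 1.0 indicates perfect agreement and 0.0 agreement no better than chance, which makes scores comparable across workflows with different label distributions.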
Key words
annotation workflow, discourse relation, annotation load, annotation process, discourse annotation, annotation experiments, workflow strategies, inter-annotator agreement, case study