• Media type: E-article
  • Title: The Three Sides of CrowdTruth
  • Contributors: Aroyo, Lora; Welty, Chris
  • Published: Human Computation Institute, 2014
  • Published in: Human Computation
  • Language: Undetermined
  • DOI: 10.15346/hc.v1i1.3
  • ISSN: 2330-8001
  • Keywords: Electrical and Electronic Engineering; Building and Construction
  • Description: Crowdsourcing is often used to gather annotated data for training and evaluating computational systems that attempt to solve cognitive problems, such as understanding Natural Language sentences. Crowd workers are asked to perform semantic interpretation of sentences to establish a ground truth. This has always been done under the assumption that each task unit, e.g. each sentence, has a single correct interpretation that is contained in the ground truth. We have countered this assumption with CrowdTruth, and have shown that it can be better suited to tasks for which semantic interpretation is subjective. In this paper we investigate the dependence of worker metrics for detecting spam on the quality of sentences in the dataset and the quality of the target semantics. We show that worker quality metrics can improve significantly when the quality of these other aspects of semantic interpretation is considered.
  • Access status: Open access