ul Hassan, Umair; Zaveri, Amrapali; Marx, Edgard; Curry, Edward; and Lehmann, Jens (2016) ACRyLIQ: Leveraging DBpedia for Adaptive Crowdsourcing in Linked Data Quality Assessment. Lecture Notes in Computer Science. ISSN 0302-9743
Abstract
Crowdsourcing has emerged as a powerful paradigm for the quality assessment and improvement of Linked Data. A major challenge of employing crowdsourcing for quality assessment in Linked Data is the cold-start problem: how to estimate the reliability of crowd workers and assign the most reliable workers to tasks? We address this challenge by proposing a novel approach for generating test questions from DBpedia based on the topics associated with quality assessment tasks. These test questions are used to estimate the reliability of new workers. Subsequently, tasks are dynamically assigned to reliable workers to improve the accuracy of the collected responses. Our proposed approach, ACRyLIQ, is evaluated on two real-world Linked Data datasets using workers hired from Amazon Mechanical Turk. We validate the proposed approach in terms of accuracy and compare it against the baseline approach of reliability estimation using gold-standard tasks. The results demonstrate that our proposed approach achieves high accuracy without using gold-standard tasks.
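The workflow the abstract describes has two steps: estimate a new worker's reliability from topic-matched test questions (generated from DBpedia in the paper), then dynamically route quality-assessment tasks to the most reliable workers. The sketch below illustrates that idea only; the class names, the accuracy-based reliability estimate, and the 0.7 threshold are all hypothetical simplifications, not the paper's actual ACRyLIQ algorithm.

```python
# Minimal sketch of test-question scoring and reliability-based task
# assignment, under assumed simplifications (accuracy as the reliability
# estimate, a fixed threshold, greedy round-robin assignment).
from dataclasses import dataclass


@dataclass
class Worker:
    worker_id: str
    correct: int = 0   # test questions answered correctly
    answered: int = 0  # test questions answered in total

    @property
    def reliability(self) -> float:
        # Simple accuracy estimate, used here as a proxy for reliability.
        return self.correct / self.answered if self.answered else 0.0


def score_test_questions(worker: Worker, answers: dict, gold: dict) -> None:
    """Update a worker's reliability estimate from test questions whose
    answers are known (in the paper, derived from DBpedia facts)."""
    for qid, answer in answers.items():
        worker.answered += 1
        if gold.get(qid) == answer:
            worker.correct += 1


def assign_tasks(tasks: list, workers: list, threshold: float = 0.7) -> dict:
    """Assign each task to workers at or above a (hypothetical)
    reliability threshold, most reliable first, round-robin."""
    reliable = sorted((w for w in workers if w.reliability >= threshold),
                      key=lambda w: w.reliability, reverse=True)
    if not reliable:
        return {}
    return {task: reliable[i % len(reliable)].worker_id
            for i, task in enumerate(tasks)}


if __name__ == "__main__":
    alice = Worker("alice")
    # Hypothetical test questions with known answers, e.g. generated from
    # DBpedia triples such as (Dublin, capitalOf, Ireland).
    score_test_questions(alice, {"q1": "yes", "q2": "no"},
                         gold={"q1": "yes", "q2": "no"})
    print(assign_tasks(["task-1", "task-2"], [alice]))  # both go to alice
```

The point of this design, as the abstract frames it, is that the test questions substitute for gold-standard tasks when bootstrapping reliability estimates for previously unseen workers.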
| Item Type: | Article |
|---|---|
| Keywords: | Linked Data; Task Assignment; Test Question; Overhead Cost; Assignment Algorithm |
| Academic Unit: | Faculty of Social Sciences > School of Business |
| Item ID: | 16010 |
| Identification Number (DOI): | 10.1007/978-3-319-49004-5_44 |
| Depositing User: | Souleiman Hasan |
| Date Deposited: | 30 May 2022 11:45 |
| Journal or Publication Title: | Lecture Notes in Computer Science |
| Publisher: | Springer-Verlag |
| Refereed: | Yes |
| URI: | https://mural.maynoothuniversity.ie/id/eprint/16010 |
| Use Licence: | This item is available under a Creative Commons Attribution-NonCommercial-ShareAlike licence (CC BY-NC-SA). |