Crowdsourcing Statement Classification to Enhance Information Quality Prediction

Abstract

Due to their relatively low cost and ability to scale, crowdsourcing-based approaches are widely used to collect large amounts of human-annotated data. To this end, multiple crowdsourcing platforms exist, where requesters can upload tasks and workers can carry them out in exchange for payment. Such platforms share a task design and deployment workflow that is often counter-intuitive and cumbersome. To address this issue, we propose Crowd_Frame, a simple and complete framework that allows researchers to develop and deploy diverse types of complex crowdsourcing tasks in an easy and customizable way. We demonstrate the capabilities of the proposed framework and make it available to researchers and practitioners.

Publication
Proceedings of the 6th Multidisciplinary International Symposium on Disinformation in Online Open Media (MISDOOM 2024). Münster, Germany. Accepted for publication on June 21, 2024.