
Optimization Algorithm Balancing Output and Fairness in Crowdsourcing


Abstract:

Crowdsourcing has become increasingly popular in recent years because it enables requesters to find a group of workers to work on small tasks that an individual or organization cannot easily do on its own. One of the main challenges in crowdsourcing is ensuring worker participation. Many papers propose incentive mechanisms, but only a few of these are long-term incentives, and none of them combine incentives with workers' output. In this paper, we address both the long-term incentive issue and the maximization of workers' time-average output by formulating a stochastic optimization problem. While maximizing workers' output is an explicit objective, the long-term incentive is realized through the requester's fairness towards workers. We solve the problem using the Lyapunov technique and turn the solution into interactive but independent per-time-slot optimization decisions on the workers' side and the requester's side. To evaluate the performance of our solution, we first conduct theoretical analysis and then run simulations that compare our solution against the theoretical values and two variations. The analysis and simulation results show that our solution maximizes workers' time-average output while ensuring the fairness needed to retain workers in the long run.
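The per-slot decision structure described above follows the standard Lyapunov drift-plus-penalty pattern. The sketch below is a minimal illustration of that pattern under assumed placeholders (one assignment per slot, a uniform fair share, random per-worker output potential), not the paper's actual formulation: a virtual queue per worker tracks its fairness deficit, and in each time slot the requester greedily trades off weighted output against those deficits.

import random

# Hypothetical drift-plus-penalty sketch; the paper's actual objective,
# constraints, and decision variables are not reproduced here.
NUM_WORKERS = 5
V = 10.0                      # weight trading output against fairness drift (assumed)
T = 1000                      # number of time slots
fair_share = 1.0 / NUM_WORKERS  # assumed fairness target per worker per slot

Q = [0.0] * NUM_WORKERS       # virtual queues: accumulated fairness deficit per worker
total_output = 0.0
assignments = [0] * NUM_WORKERS

for t in range(T):
    # Assumed per-slot worker state: how much output each worker could produce now.
    potential_output = [random.uniform(0.0, 1.0) for _ in range(NUM_WORKERS)]

    # Requester's per-slot decision: select the worker maximizing
    # V * output + fairness-deficit credit (greedy drift-plus-penalty rule).
    scores = [V * potential_output[i] + Q[i] for i in range(NUM_WORKERS)]
    chosen = max(range(NUM_WORKERS), key=lambda i: scores[i])

    total_output += potential_output[chosen]
    assignments[chosen] += 1

    # Virtual-queue update: the deficit grows by the fair share each slot
    # and shrinks when the worker is actually selected.
    for i in range(NUM_WORKERS):
        served = 1.0 if i == chosen else 0.0
        Q[i] = max(Q[i] + fair_share - served, 0.0)

print("time-average output:", total_output / T)
print("assignment shares:", [a / T for a in assignments])

With this rule, a larger V pushes the per-slot choice toward higher immediate output, while the virtual queues pull long-run assignment shares toward the assumed fair share, which mirrors the output-versus-fairness trade-off the abstract describes.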
Date of Conference: 03-05 December 2024
Date Added to IEEE Xplore: 12 March 2025
Conference Location: NY, USA

I. Introduction

Crowdsourcing [1] has gained popularity in recent years because it allows requesters to find a group of workers online to work on small tasks that an individual or organization cannot easily do. There are three basic components in crowdsourcing: requesters who publish tasks on a platform, workers who carry out the tasks, and a platform such as Amazon Mechanical Turk [2] that matches requesters and workers.

