Sciweavers

28 search results for "Optimal crowdsourcing contests" (page 2 of 6)
CIDR 2011 (Algorithms)
Crowdsourced Databases: Query Processing with People
Amazon’s Mechanical Turk (“MTurk”) service allows users to post short tasks (“HITs”) that other users can receive a small amount of money for completing. Common tasks on...
Adam Marcus 0002, Eugene Wu 0002, Samuel Madden, R...
CIDM 2007 (IEEE)
Handwritten Digit Recognition: Road to Contest Victory
With the growing amount of data gathered nowadays, the need for efficient data mining methodologies is becoming more and more common. There is a large number of different classifi...
Norbert Jankowski, Krzysztof Grabczewski
AAAI 2010
Decision-Theoretic Control of Crowd-Sourced Workflows
Crowd-sourcing is a recent framework in which human intelligence tasks are outsourced to a crowd of unknown people ("workers") as an open call (e.g., on Amazon's Me...
Peng Dai, Mausam, Daniel S. Weld
AAAI 2012
Online Task Assignment in Crowdsourcing Markets
We explore the problem of assigning heterogeneous tasks to workers with different, unknown skill sets in crowdsourcing markets such as Amazon Mechanical Turk. We first formalize ...
Chien-Ju Ho, Jennifer Wortman Vaughan
SEMWEB 2007 (Springer)
OLA in the OAEI 2007 Evaluation Contest
Abstract. Similarity has become a classical tool for ontology confrontation motivated by alignment, mapping or merging purposes. In the definition of an ontology-based measure one ...
Jean François Djoufak Kengue, Jérô...