Overview

AAAI HCOMP is the premier venue for disseminating the latest research findings on crowdsourcing and human computation. While artificial intelligence (AI) and human-computer interaction (HCI) represent traditional mainstays of the conference, HCOMP believes strongly in inviting, fostering, and promoting broad, interdisciplinary research. The field is unusually diverse in the disciplines it draws upon and contributes to, ranging from human-centered qualitative studies and HCI design, to computer science and artificial intelligence, to economics and the social sciences, to policy and ethics. We promote the exchange of scientific advances in human computation and crowdsourcing not only among researchers but also among engineers and practitioners, to encourage dialogue across disciplines and communities of practice.

Past meetings include four AAAI HCOMP conferences (2013-2016) and four earlier workshops held at the AAAI Conference on Artificial Intelligence (2011-2012) and the ACM SIGKDD Conference on Knowledge Discovery and Data Mining (2009-2010).

An overview of the history, goals, and peer review procedures of the conference can be found in the preface to the HCOMP-13 proceedings. Additional background on the founding of the conference is discussed in a Computing Research News story.

Past Meetings


HCOMP 2013 Photo Album

Past Proceedings

Past Paper Awards

  • 2016

Best Paper Award
Why Is That Relevant? Collecting Annotator Rationales for Relevance Judgments
Tyler McDonnell, Matthew Lease, Mucahid Kutlu and Tamer Elsayed

Best Paper Finalists
Efficient Techniques for Crowdsourced Top-k Lists
Luca de Alfaro, Vassilis Polychronopoulos and Neoklis Polyzotis

Extending Workers' Attention Span Through Dummy Events
Avshalom Elmalech, David Sarne, Esther David and Chen Hajaj

  • 2015

Best Paper Award
Crowdsourcing in the Field: A Case Study Using Local Crowds for Event Reporting
Elena Agapie, Jaime Teevan, and Andres Monroy-Hernandez

Best Paper Finalists
Tropel: Crowdsourcing Detectors with Minimal Training
Genevieve Patterson, Grant Van Horn, Serge Belongie, Pietro Perona, and James Hays

Crowdsourcing from Scratch: A Pragmatic Experiment in Data Collection by Novice Requesters
Alexandra Papoutsaki, Hua Guo, Danae Metaxa-Kakavouli, Connor Gramazio, Jeff Rasley, Wenting Xie, Guan Wang, and Jeff Huang

  • 2014

Best Paper Award
STEP: A Scalable Testing and Evaluation Platform
Maria Christoforaki and Panos Ipeirotis

Best Paper Finalists
A Crowd of Your Own: Crowdsourcing for On-Demand Personalization
Peter Organisciak, Jaime Teevan, Susan Dumais, Robert Miller, and Adam Tauman Kalai

Crowdsourcing for Participatory Democracies: Efficient Elicitation of Social Choice Functions
David Lee, Ashish Goel, Tanja Aitamurto, and Helene Landemore

  • 2013

Best Paper Award
Crowdsourcing Multi-Label Classification for Taxonomy Creation
Jonathan Bragg, Mausam, and Daniel S. Weld

Best Paper Finalists
nEmesis: Which Restaurants Should You Avoid Today?
Adam Sadilek, Sean Brennan, Henry Kautz, and Vincent Silenzio

Community Clustering: Leveraging an Academic Crowd to Form Coherent Conference Sessions
Paul Andre, Haoqi Zhang, Juho Kim, Lydia Chilton, Steven P. Dow, and Robert C. Miller

Library

  • Past Reports

Tatiana Josephy, Matthew Lease, Praveen Paritosh, Markus Krause, Mihai Georgescu, Michael Tjalve, and Daniela Braga. Workshops Held at the First AAAI Conference on Human Computation and Crowdsourcing: A Report. AI Magazine, 35(2), 75-78, 2014.

Shar Steed. Harnessing Human Intellect for Computing. Computing Research Association (CRA) Computing Research News, 25(2), February 2013.

Panagiotis G. Ipeirotis, Raman Chandrasekar, and Paul N. Bennett. A Report on the Human Computation Workshop (HCOMP 2009). SIGKDD Explorations, 11(2), 80-83, 2009.