CrowdCamp is a one-day hack-a-thon for crowdsourcing, human computation, social media, and collective intelligence ideas. The focus is on creating a deliverable prototype or study design within the workshop itself. Over the last four years, CrowdCamp projects have resulted in top-tier conference publications, blog posts, and ongoing research.
We invite students, faculty, and industry researchers of all backgrounds to participate: social scientists, programmers, ethnographers, designers, and anyone else interested. Everyone is welcome!
This year's CrowdCamp will highlight the human aspect of crowdsourcing, focusing on the people behind the systems built and the data collected. All participants are therefore encouraged to embrace ambiguity, uncertainty, and chaos as vital elements of their projects, and to incorporate rigorous metrics and methodologies from human subjects research, e.g., psychology, sociology, and survey design.
The application should take about 10 minutes. It asks for:
- One hack-a-thon idea: This may be viewed by other participants, but if you are accepted to CrowdCamp, you will still be free to change your idea. We are looking for ideas that are specific rather than vague; the goal is to generate ideas that a group of 3-5 people can actually finish in 2 days.
- A description of what interests you about crowdsourcing: We are looking for a diverse group of people who will develop amazing projects during CrowdCamp.
- Your contact information
Sign Up Form
- October 1, 2017: CrowdCamp Sign-up deadline (all applications accepted)
- October 24, 2017: Workshop (full-day)
TBD
Past Papers and Technical Reports
Past CrowdCamps have a strong track record of bringing together diverse teams of researchers, engineers, and students who work passionately on well-scoped project ideas. Many participants have continued working on their projects beyond CrowdCamp, generating impactful publications and fruitful collaborations. A few examples include:
- Subcontracting Microwork, Meredith Ringel Morris, Jeffrey P. Bigham, Robin Brewer, Jonathan Bragg, Anand Kulkarni, Jessie Li, Saiph Savage, ACM CHI 2017
- Possible Confounds in Word-based Semantic Similarity Test Data, Malay Bhattacharyya, Yoshihiko Suhara, Md Mustafizur Rahman, Markus Krause, CSCW 2017
- Worker-Owned Cooperative Models for Training Artificial Intelligence, Anand Sriraman, Jonathan Bragg, Anand Kulkarni, CSCW 2017
- Computer Supported Collective Action, Aaron Shaw, Haoqi Zhang, Andrés Monroy-Hernández, Sean Munson, Benjamin Hill, Elizabeth Gerber, Peter Kinnaird, Patrick Minder, ACM Interactions, March 2014
- Chorus, Walter S. Lasecki, Rachel Wesley, Jeffrey Nichols, Anand Kulkarni, James F. Allen, Jeffrey P. Bigham, ACM UIST 2013
- Mechanical Turk is Not Anonymous, Matthew Lease, Jessica Hullman, Jeffrey P. Bigham, Michael S. Bernstein, Juho Kim, Walter Lasecki, Saeideh Bakhshi, Tanushree Mitra, Robert C. Miller, Social Science Research Network (SSRN) Online, March 6, 2013
- The Future of Crowd Work, Aniket Kittur, Jeffrey V. Nickerson, Michael Bernstein, Elizabeth Gerber, Aaron Shaw, John Zimmerman, Matt Lease, John Horton, ACM CSCW 2013