SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation

Call for Papers

The SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation (CSE2010) solicits submissions on topics including but not limited to the following areas:

 

    * Novel applications of crowdsourcing for evaluating search systems (see examples below)

    * Novel theoretical, experimental, and/or methodological developments advancing state-of-the-art knowledge of crowdsourcing for search evaluation

    * Tutorials on how the different forms of crowdsourcing might be best suited to or best executed in evaluating different search tasks

    * New software packages which simplify or otherwise improve general support for crowdsourcing, or particular support for crowdsourced search evaluation

    * Reflective or forward-looking vision on use of crowdsourcing in search evaluation as informed by prior and/or ongoing studies

    * How crowdsourcing technology or process can be adapted to encourage and facilitate more participation from outside the USA

The workshop especially calls for innovative solutions in search evaluation that make significant use of a crowdsourcing platform such as Amazon's Mechanical Turk, Livework, Crowdflower, or DoMyStuff. Novel applications of crowdsourcing are of particular interest, including but not restricted to the following tasks:

    * cross-vertical search (video, image, blog, etc.) evaluation

    * local search evaluation

    * mobile search evaluation

    * realtime/news search evaluation

    * entity search evaluation

    * discovering representative groups of rare queries, documents, and events in the long-tail of search

    * detecting/evaluating query alterations

 

For example, does the inherent geographic dispersal of crowdsourcing enable better assessment of a query's local intent, its locale-specific facets, or the diversity of returned results? Could crowdsourcing be employed in near real-time to better assess query intent for breaking news and relevant information?

 

Most Innovative Awards --- Sponsored by Microsoft Bing

  

As a further incentive to participate, authors of the most novel and innovative crowdsourcing-based search evaluation techniques (e.g., using a crowdsourcing platform such as those named above) will be recognized with "Most Innovative Awards," as judged by the workshop organizers. Selection will be based on the creativity, originality, and potential impact of the proposed work; we expect the winners to describe risky, ground-breaking, and unexpected ideas. The awards are made possible by generous support from Microsoft Bing; their number and nature will depend on the quality of the submissions and the overall availability of funds. All valid submissions to the workshop will be considered for the awards.

 

Submission Instructions

 

Submissions should report new (unpublished) research results or ongoing research. Long paper submissions (up to 8 pages) will primarily target oral presentations. Short paper submissions (up to 4 pages) will primarily target poster presentations. Papers should be formatted in the double-column ACM SIG proceedings format and must be submitted as PDF files. Submissions should not be anonymized.

Papers should be submitted through the Workshop's EasyChair system. If you do not have an EasyChair account, you will need to create one in order to submit a paper.

Important Dates

 

Submissions due: June 10, 2010

Notification of acceptance: June 25, 2010

Camera-ready submission: July 1, 2010

Workshop date: July 23, 2010

 

Questions?

 

Email the organizers at cse2010@ischool.utexas.edu
