Associate Professor Matthew Lease (left) and UT Austin doctoral student Tyler McDonnell
A research article co-authored by School of Information Associate Professor Matthew Lease received the Best Paper Award from the 2016 Conference on Human Computation and Crowdsourcing (HCOMP) this November.
In their award-winning paper, Prof. Lease and his colleagues demonstrated a new technique to ensure the validity of data collected via the Internet. While online crowdsourcing lets vast numbers of people around the world contribute, such broad participation also exposes data collection projects to greater variability and risk. Quality assurance is particularly difficult for subjective tasks, such as rating the relevance of web search results, because people hold many different opinions about what counts as relevant.
To address this challenge, the team investigated a simple idea: requiring online workers to provide a rationale justifying each rating decision they made. The approach proved remarkably effective given its simplicity: the researchers found they could significantly improve the reliability, accountability, transparency, and value of the collected data at no added cost.
“Over a series of experiments collecting nearly 10,000 relevance judgments, we found that workers produce higher quality relevance judgments simply by being asked to provide a rationale, and prolific workers require virtually no extra time to provide rationales in addition to ratings,” McDonnell wrote in a blog post about the research.