Seminar Speaker: Xinxin Li
Affiliation: University of Connecticut
Area: Information Systems
Location: Paccar Hall, Room 290
Title: Salience Effect in Crowdsourcing Contests

Abstract: Online platforms typically allow users to contribute freely to the community. Without appropriate controls, however, the behavior of the online community may not align with the platform's design objectives, which can lead to inferior crowdsourcing outcomes. This paper investigates how feedback information and a systematic bias among crowdsourcing workers affect crowdsourcing outcomes. Specifically, we examine the role of one such systematic bias, the salience effect, in shaping performance on a crowdsourcing platform (Kaggle), and how the number of crowdsourcing workers moderates the impact of the salience effect through a parallel-path effect and a competition effect. Our results suggest that the salience effect influences the performance of contestants, including contest winners. Furthermore, the parallel-path effect cannot completely eliminate the impact of the salience effect, though it attenuates it to some extent. By contrast, the competition effect tends to amplify the impact of the salience effect. Our results have important implications for crowdsourcing firms and platform designers.