I’ve been invited to present my research at the IS/Analytics seminar at Stevens Institute of Technology. I’ll be presenting our ongoing work, “Making the Crowd Wiser: (Re)Combination through Teaming in Crowdsourcing”. The abstract is below.
Firms acting as solution seekers are increasingly adopting crowdsourcing to acquire innovative solutions to challenging problems. Teaming in crowdsourcing practice has shown promise for developing effective solutions by (re)combining the knowledge of diverse solvers. But there has been a paucity of theory on the process that (re)combines diverse solvers via teaming. First, given the uncertainty of teaming up with strangers, it is not clear how solvers’ decisions to team up with others will impact their eventual performance. Second, since teaming inherently reduces the number of solutions submitted by the crowd, it is not clear whether and how teaming will impact the quality of the solutions acquired by the solution seeker. In this paper, we theorize platform features as different quality signals for (re)combination and systematically explore: a) how solvers should use different features that signal the quality of (re)combination to improve performance, and b) how solvers’ use of signals for (re)combination affects crowdsourcing effectiveness, with the goal of developing a comprehensive theory from a (re)combination perspective. Using simulation experiments, we find that different signals highlight the instant return of solution integration and the potential return of teamwork, conditional on problem complexity and the timing of team formation. More interestingly, we find a potential misalignment between solver- and seeker-level outcomes. While solvers can increase their performance by selecting suitable signals for teaming, the solution seeker may suffer a lower likelihood of finding high-quality solutions. These findings provide new insights into solver performance and crowdsourcing effectiveness and have implications for the design of crowdsourcing platforms.