Abstract: Crowdsourcing gathers software engineering experts on a specific subject matter from around the world, allowing organisations and individuals to harness their combined effort to accomplish the software task at hand. However, leveraging this expert knowledge depends on online crowdsourcing platforms, which make such collaboration possible. This study evaluates the performance of four Crowdsourced Software Engineering (CSE) platforms (TopCoder, InnoCentive, AMT and Upwork) against the criteria of the Web of System Performance (WOSP) model: functionality, usability, security, extendibility, reliability, flexibility, connectivity and privacy. The analyses show that the four CSE platforms differ across all of these features, and that all of them fall short on flexibility. The results provide insight into the current status of CSE platforms and highlight the gaps inherent in them, offering a more complete picture. This study contributes to work on enhancing the design of current and future platforms. © 2017 Universiti Putra Malaysia Press.
Title: Software engineering in an effective collaborative environment: An evaluative study on crowdsourcing platforms
Appears in Collections: UNITEN Scholarly Publication
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.