Please use this identifier to cite or link to this item: http://dspace.uniten.edu.my/jspui/handle/123456789/11327
DC Field: Value (Language)
dc.contributor.author: Al-Bloush, H. (en_US)
dc.contributor.author: Solemon, B. (en_US)
dc.date.accessioned: 2018-12-14T02:42:47Z
dc.date.available: 2018-12-14T02:42:47Z
dc.date.issued: 2017
dc.description.abstract: Crowdsourcing gathers the world’s software engineering experts on a specific subject matter and allows organisations and individuals to employ the combined effort of these ‘experts’ to accomplish the software task at hand. However, the knowledge of these experts cannot be leveraged without online crowdsourcing platforms, which make communication possible. This study evaluates the performance of four Crowdsourced Software Engineering (CSE) platforms (TopCoder, InnoCentive, AMT and Upwork) against the criteria of the Web of System Performance (WOSP) model: functionality, usability, security, extendibility, reliability, flexibility, connectivity and privacy. Findings from the analyses showed that the four CSE platforms vary across all of these features and that all of them fall short of the flexibility criterion. The results provide insight into the current status of CSE platforms and highlight the gaps inherent in these platforms while offering a more complete picture. This study contributes to work on enhancing the design of current and future platforms. © 2017 Universiti Putra Malaysia Press.
dc.language.iso: en (en_US)
dc.title: Software engineering in an effective collaborative environment: An evaluative study on crowdsourcing platforms (en_US)
dc.type: Article (en_US)
item.fulltext: No Fulltext
item.grantfulltext: none
Appears in Collections: UNITEN Scholarly Publication