Software engineering in an effective collaborative environment: An evaluative study on Crowdsourcing platforms
Hani Al-Bloush¹, Badariah Solemon²
Crowdsourcing gathers software engineering experts on a specific subject matter from around the world, and allows organisations and individuals to employ the combined effort of these ‘experts’ to accomplish the software task at hand. However, the knowledge of these experts cannot be leveraged without online crowdsourcing platforms, which make communication possible. This study evaluates the performance of four Crowdsourced Software Engineering (CSE) platforms (TopCoder, InnoCentive, AMT and Upwork) against the criteria of the Web of System Performance (WOSP) model: functionality, usability, security, extendibility, reliability, flexibility, connectivity and privacy. The analyses showed that the four CSE platforms vary across all of these features and that all of them fall short on flexibility. The results provide insight into the current status of CSE platforms and highlight the gaps inherent in them, offering a more complete picture of the field. This study contributes to work on enhancing the design of current and future platforms.
Affiliation:
- ¹ Universiti Tenaga Nasional, Malaysia
- ² Universiti Tenaga Nasional, Malaysia
Indexation

Indexed by MyJurnal (2021):
- H-Index: 3
- Immediacy Index: 0.000
- Rank: 0

Indexed by Scopus (2020):
- CiteScore: 1.1
- Rank: Q3 (Agricultural and Biological Sciences (all)); Q3 (Environmental Science (all)); Q3 (Computer Science (all)); Q3 (Chemical Engineering (all))
- SJR: 0.174