A distributed bug analyzer based on user-interaction features for mobile apps

Bibliographic Details
Published in: Journal of Ambient Intelligence and Humanized Computing, 2017-08, Vol. 8 (4), p. 579-591
Main Authors: Méndez-Porras, Abel, Méndez-Marín, Giovanni, Tablada-Rojas, Alberto, Hidalgo, Mario Nieto, García-Chamizo, Juan Manuel, Jenkins, Marcelo, Martínez, Alexandra
Format: Article
Language: English
Description
Summary: Developers must devote more effort and attention to the software development process in order to deliver quality applications to users. Software testing and automation play a strategic role in ensuring the quality of mobile applications. This paper proposes and evaluates a Distributed Bug Analyzer based on user-interaction features that uses digital image processing to find bugs. The Distributed Bug Analyzer detects bugs by comparing the similarity between images taken before and after a user-interaction feature occurs; an interest point detector and descriptor is used for the image comparison. To evaluate the Distributed Bug Analyzer, we conducted a case study with 38 randomly selected mobile applications. First, we identified user-interaction bugs by manually testing the applications. Images were captured before and after applying each user-interaction feature. Image pairs were then processed with SURF to obtain interest points, from which a similarity percentage was computed to identify the presence of bugs. We used a Master Computer, a Storage Test Database, and four Slave Computers to evaluate the Distributed Bug Analyzer, performing 360 tests of user-interaction features in total. We found 79 bugs when manually testing user-interaction features, and 69 bugs when using digital image processing with the similarity threshold fixed at 92.5%. The Distributed Bug Analyzer distributed the tests pending in the Storage Test Database among the Slave Computers; Slave Computers 1, 2, 3, and 4 processed 21, 20, 23, and 36% of the image pairs, respectively.
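
The following Python sketch illustrates the kind of SURF-based before/after screenshot comparison the summary describes. It is a minimal reconstruction, not the authors' implementation: SURF detection and the 92.5% threshold come from the abstract, while the Hessian threshold, the Lowe ratio test, the similarity formula (good matches over detected interest points), and the below-threshold decision rule are assumptions. SURF lives in OpenCV's contrib xfeatures2d module and may require a build with nonfree algorithms enabled.

# Minimal sketch of the before/after comparison described in the summary.
# Requires opencv-contrib-python with nonfree (SURF) support.
import cv2

SIMILARITY_THRESHOLD = 92.5  # percent; the value fixed in the case study

def similarity_percentage(before_path: str, after_path: str) -> float:
    """Similarity of screenshots taken before/after a user-interaction feature."""
    before = cv2.imread(before_path, cv2.IMREAD_GRAYSCALE)
    after = cv2.imread(after_path, cv2.IMREAD_GRAYSCALE)

    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # assumed setting
    _, desc_before = surf.detectAndCompute(before, None)
    _, desc_after = surf.detectAndCompute(after, None)
    if desc_before is None or desc_after is None:
        return 0.0  # no interest points found in one of the images

    # Match descriptors and keep only matches passing Lowe's ratio test
    # (an assumed matching strategy; the paper only says a similarity
    # percentage is computed from the interest points).
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(desc_before, desc_after, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    return 100.0 * len(good) / min(len(desc_before), len(desc_after))

def flag_bug(before_path: str, after_path: str) -> bool:
    # Assumed decision rule: a pair whose similarity falls below the
    # fixed threshold is flagged as a potential user-interaction bug.
    return similarity_percentage(before_path, after_path) < SIMILARITY_THRESHOLD

In the paper's setup, comparisons like this would run on the Slave Computers against pending image pairs fetched from the Storage Test Database, with the Master Computer coordinating the distribution of tests.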
ISSN: 1868-5137; 1868-5145
DOI: 10.1007/s12652-016-0435-7