
Evaluating Test Suites and Adequacy Criteria Using Simulation-Based Models of Distributed Systems

Test adequacy criteria provide the engineer with guidance on how to populate test suites. While adequacy criteria have long been a focus of research, existing testing methods do not address many of the fundamental characteristics of distributed systems, such as distribution topology, communication failure, and timing. Furthermore, they do not provide the engineer with a means to evaluate the relative effectiveness of different criteria, nor the relative effectiveness of adequate test suites satisfying a given criterion. This paper makes three contributions to the development and use of test adequacy criteria for distributed systems: (1) a testing method based on discrete-event simulations; (2) a fault-based analysis technique for evaluating test suites and adequacy criteria; and (3) a series of case studies that validate the method and technique. The testing method uses a discrete-event simulation as an operational specification of a system, in which the behavioral effects of distribution are explicitly represented. Adequacy criteria and test cases are then defined in terms of this simulation-based specification. The fault-based analysis involves mutation of the simulation-based specification to provide a foil against which test suites and the criteria that formed them can be evaluated. Three distributed systems were used to validate the method and technique, including DNS, the domain name system.
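The abstract describes two mechanisms: a discrete-event simulation that serves as an operational specification, with distribution effects such as message delay and loss modeled explicitly, and a fault-based analysis that mutates that specification to gauge how well a test suite detects behavioral deviations. The sketch below is a minimal illustration of those ideas only, not the paper's actual models or tooling; all names (Simulation, client_server_spec, drop_prob, and so on) are hypothetical.

```python
# Illustrative sketch, assuming a toy one-request/one-reply protocol:
# a tiny discrete-event simulation acting as an operational specification,
# plus one hand-written "mutant" of that specification.
import heapq
import itertools
import random


class Simulation:
    """Minimal discrete-event engine with explicit message delay and loss."""

    def __init__(self, drop_prob=0.0, delay=1.0, seed=0):
        self._queue = []                  # (time, tie-breaker, action)
        self._counter = itertools.count()
        self.now = 0.0
        self.trace = []                   # observable behavior of a run
        self.drop_prob = drop_prob
        self.delay = delay
        self.rng = random.Random(seed)

    def schedule(self, dt, action):
        heapq.heappush(self._queue, (self.now + dt, next(self._counter), action))

    def send(self, msg, on_receive):
        # Distribution effects are represented explicitly: a message may be
        # dropped, and delivery always takes `delay` time units.
        if self.rng.random() < self.drop_prob:
            self.trace.append(("dropped", msg))
        else:
            self.schedule(self.delay, lambda: on_receive(msg))

    def run(self, until=10.0):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, action = heapq.heappop(self._queue)
            action()
        return self.trace


def client_server_spec(sim):
    """Operational specification: the server answers every request."""
    def server_receive(msg):
        sim.trace.append(("server_got", msg))
        sim.send("reply", lambda m: sim.trace.append(("client_got", m)))
    sim.schedule(0.0, lambda: sim.send("request", server_receive))


def client_server_mutant(sim):
    """Mutant specification: the reply is never sent (a seeded fault)."""
    def server_receive(msg):
        sim.trace.append(("server_got", msg))   # mutation: reply omitted
    sim.schedule(0.0, lambda: sim.send("request", server_receive))


def reply_arrives(trace):
    """A test oracle phrased over the simulation trace."""
    return ("client_got", "reply") in trace


if __name__ == "__main__":
    spec = Simulation(drop_prob=0.0, delay=1.0)
    client_server_spec(spec)
    print("spec satisfies oracle:", reply_arrives(spec.run()))      # True

    mutant = Simulation(drop_prob=0.0, delay=1.0)
    client_server_mutant(mutant)
    print("mutant satisfies oracle:", reply_arrives(mutant.run()))  # False: killed
```

Under this framing, an adequacy criterion would be expressed over elements of the simulation-based specification (for example, exercising both the delivered and dropped branches of message handling), and a suite is judged by how many seeded mutants it kills; the paper's case studies apply this at much larger scale, including a model of DNS.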

Bibliographic Details
Published in: IEEE Transactions on Software Engineering, 2008-07, Vol. 34 (4), p. 452-470
Main Authors: Rutherford, M.J., Carzaniga, A., Wolf, A.L.
Format: Article
Language:English
Online Access:Get full text
DOI: 10.1109/TSE.2008.33
ISSN: 0098-5589
EISSN: 1939-3520
Source: ABI/INFORM Global; IEEE Electronic Library (IEL) Journals
Subjects: Adequacy
Alliances
Computational modeling
Computer networks
Computer simulation
Computer Society
Criteria
Digital Object Identifier
Discrete event simulation
Domain Name System
Domain names
Engineers
Failure
Failure analysis
Foils
Mathematical models
Network servers
Quality control
Simulation
Software engineering
Specification
Specifications
Studies
System testing
Systems development
Test coverage of specifications
Timing
Topology
Web server