A System for Performance Evaluation of Embedded Software

Authors: Yong-Yoon Cho, Jong-Bae Moon, Young-Chul Kim

Abstract:

Developers need to evaluate software performance in order to make software efficient. This paper proposes a performance evaluation system for embedded software. The proposed system consists of a code analyzer, testing agents, a data analyzer, and a report viewer. The code analyzer inserts target-system-dependent instrumentation code into the source code and compiles it. The testing agents execute the performance tests on the target system. The data analyzer translates the raw-level result data into class-level APIs for the report viewer. The report viewer uses these APIs to present graphical report views. We expect the proposed tool to be useful for embedded software development, because it lets developers easily and intuitively analyze a program's performance and resource utilization.
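The data-analyzer stage described above can be illustrated with a minimal sketch: it takes raw-level probe records (function name plus entry/exit timestamps, as a testing agent might emit them) and aggregates them into per-function statistics that a report viewer could consume. All names here (`ProbeRecord`, `FunctionStats`, `analyze`) are hypothetical illustrations, not the paper's actual API.

```python
from dataclasses import dataclass

@dataclass
class ProbeRecord:
    """One raw-level measurement emitted by a testing agent."""
    function: str
    enter_us: int   # timestamp at function entry, in microseconds
    exit_us: int    # timestamp at function exit, in microseconds

@dataclass
class FunctionStats:
    """Class-level summary handed to the report viewer."""
    function: str
    calls: int
    total_us: int

    @property
    def avg_us(self) -> float:
        # Average time spent per call, in microseconds.
        return self.total_us / self.calls

def analyze(records: list[ProbeRecord]) -> dict[str, FunctionStats]:
    """Aggregate raw probe records into per-function statistics."""
    stats: dict[str, FunctionStats] = {}
    for r in records:
        s = stats.setdefault(r.function, FunctionStats(r.function, 0, 0))
        s.calls += 1
        s.total_us += r.exit_us - r.enter_us
    return stats

# Example raw data, as instrumented code on the target might produce.
raw = [
    ProbeRecord("read_sensor", 100, 250),
    ProbeRecord("read_sensor", 300, 420),
    ProbeRecord("update_display", 500, 900),
]
report = analyze(raw)
```

A report viewer would then iterate over `report` to render, for example, a bar chart of `total_us` per function.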

Keywords: Embedded Software, Performance Evaluation System, Testing Agents, Report Generator

Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1078267

